go Package reflect Package reflect ================ * `import "reflect"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package reflect implements run-time reflection, allowing a program to manipulate objects with arbitrary types. The typical use is to take a value with static type interface{} and extract its dynamic type information by calling TypeOf, which returns a Type. A call to ValueOf returns a Value representing the run-time data. Zero takes a Type and returns a Value representing a zero value for that type. See "The Laws of Reflection" for an introduction to reflection in Go: <https://golang.org/doc/articles/laws_of_reflection.html> Index ----- * [Constants](#pkg-constants) * [func Copy(dst, src Value) int](#Copy) * [func DeepEqual(x, y any) bool](#DeepEqual) * [func Swapper(slice any) func(i, j int)](#Swapper) * [type ChanDir](#ChanDir) * [func (d ChanDir) String() string](#ChanDir.String) * [type Kind](#Kind) * [func (k Kind) String() string](#Kind.String) * [type MapIter](#MapIter) * [func (iter \*MapIter) Key() Value](#MapIter.Key) * [func (iter \*MapIter) Next() bool](#MapIter.Next) * [func (iter \*MapIter) Reset(v Value)](#MapIter.Reset) * [func (iter \*MapIter) Value() Value](#MapIter.Value) * [type Method](#Method) * [func (m Method) IsExported() bool](#Method.IsExported) * [type SelectCase](#SelectCase) * [type SelectDir](#SelectDir) * [type SliceHeader](#SliceHeader) * [type StringHeader](#StringHeader) * [type StructField](#StructField) * [func VisibleFields(t Type) []StructField](#VisibleFields) * [func (f StructField) IsExported() bool](#StructField.IsExported) * [type StructTag](#StructTag) * [func (tag StructTag) Get(key string) string](#StructTag.Get) * [func (tag StructTag) Lookup(key string) (value string, ok bool)](#StructTag.Lookup) * [type Type](#Type) * [func ArrayOf(length int, elem Type) Type](#ArrayOf) * [func ChanOf(dir ChanDir, t Type) Type](#ChanOf) * [func FuncOf(in, out []Type, variadic bool) Type](#FuncOf) * [func MapOf(key, elem Type) Type](#MapOf) * [func PointerTo(t Type) Type](#PointerTo) * [func PtrTo(t Type) Type](#PtrTo) * [func SliceOf(t Type) Type](#SliceOf) * [func StructOf(fields []StructField) Type](#StructOf) * [func TypeOf(i any) Type](#TypeOf) * [type Value](#Value) * [func Append(s Value, x ...Value) Value](#Append) * [func AppendSlice(s, t Value) Value](#AppendSlice) * [func Indirect(v Value) Value](#Indirect) * [func MakeChan(typ Type, buffer int) Value](#MakeChan) * [func MakeFunc(typ Type, fn func(args []Value) (results []Value)) Value](#MakeFunc) * [func MakeMap(typ Type) Value](#MakeMap) * [func MakeMapWithSize(typ Type, n int) Value](#MakeMapWithSize) * [func MakeSlice(typ Type, len, cap int) Value](#MakeSlice) * [func New(typ Type) Value](#New) * [func NewAt(typ Type, p unsafe.Pointer) Value](#NewAt) * [func Select(cases []SelectCase) (chosen int, recv Value, recvOK bool)](#Select) * [func ValueOf(i any) Value](#ValueOf) * [func Zero(typ Type) Value](#Zero) * [func (v Value) Addr() Value](#Value.Addr) * [func (v Value) Bool() bool](#Value.Bool) * [func (v Value) Bytes() []byte](#Value.Bytes) * [func (v Value) Call(in []Value) []Value](#Value.Call) * [func (v Value) CallSlice(in []Value) []Value](#Value.CallSlice) * [func (v Value) CanAddr() bool](#Value.CanAddr) * [func (v Value) CanComplex() bool](#Value.CanComplex) * [func (v Value) CanConvert(t Type) bool](#Value.CanConvert) * [func (v Value) CanFloat() bool](#Value.CanFloat) * [func (v 
Value) CanInt() bool](#Value.CanInt) * [func (v Value) CanInterface() bool](#Value.CanInterface) * [func (v Value) CanSet() bool](#Value.CanSet) * [func (v Value) CanUint() bool](#Value.CanUint) * [func (v Value) Cap() int](#Value.Cap) * [func (v Value) Close()](#Value.Close) * [func (v Value) Comparable() bool](#Value.Comparable) * [func (v Value) Complex() complex128](#Value.Complex) * [func (v Value) Convert(t Type) Value](#Value.Convert) * [func (v Value) Elem() Value](#Value.Elem) * [func (v Value) Equal(u Value) bool](#Value.Equal) * [func (v Value) Field(i int) Value](#Value.Field) * [func (v Value) FieldByIndex(index []int) Value](#Value.FieldByIndex) * [func (v Value) FieldByIndexErr(index []int) (Value, error)](#Value.FieldByIndexErr) * [func (v Value) FieldByName(name string) Value](#Value.FieldByName) * [func (v Value) FieldByNameFunc(match func(string) bool) Value](#Value.FieldByNameFunc) * [func (v Value) Float() float64](#Value.Float) * [func (v Value) Grow(n int)](#Value.Grow) * [func (v Value) Index(i int) Value](#Value.Index) * [func (v Value) Int() int64](#Value.Int) * [func (v Value) Interface() (i any)](#Value.Interface) * [func (v Value) InterfaceData() [2]uintptr](#Value.InterfaceData) * [func (v Value) IsNil() bool](#Value.IsNil) * [func (v Value) IsValid() bool](#Value.IsValid) * [func (v Value) IsZero() bool](#Value.IsZero) * [func (v Value) Kind() Kind](#Value.Kind) * [func (v Value) Len() int](#Value.Len) * [func (v Value) MapIndex(key Value) Value](#Value.MapIndex) * [func (v Value) MapKeys() []Value](#Value.MapKeys) * [func (v Value) MapRange() \*MapIter](#Value.MapRange) * [func (v Value) Method(i int) Value](#Value.Method) * [func (v Value) MethodByName(name string) Value](#Value.MethodByName) * [func (v Value) NumField() int](#Value.NumField) * [func (v Value) NumMethod() int](#Value.NumMethod) * [func (v Value) OverflowComplex(x complex128) bool](#Value.OverflowComplex) * [func (v Value) OverflowFloat(x float64) bool](#Value.OverflowFloat) * [func (v Value) OverflowInt(x int64) bool](#Value.OverflowInt) * [func (v Value) OverflowUint(x uint64) bool](#Value.OverflowUint) * [func (v Value) Pointer() uintptr](#Value.Pointer) * [func (v Value) Recv() (x Value, ok bool)](#Value.Recv) * [func (v Value) Send(x Value)](#Value.Send) * [func (v Value) Set(x Value)](#Value.Set) * [func (v Value) SetBool(x bool)](#Value.SetBool) * [func (v Value) SetBytes(x []byte)](#Value.SetBytes) * [func (v Value) SetCap(n int)](#Value.SetCap) * [func (v Value) SetComplex(x complex128)](#Value.SetComplex) * [func (v Value) SetFloat(x float64)](#Value.SetFloat) * [func (v Value) SetInt(x int64)](#Value.SetInt) * [func (v Value) SetIterKey(iter \*MapIter)](#Value.SetIterKey) * [func (v Value) SetIterValue(iter \*MapIter)](#Value.SetIterValue) * [func (v Value) SetLen(n int)](#Value.SetLen) * [func (v Value) SetMapIndex(key, elem Value)](#Value.SetMapIndex) * [func (v Value) SetPointer(x unsafe.Pointer)](#Value.SetPointer) * [func (v Value) SetString(x string)](#Value.SetString) * [func (v Value) SetUint(x uint64)](#Value.SetUint) * [func (v Value) SetZero()](#Value.SetZero) * [func (v Value) Slice(i, j int) Value](#Value.Slice) * [func (v Value) Slice3(i, j, k int) Value](#Value.Slice3) * [func (v Value) String() string](#Value.String) * [func (v Value) TryRecv() (x Value, ok bool)](#Value.TryRecv) * [func (v Value) TrySend(x Value) bool](#Value.TrySend) * [func (v Value) Type() Type](#Value.Type) * [func (v Value) Uint() uint64](#Value.Uint) * [func (v Value) UnsafeAddr() 
uintptr](#Value.UnsafeAddr) * [func (v Value) UnsafePointer() unsafe.Pointer](#Value.UnsafePointer) * [type ValueError](#ValueError) * [func (e \*ValueError) Error() string](#ValueError.Error) * [Bugs](#pkg-note-BUG) ### Examples [Kind](#example_Kind) [MakeFunc](#example_MakeFunc) [StructOf](#example_StructOf) [StructTag](#example_StructTag) [StructTag.Lookup](#example_StructTag_Lookup) [TypeOf](#example_TypeOf) [Value.FieldByIndex](#example_Value_FieldByIndex) [Value.FieldByName](#example_Value_FieldByName) ### Package files abi.go deepequal.go float32reg\_generic.go makefunc.go swapper.go type.go value.go visiblefields.go Constants --------- Ptr is the old name for the Pointer kind. ``` const Ptr = Pointer ``` func Copy --------- ``` func Copy(dst, src Value) int ``` Copy copies the contents of src into dst until either dst has been filled or src has been exhausted. It returns the number of elements copied. Dst and src each must have kind Slice or Array, and dst and src must have the same element type. As a special case, src can have kind String if the element type of dst is kind Uint8. func DeepEqual -------------- ``` func DeepEqual(x, y any) bool ``` DeepEqual reports whether x and y are “deeply equal,” defined as follows. Two values of identical type are deeply equal if one of the following cases applies. Values of distinct types are never deeply equal. Array values are deeply equal when their corresponding elements are deeply equal. Struct values are deeply equal if their corresponding fields, both exported and unexported, are deeply equal. Func values are deeply equal if both are nil; otherwise they are not deeply equal. Interface values are deeply equal if they hold deeply equal concrete values. Map values are deeply equal when all of the following are true: they are both nil or both non-nil, they have the same length, and either they are the same map object or their corresponding keys (matched using Go equality) map to deeply equal values. Pointer values are deeply equal if they are equal using Go's == operator or if they point to deeply equal values. Slice values are deeply equal when all of the following are true: they are both nil or both non-nil, they have the same length, and either they point to the same initial entry of the same underlying array (that is, &x[0] == &y[0]) or their corresponding elements (up to length) are deeply equal. Note that a non-nil empty slice and a nil slice (for example, []byte{} and []byte(nil)) are not deeply equal. Other values - numbers, bools, strings, and channels - are deeply equal if they are equal using Go's == operator. In general DeepEqual is a recursive relaxation of Go's == operator. However, this idea is impossible to implement without some inconsistency. Specifically, it is possible for a value to be unequal to itself, either because it is of func type (uncomparable in general) or because it is a floating-point NaN value (not equal to itself in floating-point comparison), or because it is an array, struct, or interface containing such a value. On the other hand, pointer values are always equal to themselves, even if they point at or contain such problematic values, because they compare equal using Go's == operator, and that is a sufficient condition to be deeply equal, regardless of content. DeepEqual has been defined so that the same short-cut applies to slices and maps: if x and y are the same slice or the same map, they are deeply equal regardless of content. As DeepEqual traverses the data values it may find a cycle. 
The second and subsequent times that DeepEqual compares two pointer values that have been compared before, it treats the values as equal rather than examining the values to which they point. This ensures that DeepEqual terminates. func Swapper 1.8 ---------------- ``` func Swapper(slice any) func(i, j int) ``` Swapper returns a function that swaps the elements in the provided slice. Swapper panics if the provided interface is not a slice. type ChanDir ------------ ChanDir represents a channel type's direction. ``` type ChanDir int ``` ``` const ( RecvDir ChanDir = 1 << iota // <-chan SendDir // chan<- BothDir = RecvDir | SendDir // chan ) ``` ### func (ChanDir) String ``` func (d ChanDir) String() string ``` type Kind --------- A Kind represents the specific kind of type that a Type represents. The zero Kind is not a valid kind. ``` type Kind uint ``` ``` const ( Invalid Kind = iota Bool Int Int8 Int16 Int32 Int64 Uint Uint8 Uint16 Uint32 Uint64 Uintptr Float32 Float64 Complex64 Complex128 Array Chan Func Interface Map Pointer Slice String Struct UnsafePointer ) ``` #### Example Code: ``` for _, v := range []any{"hi", 42, func() {}} { switch v := reflect.ValueOf(v); v.Kind() { case reflect.String: fmt.Println(v.String()) case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64: fmt.Println(v.Int()) default: fmt.Printf("unhandled kind %s", v.Kind()) } } ``` Output: ``` hi 42 unhandled kind func ``` ### func (Kind) String ``` func (k Kind) String() string ``` String returns the name of k. type MapIter 1.12 ----------------- A MapIter is an iterator for ranging over a map. See Value.MapRange. ``` type MapIter struct { // contains filtered or unexported fields } ``` ### func (\*MapIter) Key 1.12 ``` func (iter *MapIter) Key() Value ``` Key returns the key of iter's current map entry. ### func (\*MapIter) Next 1.12 ``` func (iter *MapIter) Next() bool ``` Next advances the map iterator and reports whether there is another entry. It returns false when iter is exhausted; subsequent calls to Key, Value, or Next will panic. ### func (\*MapIter) Reset 1.18 ``` func (iter *MapIter) Reset(v Value) ``` Reset modifies iter to iterate over v. It panics if v's Kind is not Map and v is not the zero Value. Reset(Value{}) causes iter to not to refer to any map, which may allow the previously iterated-over map to be garbage collected. ### func (\*MapIter) Value 1.12 ``` func (iter *MapIter) Value() Value ``` Value returns the value of iter's current map entry. type Method ----------- Method represents a single method. ``` type Method struct { // Name is the method name. Name string // PkgPath is the package path that qualifies a lower case (unexported) // method name. It is empty for upper case (exported) method names. // The combination of PkgPath and Name uniquely identifies a method // in a method set. // See https://golang.org/ref/spec#Uniqueness_of_identifiers PkgPath string Type Type // method type Func Value // func with receiver as first argument Index int // index for Type.Method } ``` ### func (Method) IsExported 1.17 ``` func (m Method) IsExported() bool ``` IsExported reports whether the method is exported. type SelectCase 1.1 ------------------- A SelectCase describes a single case in a select operation. The kind of case depends on Dir, the communication direction. If Dir is SelectDefault, the case represents a default case. Chan and Send must be zero Values. If Dir is SelectSend, the case represents a send operation. 
Normally Chan's underlying value must be a channel, and Send's underlying value must be assignable to the channel's element type. As a special case, if Chan is a zero Value, then the case is ignored, and the field Send will also be ignored and may be either zero or non-zero. If Dir is SelectRecv, the case represents a receive operation. Normally Chan's underlying value must be a channel and Send must be a zero Value. If Chan is a zero Value, then the case is ignored, but Send must still be a zero Value. When a receive operation is selected, the received Value is returned by Select. ``` type SelectCase struct { Dir SelectDir // direction of case Chan Value // channel to use (for send or receive) Send Value // value to send (for send) } ``` type SelectDir 1.1 ------------------ A SelectDir describes the communication direction of a select case. ``` type SelectDir int ``` ``` const ( SelectSend SelectDir // case Chan <- Send SelectRecv // case <-Chan: SelectDefault // default ) ``` type SliceHeader ---------------- SliceHeader is the runtime representation of a slice. It cannot be used safely or portably and its representation may change in a later release. Moreover, the Data field is not sufficient to guarantee the data it references will not be garbage collected, so programs must keep a separate, correctly typed pointer to the underlying data. In new code, use unsafe.Slice or unsafe.SliceData instead. ``` type SliceHeader struct { Data uintptr Len int Cap int } ``` type StringHeader ----------------- StringHeader is the runtime representation of a string. It cannot be used safely or portably and its representation may change in a later release. Moreover, the Data field is not sufficient to guarantee the data it references will not be garbage collected, so programs must keep a separate, correctly typed pointer to the underlying data. In new code, use unsafe.String or unsafe.StringData instead. ``` type StringHeader struct { Data uintptr Len int } ``` type StructField ---------------- A StructField describes a single field in a struct. ``` type StructField struct { // Name is the field name. Name string // PkgPath is the package path that qualifies a lower case (unexported) // field name. It is empty for upper case (exported) field names. // See https://golang.org/ref/spec#Uniqueness_of_identifiers PkgPath string Type Type // field type Tag StructTag // field tag string Offset uintptr // offset within struct, in bytes Index []int // index sequence for Type.FieldByIndex Anonymous bool // is an embedded field } ``` ### func VisibleFields 1.17 ``` func VisibleFields(t Type) []StructField ``` VisibleFields returns all the visible fields in t, which must be a struct type. A field is defined as visible if it's accessible directly with a FieldByName call. The returned fields include fields inside anonymous struct members and unexported fields. They follow the same order found in the struct, with anonymous fields followed immediately by their promoted fields. For each element e of the returned slice, the corresponding field can be retrieved from a value v of type t by calling v.FieldByIndex(e.Index). ### func (StructField) IsExported 1.17 ``` func (f StructField) IsExported() bool ``` IsExported reports whether the field is exported. type StructTag -------------- A StructTag is the tag string in a struct field. By convention, tag strings are a concatenation of optionally space-separated key:"value" pairs. 
Each key is a non-empty string consisting of non-control characters other than space (U+0020 ' '), quote (U+0022 '"'), and colon (U+003A ':'). Each value is quoted using U+0022 '"' characters and Go string literal syntax. ``` type StructTag string ``` #### Example Code: ``` type S struct { F string `species:"gopher" color:"blue"` } s := S{} st := reflect.TypeOf(s) field := st.Field(0) fmt.Println(field.Tag.Get("color"), field.Tag.Get("species")) ``` Output: ``` blue gopher ``` ### func (StructTag) Get ``` func (tag StructTag) Get(key string) string ``` Get returns the value associated with key in the tag string. If there is no such key in the tag, Get returns the empty string. If the tag does not have the conventional format, the value returned by Get is unspecified. To determine whether a tag is explicitly set to the empty string, use Lookup. ### func (StructTag) Lookup 1.7 ``` func (tag StructTag) Lookup(key string) (value string, ok bool) ``` Lookup returns the value associated with key in the tag string. If the key is present in the tag the value (which may be empty) is returned. Otherwise the returned value will be the empty string. The ok return value reports whether the value was explicitly set in the tag string. If the tag does not have the conventional format, the value returned by Lookup is unspecified. #### Example Code: ``` type S struct { F0 string `alias:"field_0"` F1 string `alias:""` F2 string } s := S{} st := reflect.TypeOf(s) for i := 0; i < st.NumField(); i++ { field := st.Field(i) if alias, ok := field.Tag.Lookup("alias"); ok { if alias == "" { fmt.Println("(blank)") } else { fmt.Println(alias) } } else { fmt.Println("(not specified)") } } ``` Output: ``` field_0 (blank) (not specified) ``` type Type --------- Type is the representation of a Go type. Not all methods apply to all kinds of types. Restrictions, if any, are noted in the documentation for each method. Use the Kind method to find out the kind of type before calling kind-specific methods. Calling a method inappropriate to the kind of type causes a run-time panic. Type values are comparable, such as with the == operator, so they can be used as map keys. Two Type values are equal if they represent identical types. ``` type Type interface { // Align returns the alignment in bytes of a value of // this type when allocated in memory. Align() int // FieldAlign returns the alignment in bytes of a value of // this type when used as a field in a struct. FieldAlign() int // Method returns the i'th method in the type's method set. // It panics if i is not in the range [0, NumMethod()). // // For a non-interface type T or *T, the returned Method's Type and Func // fields describe a function whose first argument is the receiver, // and only exported methods are accessible. // // For an interface type, the returned Method's Type field gives the // method signature, without a receiver, and the Func field is nil. // // Methods are sorted in lexicographic order. Method(int) Method // MethodByName returns the method with that name in the type's // method set and a boolean indicating if the method was found. // // For a non-interface type T or *T, the returned Method's Type and Func // fields describe a function whose first argument is the receiver. // // For an interface type, the returned Method's Type field gives the // method signature, without a receiver, and the Func field is nil. MethodByName(string) (Method, bool) // NumMethod returns the number of methods accessible using Method. 
// // For a non-interface type, it returns the number of exported methods. // // For an interface type, it returns the number of exported and unexported methods. NumMethod() int // Name returns the type's name within its package for a defined type. // For other (non-defined) types it returns the empty string. Name() string // PkgPath returns a defined type's package path, that is, the import path // that uniquely identifies the package, such as "encoding/base64". // If the type was predeclared (string, error) or not defined (*T, struct{}, // []int, or A where A is an alias for a non-defined type), the package path // will be the empty string. PkgPath() string // Size returns the number of bytes needed to store // a value of the given type; it is analogous to unsafe.Sizeof. Size() uintptr // String returns a string representation of the type. // The string representation may use shortened package names // (e.g., base64 instead of "encoding/base64") and is not // guaranteed to be unique among types. To test for type identity, // compare the Types directly. String() string // Kind returns the specific kind of this type. Kind() Kind // Implements reports whether the type implements the interface type u. Implements(u Type) bool // AssignableTo reports whether a value of the type is assignable to type u. AssignableTo(u Type) bool // ConvertibleTo reports whether a value of the type is convertible to type u. // Even if ConvertibleTo returns true, the conversion may still panic. // For example, a slice of type []T is convertible to *[N]T, // but the conversion will panic if its length is less than N. ConvertibleTo(u Type) bool // Comparable reports whether values of this type are comparable. // Even if Comparable returns true, the comparison may still panic. // For example, values of interface type are comparable, // but the comparison will panic if their dynamic type is not comparable. Comparable() bool // Bits returns the size of the type in bits. // It panics if the type's Kind is not one of the // sized or unsized Int, Uint, Float, or Complex kinds. Bits() int // ChanDir returns a channel type's direction. // It panics if the type's Kind is not Chan. ChanDir() ChanDir // IsVariadic reports whether a function type's final input parameter // is a "..." parameter. If so, t.In(t.NumIn() - 1) returns the parameter's // implicit actual type []T. // // For concreteness, if t represents func(x int, y ... float64), then // // t.NumIn() == 2 // t.In(0) is the reflect.Type for "int" // t.In(1) is the reflect.Type for "[]float64" // t.IsVariadic() == true // // IsVariadic panics if the type's Kind is not Func. IsVariadic() bool // Elem returns a type's element type. // It panics if the type's Kind is not Array, Chan, Map, Pointer, or Slice. Elem() Type // Field returns a struct type's i'th field. // It panics if the type's Kind is not Struct. // It panics if i is not in the range [0, NumField()). Field(i int) StructField // FieldByIndex returns the nested field corresponding // to the index sequence. It is equivalent to calling Field // successively for each index i. // It panics if the type's Kind is not Struct. FieldByIndex(index []int) StructField // FieldByName returns the struct field with the given name // and a boolean indicating if the field was found. FieldByName(name string) (StructField, bool) // FieldByNameFunc returns the struct field with a name // that satisfies the match function and a boolean indicating if // the field was found. 
// // FieldByNameFunc considers the fields in the struct itself // and then the fields in any embedded structs, in breadth first order, // stopping at the shallowest nesting depth containing one or more // fields satisfying the match function. If multiple fields at that depth // satisfy the match function, they cancel each other // and FieldByNameFunc returns no match. // This behavior mirrors Go's handling of name lookup in // structs containing embedded fields. FieldByNameFunc(match func(string) bool) (StructField, bool) // In returns the type of a function type's i'th input parameter. // It panics if the type's Kind is not Func. // It panics if i is not in the range [0, NumIn()). In(i int) Type // Key returns a map type's key type. // It panics if the type's Kind is not Map. Key() Type // Len returns an array type's length. // It panics if the type's Kind is not Array. Len() int // NumField returns a struct type's field count. // It panics if the type's Kind is not Struct. NumField() int // NumIn returns a function type's input parameter count. // It panics if the type's Kind is not Func. NumIn() int // NumOut returns a function type's output parameter count. // It panics if the type's Kind is not Func. NumOut() int // Out returns the type of a function type's i'th output parameter. // It panics if the type's Kind is not Func. // It panics if i is not in the range [0, NumOut()). Out(i int) Type // contains filtered or unexported methods } ``` ### func ArrayOf 1.5 ``` func ArrayOf(length int, elem Type) Type ``` ArrayOf returns the array type with the given length and element type. For example, if t represents int, ArrayOf(5, t) represents [5]int. If the resulting type would be larger than the available address space, ArrayOf panics. ### func ChanOf 1.1 ``` func ChanOf(dir ChanDir, t Type) Type ``` ChanOf returns the channel type with the given direction and element type. For example, if t represents int, ChanOf(RecvDir, t) represents <-chan int. The gc runtime imposes a limit of 64 kB on channel element types. If t's size is equal to or exceeds this limit, ChanOf panics. ### func FuncOf 1.5 ``` func FuncOf(in, out []Type, variadic bool) Type ``` FuncOf returns the function type with the given argument and result types. For example if k represents int and e represents string, FuncOf([]Type{k}, []Type{e}, false) represents func(int) string. The variadic argument controls whether the function is variadic. FuncOf panics if the in[len(in)-1] does not represent a slice and variadic is true. ### func MapOf 1.1 ``` func MapOf(key, elem Type) Type ``` MapOf returns the map type with the given key and element types. For example, if k represents int and e represents string, MapOf(k, e) represents map[int]string. If the key type is not a valid map key type (that is, if it does not implement Go's == operator), MapOf panics. ### func PointerTo 1.18 ``` func PointerTo(t Type) Type ``` PointerTo returns the pointer type with element t. For example, if t represents type Foo, PointerTo(t) represents \*Foo. ### func PtrTo ``` func PtrTo(t Type) Type ``` PtrTo returns the pointer type with element t. For example, if t represents type Foo, PtrTo(t) represents \*Foo. PtrTo is the old spelling of PointerTo. The two functions behave identically. ### func SliceOf 1.1 ``` func SliceOf(t Type) Type ``` SliceOf returns the slice type with element type t. For example, if t represents int, SliceOf(t) represents []int. 
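These constructors can be combined: a Type returned by one can serve as the element, key, or parameter type of another. The following is an illustrative sketch (not part of the original documentation) that builds several types at run time and prints their string representations:

```
package main

import (
	"fmt"
	"reflect"
)

func main() {
	intType := reflect.TypeOf(0)     // int
	stringType := reflect.TypeOf("") // string

	fmt.Println(reflect.SliceOf(intType))                 // []int
	fmt.Println(reflect.ArrayOf(5, intType))              // [5]int
	fmt.Println(reflect.MapOf(stringType, intType))       // map[string]int
	fmt.Println(reflect.ChanOf(reflect.RecvDir, intType)) // <-chan int
	fmt.Println(reflect.PointerTo(intType))               // *int

	// FuncOf(in, out, variadic) builds a function type.
	errType := reflect.TypeOf((*error)(nil)).Elem()
	fmt.Println(reflect.FuncOf(
		[]reflect.Type{stringType},
		[]reflect.Type{intType, errType},
		false,
	)) // func(string) (int, error)
}
```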
### func StructOf 1.7 ``` func StructOf(fields []StructField) Type ``` StructOf returns the struct type containing fields. The Offset and Index fields are ignored and computed as they would be by the compiler. StructOf currently does not generate wrapper methods for embedded fields and panics if passed unexported StructFields. These limitations may be lifted in a future version. #### Example Code: ``` typ := reflect.StructOf([]reflect.StructField{ { Name: "Height", Type: reflect.TypeOf(float64(0)), Tag: `json:"height"`, }, { Name: "Age", Type: reflect.TypeOf(int(0)), Tag: `json:"age"`, }, }) v := reflect.New(typ).Elem() v.Field(0).SetFloat(0.4) v.Field(1).SetInt(2) s := v.Addr().Interface() w := new(bytes.Buffer) if err := json.NewEncoder(w).Encode(s); err != nil { panic(err) } fmt.Printf("value: %+v\n", s) fmt.Printf("json: %s", w.Bytes()) r := bytes.NewReader([]byte(`{"height":1.5,"age":10}`)) if err := json.NewDecoder(r).Decode(s); err != nil { panic(err) } fmt.Printf("value: %+v\n", s) ``` Output: ``` value: &{Height:0.4 Age:2} json: {"height":0.4,"age":2} value: &{Height:1.5 Age:10} ``` ### func TypeOf ``` func TypeOf(i any) Type ``` TypeOf returns the reflection Type that represents the dynamic type of i. If i is a nil interface value, TypeOf returns nil. #### Example Code: ``` // As interface types are only used for static typing, a // common idiom to find the reflection Type for an interface // type Foo is to use a *Foo value. writerType := reflect.TypeOf((*io.Writer)(nil)).Elem() fileType := reflect.TypeOf((*os.File)(nil)) fmt.Println(fileType.Implements(writerType)) ``` Output: ``` true ``` type Value ---------- Value is the reflection interface to a Go value. Not all methods apply to all kinds of values. Restrictions, if any, are noted in the documentation for each method. Use the Kind method to find out the kind of value before calling kind-specific methods. Calling a method inappropriate to the kind of type causes a run time panic. The zero Value represents no value. Its IsValid method returns false, its Kind method returns Invalid, its String method returns "<invalid Value>", and all other methods panic. Most functions and methods never return an invalid value. If one does, its documentation states the conditions explicitly. A Value can be used concurrently by multiple goroutines provided that the underlying Go value can be used concurrently for the equivalent direct operations. To compare two Values, compare the results of the Interface method. Using == on two Values does not compare the underlying values they represent. ``` type Value struct { // contains filtered or unexported fields } ``` ### func Append ``` func Append(s Value, x ...Value) Value ``` Append appends the values x to a slice s and returns the resulting slice. As in Go, each x's value must be assignable to the slice's element type. ### func AppendSlice ``` func AppendSlice(s, t Value) Value ``` AppendSlice appends a slice t to a slice s and returns the resulting slice. The slices s and t must have the same element type. ### func Indirect ``` func Indirect(v Value) Value ``` Indirect returns the value that v points to. If v is a nil pointer, Indirect returns a zero Value. If v is not a pointer, Indirect returns v. ### func MakeChan ``` func MakeChan(typ Type, buffer int) Value ``` MakeChan creates a new channel with the specified type and buffer size. 
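A brief sketch (not part of the original documentation) showing Append, AppendSlice, Indirect, and MakeChan together:

```
package main

import (
	"fmt"
	"reflect"
)

func main() {
	// Append and AppendSlice mirror the built-in append.
	s := reflect.ValueOf([]int{1, 2})
	s = reflect.Append(s, reflect.ValueOf(3))
	s = reflect.AppendSlice(s, reflect.ValueOf([]int{4, 5}))
	fmt.Println(s.Interface()) // [1 2 3 4 5]

	// Indirect follows a pointer; for a non-pointer it returns the value unchanged.
	x := 7
	fmt.Println(reflect.Indirect(reflect.ValueOf(&x)).Int()) // 7

	// MakeChan needs a bidirectional channel type and a buffer size.
	ch := reflect.MakeChan(reflect.TypeOf(make(chan string)), 1)
	fmt.Println(ch.Type(), ch.Cap()) // chan string 1
}
```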
### func MakeFunc 1.1 ``` func MakeFunc(typ Type, fn func(args []Value) (results []Value)) Value ``` MakeFunc returns a new function of the given Type that wraps the function fn. When called, that new function does the following: * converts its arguments to a slice of Values. * runs results := fn(args). * returns the results as a slice of Values, one per formal result. The implementation fn can assume that the argument Value slice has the number and type of arguments given by typ. If typ describes a variadic function, the final Value is itself a slice representing the variadic arguments, as in the body of a variadic function. The result Value slice returned by fn must have the number and type of results given by typ. The Value.Call method allows the caller to invoke a typed function in terms of Values; in contrast, MakeFunc allows the caller to implement a typed function in terms of Values. The Examples section of the documentation includes an illustration of how to use MakeFunc to build a swap function for different types. #### Example Code: ``` // swap is the implementation passed to MakeFunc. // It must work in terms of reflect.Values so that it is possible // to write code without knowing beforehand what the types // will be. swap := func(in []reflect.Value) []reflect.Value { return []reflect.Value{in[1], in[0]} } // makeSwap expects fptr to be a pointer to a nil function. // It sets that pointer to a new function created with MakeFunc. // When the function is invoked, reflect turns the arguments // into Values, calls swap, and then turns swap's result slice // into the values returned by the new function. makeSwap := func(fptr any) { // fptr is a pointer to a function. // Obtain the function value itself (likely nil) as a reflect.Value // so that we can query its type and then set the value. fn := reflect.ValueOf(fptr).Elem() // Make a function of the right type. v := reflect.MakeFunc(fn.Type(), swap) // Assign it to the value fn represents. fn.Set(v) } // Make and call a swap function for ints. var intSwap func(int, int) (int, int) makeSwap(&intSwap) fmt.Println(intSwap(0, 1)) // Make and call a swap function for float64s. var floatSwap func(float64, float64) (float64, float64) makeSwap(&floatSwap) fmt.Println(floatSwap(2.72, 3.14)) ``` Output: ``` 1 0 3.14 2.72 ``` ### func MakeMap ``` func MakeMap(typ Type) Value ``` MakeMap creates a new map with the specified type. ### func MakeMapWithSize 1.9 ``` func MakeMapWithSize(typ Type, n int) Value ``` MakeMapWithSize creates a new map with the specified type and initial space for approximately n elements. ### func MakeSlice ``` func MakeSlice(typ Type, len, cap int) Value ``` MakeSlice creates a new zero-initialized slice value for the specified slice type, length, and capacity. ### func New ``` func New(typ Type) Value ``` New returns a Value representing a pointer to a new zero value for the specified type. That is, the returned Value's Type is PointerTo(typ). ### func NewAt ``` func NewAt(typ Type, p unsafe.Pointer) Value ``` NewAt returns a Value representing a pointer to a value of the specified type, using p as that pointer. ### func Select 1.1 ``` func Select(cases []SelectCase) (chosen int, recv Value, recvOK bool) ``` Select executes a select operation described by the list of cases. Like the Go select statement, it blocks until at least one of the cases can proceed, makes a uniform pseudo-random choice, and then executes that case. 
It returns the index of the chosen case and, if that case was a receive operation, the value received and a boolean indicating whether the value corresponds to a send on the channel (as opposed to a zero value received because the channel is closed). Select supports a maximum of 65536 cases. ### func ValueOf ``` func ValueOf(i any) Value ``` ValueOf returns a new Value initialized to the concrete value stored in the interface i. ValueOf(nil) returns the zero Value. ### func Zero ``` func Zero(typ Type) Value ``` Zero returns a Value representing the zero value for the specified type. The result is different from the zero value of the Value struct, which represents no value at all. For example, Zero(TypeOf(42)) returns a Value with Kind Int and value 0. The returned value is neither addressable nor settable. ### func (Value) Addr ``` func (v Value) Addr() Value ``` Addr returns a pointer value representing the address of v. It panics if CanAddr() returns false. Addr is typically used to obtain a pointer to a struct field or slice element in order to call a method that requires a pointer receiver. ### func (Value) Bool ``` func (v Value) Bool() bool ``` Bool returns v's underlying value. It panics if v's kind is not Bool. ### func (Value) Bytes ``` func (v Value) Bytes() []byte ``` Bytes returns v's underlying value. It panics if v's underlying value is not a slice of bytes or an addressable array of bytes. ### func (Value) Call ``` func (v Value) Call(in []Value) []Value ``` Call calls the function v with the input arguments in. For example, if len(in) == 3, v.Call(in) represents the Go call v(in[0], in[1], in[2]). Call panics if v's Kind is not Func. It returns the output results as Values. As in Go, each input argument must be assignable to the type of the function's corresponding input parameter. If v is a variadic function, Call creates the variadic slice parameter itself, copying in the corresponding values. ### func (Value) CallSlice ``` func (v Value) CallSlice(in []Value) []Value ``` CallSlice calls the variadic function v with the input arguments in, assigning the slice in[len(in)-1] to v's final variadic argument. For example, if len(in) == 3, v.CallSlice(in) represents the Go call v(in[0], in[1], in[2]...). CallSlice panics if v's Kind is not Func or if v is not variadic. It returns the output results as Values. As in Go, each input argument must be assignable to the type of the function's corresponding input parameter. ### func (Value) CanAddr ``` func (v Value) CanAddr() bool ``` CanAddr reports whether the value's address can be obtained with Addr. Such values are called addressable. A value is addressable if it is an element of a slice, an element of an addressable array, a field of an addressable struct, or the result of dereferencing a pointer. If CanAddr returns false, calling Addr will panic. ### func (Value) CanComplex 1.18 ``` func (v Value) CanComplex() bool ``` CanComplex reports whether Complex can be used without panicking. ### func (Value) CanConvert 1.17 ``` func (v Value) CanConvert(t Type) bool ``` CanConvert reports whether the value v can be converted to type t. If v.CanConvert(t) returns true then v.Convert(t) will not panic. ### func (Value) CanFloat 1.18 ``` func (v Value) CanFloat() bool ``` CanFloat reports whether Float can be used without panicking. ### func (Value) CanInt 1.18 ``` func (v Value) CanInt() bool ``` CanInt reports whether Int can be used without panicking. 
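As an aside (not from the original documentation), a minimal sketch of invoking a function through Call and of guarding a conversion with CanConvert:

```
package main

import (
	"fmt"
	"reflect"
	"strings"
)

func main() {
	// Call invokes the function held by a Value; arguments are passed as Values.
	repeat := reflect.ValueOf(strings.Repeat)
	out := repeat.Call([]reflect.Value{
		reflect.ValueOf("ab"),
		reflect.ValueOf(3),
	})
	fmt.Println(out[0].String()) // ababab

	// CanConvert guards Convert against a panic.
	v := reflect.ValueOf(3.9)
	if t := reflect.TypeOf(int(0)); v.CanConvert(t) {
		fmt.Println(v.Convert(t).Int()) // 3 (conversion truncates toward zero)
	}
}
```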
### func (Value) CanInterface ``` func (v Value) CanInterface() bool ``` CanInterface reports whether Interface can be used without panicking. ### func (Value) CanSet ``` func (v Value) CanSet() bool ``` CanSet reports whether the value of v can be changed. A Value can be changed only if it is addressable and was not obtained by the use of unexported struct fields. If CanSet returns false, calling Set or any type-specific setter (e.g., SetBool, SetInt) will panic. ### func (Value) CanUint 1.18 ``` func (v Value) CanUint() bool ``` CanUint reports whether Uint can be used without panicking. ### func (Value) Cap ``` func (v Value) Cap() int ``` Cap returns v's capacity. It panics if v's Kind is not Array, Chan, Slice or pointer to Array. ### func (Value) Close ``` func (v Value) Close() ``` Close closes the channel v. It panics if v's Kind is not Chan. ### func (Value) Comparable 1.20 ``` func (v Value) Comparable() bool ``` Comparable reports whether the value v is comparable. If the type of v is an interface, this checks the dynamic type. If this reports true then v.Interface() == x will not panic for any x, nor will v.Equal(u) for any Value u. ### func (Value) Complex ``` func (v Value) Complex() complex128 ``` Complex returns v's underlying value, as a complex128. It panics if v's Kind is not Complex64 or Complex128 ### func (Value) Convert 1.1 ``` func (v Value) Convert(t Type) Value ``` Convert returns the value v converted to type t. If the usual Go conversion rules do not allow conversion of the value v to type t, or if converting v to type t panics, Convert panics. ### func (Value) Elem ``` func (v Value) Elem() Value ``` Elem returns the value that the interface v contains or that the pointer v points to. It panics if v's Kind is not Interface or Pointer. It returns the zero Value if v is nil. ### func (Value) Equal 1.20 ``` func (v Value) Equal(u Value) bool ``` Equal reports true if v is equal to u. For two invalid values, Equal will report true. For an interface value, Equal will compare the value within the interface. Otherwise, If the values have different types, Equal will report false. Otherwise, for arrays and structs Equal will compare each element in order, and report false if it finds non-equal elements. During all comparisons, if values of the same type are compared, and the type is not comparable, Equal will panic. ### func (Value) Field ``` func (v Value) Field(i int) Value ``` Field returns the i'th field of the struct v. It panics if v's Kind is not Struct or i is out of range. ### func (Value) FieldByIndex ``` func (v Value) FieldByIndex(index []int) Value ``` FieldByIndex returns the nested field corresponding to index. It panics if evaluation requires stepping through a nil pointer or a field that is not a struct. #### Example Code: ``` // This example shows a case in which the name of a promoted field // is hidden by another field: FieldByName will not work, so // FieldByIndex must be used instead. type user struct { firstName string lastName string } type data struct { user firstName string lastName string } u := data{ user: user{"Embedded John", "Embedded Doe"}, firstName: "John", lastName: "Doe", } s := reflect.ValueOf(u).FieldByIndex([]int{0, 1}) fmt.Println("embedded last name:", s) ``` Output: ``` embedded last name: Embedded Doe ``` ### func (Value) FieldByIndexErr 1.18 ``` func (v Value) FieldByIndexErr(index []int) (Value, error) ``` FieldByIndexErr returns the nested field corresponding to index. 
It returns an error if evaluation requires stepping through a nil pointer, but panics if it must step through a field that is not a struct. ### func (Value) FieldByName ``` func (v Value) FieldByName(name string) Value ``` FieldByName returns the struct field with the given name. It returns the zero Value if no field was found. It panics if v's Kind is not struct. #### Example Code: ``` type user struct { firstName string lastName string } u := user{firstName: "John", lastName: "Doe"} s := reflect.ValueOf(u) fmt.Println("Name:", s.FieldByName("firstName")) ``` Output: ``` Name: John ``` ### func (Value) FieldByNameFunc ``` func (v Value) FieldByNameFunc(match func(string) bool) Value ``` FieldByNameFunc returns the struct field with a name that satisfies the match function. It panics if v's Kind is not struct. It returns the zero Value if no field was found. ### func (Value) Float ``` func (v Value) Float() float64 ``` Float returns v's underlying value, as a float64. It panics if v's Kind is not Float32 or Float64 ### func (Value) Grow 1.20 ``` func (v Value) Grow(n int) ``` Grow increases the slice's capacity, if necessary, to guarantee space for another n elements. After Grow(n), at least n elements can be appended to the slice without another allocation. It panics if v's Kind is not a Slice or if n is negative or too large to allocate the memory. ### func (Value) Index ``` func (v Value) Index(i int) Value ``` Index returns v's i'th element. It panics if v's Kind is not Array, Slice, or String or i is out of range. ### func (Value) Int ``` func (v Value) Int() int64 ``` Int returns v's underlying value, as an int64. It panics if v's Kind is not Int, Int8, Int16, Int32, or Int64. ### func (Value) Interface ``` func (v Value) Interface() (i any) ``` Interface returns v's current value as an interface{}. It is equivalent to: ``` var i interface{} = (v's underlying value) ``` It panics if the Value was obtained by accessing unexported struct fields. ### func (Value) InterfaceData ``` func (v Value) InterfaceData() [2]uintptr ``` InterfaceData returns a pair of unspecified uintptr values. It panics if v's Kind is not Interface. In earlier versions of Go, this function returned the interface's value as a uintptr pair. As of Go 1.4, the implementation of interface values precludes any defined use of InterfaceData. Deprecated: The memory representation of interface values is not compatible with InterfaceData. ### func (Value) IsNil ``` func (v Value) IsNil() bool ``` IsNil reports whether its argument v is nil. The argument must be a chan, func, interface, map, pointer, or slice value; if it is not, IsNil panics. Note that IsNil is not always equivalent to a regular comparison with nil in Go. For example, if v was created by calling ValueOf with an uninitialized interface variable i, i==nil will be true but v.IsNil will panic as v will be the zero Value. ### func (Value) IsValid ``` func (v Value) IsValid() bool ``` IsValid reports whether v represents a value. It returns false if v is the zero Value. If IsValid returns false, all other methods except String panic. Most functions and methods never return an invalid Value. If one does, its documentation states the conditions explicitly. ### func (Value) IsZero 1.13 ``` func (v Value) IsZero() bool ``` IsZero reports whether v is the zero value for its type. It panics if the argument is invalid. ### func (Value) Kind ``` func (v Value) Kind() Kind ``` Kind returns v's Kind. If v is the zero Value (IsValid returns false), Kind returns Invalid. 
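The interplay of CanSet, Elem, IsValid, and IsNil is a frequent source of confusion. The following sketch (not part of the original documentation) shows the usual pattern of reflecting on a pointer to obtain a settable Value:

```
package main

import (
	"fmt"
	"reflect"
)

func main() {
	x := 10

	// A Value obtained from a plain copy of x is not addressable, so not settable.
	fmt.Println(reflect.ValueOf(x).CanSet()) // false

	// Reflect on a pointer and take Elem to get a settable Value for x itself.
	v := reflect.ValueOf(&x).Elem()
	fmt.Println(v.CanAddr(), v.CanSet()) // true true
	v.SetInt(25)
	fmt.Println(x) // 25

	// IsValid and IsNil answer different questions.
	var err error
	fmt.Println(reflect.ValueOf(err).IsValid()) // false: a nil interface yields the zero Value
	var p *int
	fmt.Println(reflect.ValueOf(p).IsNil()) // true: a typed nil pointer
}
```

This mirrors the rule from "The Laws of Reflection": only addressable values not derived from unexported fields can be modified.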
### func (Value) Len ``` func (v Value) Len() int ``` Len returns v's length. It panics if v's Kind is not Array, Chan, Map, Slice, String, or pointer to Array. ### func (Value) MapIndex ``` func (v Value) MapIndex(key Value) Value ``` MapIndex returns the value associated with key in the map v. It panics if v's Kind is not Map. It returns the zero Value if key is not found in the map or if v represents a nil map. As in Go, the key's value must be assignable to the map's key type. ### func (Value) MapKeys ``` func (v Value) MapKeys() []Value ``` MapKeys returns a slice containing all the keys present in the map, in unspecified order. It panics if v's Kind is not Map. It returns an empty slice if v represents a nil map. ### func (Value) MapRange 1.12 ``` func (v Value) MapRange() *MapIter ``` MapRange returns a range iterator for a map. It panics if v's Kind is not Map. Call Next to advance the iterator, and Key/Value to access each entry. Next returns false when the iterator is exhausted. MapRange follows the same iteration semantics as a range statement. Example: ``` iter := reflect.ValueOf(m).MapRange() for iter.Next() { k := iter.Key() v := iter.Value() ... } ``` ### func (Value) Method ``` func (v Value) Method(i int) Value ``` Method returns a function value corresponding to v's i'th method. The arguments to a Call on the returned function should not include a receiver; the returned function will always use v as the receiver. Method panics if i is out of range or if v is a nil interface value. ### func (Value) MethodByName ``` func (v Value) MethodByName(name string) Value ``` MethodByName returns a function value corresponding to the method of v with the given name. The arguments to a Call on the returned function should not include a receiver; the returned function will always use v as the receiver. It returns the zero Value if no method was found. ### func (Value) NumField ``` func (v Value) NumField() int ``` NumField returns the number of fields in the struct v. It panics if v's Kind is not Struct. ### func (Value) NumMethod ``` func (v Value) NumMethod() int ``` NumMethod returns the number of methods in the value's method set. For a non-interface type, it returns the number of exported methods. For an interface type, it returns the number of exported and unexported methods. ### func (Value) OverflowComplex ``` func (v Value) OverflowComplex(x complex128) bool ``` OverflowComplex reports whether the complex128 x cannot be represented by v's type. It panics if v's Kind is not Complex64 or Complex128. ### func (Value) OverflowFloat ``` func (v Value) OverflowFloat(x float64) bool ``` OverflowFloat reports whether the float64 x cannot be represented by v's type. It panics if v's Kind is not Float32 or Float64. ### func (Value) OverflowInt ``` func (v Value) OverflowInt(x int64) bool ``` OverflowInt reports whether the int64 x cannot be represented by v's type. It panics if v's Kind is not Int, Int8, Int16, Int32, or Int64. ### func (Value) OverflowUint ``` func (v Value) OverflowUint(x uint64) bool ``` OverflowUint reports whether the uint64 x cannot be represented by v's type. It panics if v's Kind is not Uint, Uintptr, Uint8, Uint16, Uint32, or Uint64. ### func (Value) Pointer ``` func (v Value) Pointer() uintptr ``` Pointer returns v's value as a uintptr. It panics if v's Kind is not Chan, Func, Map, Pointer, Slice, or UnsafePointer. If v's Kind is Func, the returned pointer is an underlying code pointer, but not necessarily enough to identify a single function uniquely. 
The only guarantee is that the result is zero if and only if v is a nil func Value. If v's Kind is Slice, the returned pointer is to the first element of the slice. If the slice is nil the returned value is 0. If the slice is empty but non-nil the return value is non-zero. It's preferred to use uintptr(Value.UnsafePointer()) to get the equivalent result. ### func (Value) Recv ``` func (v Value) Recv() (x Value, ok bool) ``` Recv receives and returns a value from the channel v. It panics if v's Kind is not Chan. The receive blocks until a value is ready. The boolean value ok is true if the value x corresponds to a send on the channel, false if it is a zero value received because the channel is closed. ### func (Value) Send ``` func (v Value) Send(x Value) ``` Send sends x on the channel v. It panics if v's kind is not Chan or if x's type is not the same type as v's element type. As in Go, x's value must be assignable to the channel's element type. ### func (Value) Set ``` func (v Value) Set(x Value) ``` Set assigns x to the value v. It panics if CanSet returns false. As in Go, x's value must be assignable to v's type and must not be derived from an unexported field. ### func (Value) SetBool ``` func (v Value) SetBool(x bool) ``` SetBool sets v's underlying value. It panics if v's Kind is not Bool or if CanSet() is false. ### func (Value) SetBytes ``` func (v Value) SetBytes(x []byte) ``` SetBytes sets v's underlying value. It panics if v's underlying value is not a slice of bytes. ### func (Value) SetCap 1.2 ``` func (v Value) SetCap(n int) ``` SetCap sets v's capacity to n. It panics if v's Kind is not Slice or if n is smaller than the length or greater than the capacity of the slice. ### func (Value) SetComplex ``` func (v Value) SetComplex(x complex128) ``` SetComplex sets v's underlying value to x. It panics if v's Kind is not Complex64 or Complex128, or if CanSet() is false. ### func (Value) SetFloat ``` func (v Value) SetFloat(x float64) ``` SetFloat sets v's underlying value to x. It panics if v's Kind is not Float32 or Float64, or if CanSet() is false. ### func (Value) SetInt ``` func (v Value) SetInt(x int64) ``` SetInt sets v's underlying value to x. It panics if v's Kind is not Int, Int8, Int16, Int32, or Int64, or if CanSet() is false. ### func (Value) SetIterKey 1.18 ``` func (v Value) SetIterKey(iter *MapIter) ``` SetIterKey assigns to v the key of iter's current map entry. It is equivalent to v.Set(iter.Key()), but it avoids allocating a new Value. As in Go, the key must be assignable to v's type and must not be derived from an unexported field. ### func (Value) SetIterValue 1.18 ``` func (v Value) SetIterValue(iter *MapIter) ``` SetIterValue assigns to v the value of iter's current map entry. It is equivalent to v.Set(iter.Value()), but it avoids allocating a new Value. As in Go, the value must be assignable to v's type and must not be derived from an unexported field. ### func (Value) SetLen ``` func (v Value) SetLen(n int) ``` SetLen sets v's length to n. It panics if v's Kind is not Slice or if n is negative or greater than the capacity of the slice. ### func (Value) SetMapIndex ``` func (v Value) SetMapIndex(key, elem Value) ``` SetMapIndex sets the element associated with key in the map v to elem. It panics if v's Kind is not Map. If elem is the zero Value, SetMapIndex deletes the key from the map. Otherwise if v holds a nil map, SetMapIndex will panic. 
As in Go, key's elem must be assignable to the map's key type, and elem's value must be assignable to the map's elem type. ### func (Value) SetPointer ``` func (v Value) SetPointer(x unsafe.Pointer) ``` SetPointer sets the unsafe.Pointer value v to x. It panics if v's Kind is not UnsafePointer. ### func (Value) SetString ``` func (v Value) SetString(x string) ``` SetString sets v's underlying value to x. It panics if v's Kind is not String or if CanSet() is false. ### func (Value) SetUint ``` func (v Value) SetUint(x uint64) ``` SetUint sets v's underlying value to x. It panics if v's Kind is not Uint, Uintptr, Uint8, Uint16, Uint32, or Uint64, or if CanSet() is false. ### func (Value) SetZero 1.20 ``` func (v Value) SetZero() ``` SetZero sets v to be the zero value of v's type. It panics if CanSet returns false. ### func (Value) Slice ``` func (v Value) Slice(i, j int) Value ``` Slice returns v[i:j]. It panics if v's Kind is not Array, Slice or String, or if v is an unaddressable array, or if the indexes are out of bounds. ### func (Value) Slice3 1.2 ``` func (v Value) Slice3(i, j, k int) Value ``` Slice3 is the 3-index form of the slice operation: it returns v[i:j:k]. It panics if v's Kind is not Array or Slice, or if v is an unaddressable array, or if the indexes are out of bounds. ### func (Value) String ``` func (v Value) String() string ``` String returns the string v's underlying value, as a string. String is a special case because of Go's String method convention. Unlike the other getters, it does not panic if v's Kind is not String. Instead, it returns a string of the form "<T value>" where T is v's type. The fmt package treats Values specially. It does not call their String method implicitly but instead prints the concrete values they hold. ### func (Value) TryRecv ``` func (v Value) TryRecv() (x Value, ok bool) ``` TryRecv attempts to receive a value from the channel v but will not block. It panics if v's Kind is not Chan. If the receive delivers a value, x is the transferred value and ok is true. If the receive cannot finish without blocking, x is the zero Value and ok is false. If the channel is closed, x is the zero value for the channel's element type and ok is false. ### func (Value) TrySend ``` func (v Value) TrySend(x Value) bool ``` TrySend attempts to send x on the channel v but will not block. It panics if v's Kind is not Chan. It reports whether the value was sent. As in Go, x's value must be assignable to the channel's element type. ### func (Value) Type ``` func (v Value) Type() Type ``` Type returns v's type. ### func (Value) Uint ``` func (v Value) Uint() uint64 ``` Uint returns v's underlying value, as a uint64. It panics if v's Kind is not Uint, Uintptr, Uint8, Uint16, Uint32, or Uint64. ### func (Value) UnsafeAddr ``` func (v Value) UnsafeAddr() uintptr ``` UnsafeAddr returns a pointer to v's data, as a uintptr. It panics if v is not addressable. It's preferred to use uintptr(Value.Addr().UnsafePointer()) to get the equivalent result. ### func (Value) UnsafePointer 1.18 ``` func (v Value) UnsafePointer() unsafe.Pointer ``` UnsafePointer returns v's value as a unsafe.Pointer. It panics if v's Kind is not Chan, Func, Map, Pointer, Slice, or UnsafePointer. If v's Kind is Func, the returned pointer is an underlying code pointer, but not necessarily enough to identify a single function uniquely. The only guarantee is that the result is zero if and only if v is a nil func Value. If v's Kind is Slice, the returned pointer is to the first element of the slice. 
If the slice is nil the returned value is nil. If the slice is empty but non-nil the return value is non-nil. type ValueError --------------- A ValueError occurs when a Value method is invoked on a Value that does not support it. Such cases are documented in the description of each method. ``` type ValueError struct { Method string Kind Kind } ``` ### func (\*ValueError) Error ``` func (e *ValueError) Error() string ``` Bugs ---- * ☞ FieldByName and related functions consider struct field names to be equal if the names are equal, even if they are unexported names originating in different packages. The practical effect of this is that the result of t.FieldByName("x") is not well defined if the struct type t contains multiple fields named x (embedded from different packages). FieldByName may return one of the fields named x or may report that there are none. See <https://golang.org/issue/4876> for more details.
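Finally, a short sketch (not part of the original documentation) of iterating over and mutating a map through reflection, using the MapRange and SetMapIndex methods described above:

```
package main

import (
	"fmt"
	"reflect"
)

func main() {
	m := map[string]int{"a": 1, "b": 2}
	v := reflect.ValueOf(m)

	// MapRange iterates like a range statement over the map.
	sum := 0
	for iter := v.MapRange(); iter.Next(); {
		sum += int(iter.Value().Int())
	}
	fmt.Println(sum) // 3

	// SetMapIndex adds or updates an entry; a zero Value elem deletes the key.
	v.SetMapIndex(reflect.ValueOf("c"), reflect.ValueOf(3))
	v.SetMapIndex(reflect.ValueOf("a"), reflect.Value{})
	fmt.Println(m) // map[b:2 c:3]
}
```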
go Package io Package io =========== * `import "io"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package io provides basic interfaces to I/O primitives. Its primary job is to wrap existing implementations of such primitives, such as those in package os, into shared public interfaces that abstract the functionality, plus some other related primitives. Because these interfaces and primitives wrap lower-level operations with various implementations, unless otherwise informed clients should not assume they are safe for parallel execution. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func Copy(dst Writer, src Reader) (written int64, err error)](#Copy) * [func CopyBuffer(dst Writer, src Reader, buf []byte) (written int64, err error)](#CopyBuffer) * [func CopyN(dst Writer, src Reader, n int64) (written int64, err error)](#CopyN) * [func Pipe() (\*PipeReader, \*PipeWriter)](#Pipe) * [func ReadAll(r Reader) ([]byte, error)](#ReadAll) * [func ReadAtLeast(r Reader, buf []byte, min int) (n int, err error)](#ReadAtLeast) * [func ReadFull(r Reader, buf []byte) (n int, err error)](#ReadFull) * [func WriteString(w Writer, s string) (n int, err error)](#WriteString) * [type ByteReader](#ByteReader) * [type ByteScanner](#ByteScanner) * [type ByteWriter](#ByteWriter) * [type Closer](#Closer) * [type LimitedReader](#LimitedReader) * [func (l \*LimitedReader) Read(p []byte) (n int, err error)](#LimitedReader.Read) * [type OffsetWriter](#OffsetWriter) * [func NewOffsetWriter(w WriterAt, off int64) \*OffsetWriter](#NewOffsetWriter) * [func (o \*OffsetWriter) Seek(offset int64, whence int) (int64, error)](#OffsetWriter.Seek) * [func (o \*OffsetWriter) Write(p []byte) (n int, err error)](#OffsetWriter.Write) * [func (o \*OffsetWriter) WriteAt(p []byte, off int64) (n int, err error)](#OffsetWriter.WriteAt) * [type PipeReader](#PipeReader) * [func (r \*PipeReader) Close() error](#PipeReader.Close) * [func (r \*PipeReader) CloseWithError(err error) error](#PipeReader.CloseWithError) * [func (r \*PipeReader) Read(data []byte) (n int, err error)](#PipeReader.Read) * [type PipeWriter](#PipeWriter) * [func (w \*PipeWriter) Close() error](#PipeWriter.Close) * [func (w \*PipeWriter) CloseWithError(err error) error](#PipeWriter.CloseWithError) * [func (w \*PipeWriter) Write(data []byte) (n int, err error)](#PipeWriter.Write) * [type ReadCloser](#ReadCloser) * [func NopCloser(r Reader) ReadCloser](#NopCloser) * [type ReadSeekCloser](#ReadSeekCloser) * [type ReadSeeker](#ReadSeeker) * [type ReadWriteCloser](#ReadWriteCloser) * [type ReadWriteSeeker](#ReadWriteSeeker) * [type ReadWriter](#ReadWriter) * [type Reader](#Reader) * [func LimitReader(r Reader, n int64) Reader](#LimitReader) * [func MultiReader(readers ...Reader) Reader](#MultiReader) * [func TeeReader(r Reader, w Writer) Reader](#TeeReader) * [type ReaderAt](#ReaderAt) * [type ReaderFrom](#ReaderFrom) * [type RuneReader](#RuneReader) * [type RuneScanner](#RuneScanner) * [type SectionReader](#SectionReader) * [func NewSectionReader(r ReaderAt, off int64, n int64) \*SectionReader](#NewSectionReader) * [func (s \*SectionReader) Read(p []byte) (n int, err error)](#SectionReader.Read) * [func (s \*SectionReader) ReadAt(p []byte, off int64) (n int, err error)](#SectionReader.ReadAt) * [func (s \*SectionReader) Seek(offset int64, whence int) (int64, error)](#SectionReader.Seek) * [func (s \*SectionReader) Size() int64](#SectionReader.Size) * [type 
Seeker](#Seeker) * [type StringWriter](#StringWriter) * [type WriteCloser](#WriteCloser) * [type WriteSeeker](#WriteSeeker) * [type Writer](#Writer) * [func MultiWriter(writers ...Writer) Writer](#MultiWriter) * [type WriterAt](#WriterAt) * [type WriterTo](#WriterTo) ### Examples [Copy](#example_Copy) [CopyBuffer](#example_CopyBuffer) [CopyN](#example_CopyN) [LimitReader](#example_LimitReader) [MultiReader](#example_MultiReader) [MultiWriter](#example_MultiWriter) [Pipe](#example_Pipe) [ReadAll](#example_ReadAll) [ReadAtLeast](#example_ReadAtLeast) [ReadFull](#example_ReadFull) [SectionReader](#example_SectionReader) [SectionReader.Read](#example_SectionReader_Read) [SectionReader.ReadAt](#example_SectionReader_ReadAt) [SectionReader.Seek](#example_SectionReader_Seek) [SectionReader.Size](#example_SectionReader_Size) [TeeReader](#example_TeeReader) [WriteString](#example_WriteString) ### Package files io.go multi.go pipe.go Constants --------- Seek whence values. ``` const ( SeekStart = 0 // seek relative to the origin of the file SeekCurrent = 1 // seek relative to the current offset SeekEnd = 2 // seek relative to the end ) ``` Variables --------- EOF is the error returned by Read when no more input is available. (Read must return EOF itself, not an error wrapping EOF, because callers will test for EOF using ==.) Functions should return EOF only to signal a graceful end of input. If the EOF occurs unexpectedly in a structured data stream, the appropriate error is either ErrUnexpectedEOF or some other error giving more detail. ``` var EOF = errors.New("EOF") ``` ErrClosedPipe is the error used for read or write operations on a closed pipe. ``` var ErrClosedPipe = errors.New("io: read/write on closed pipe") ``` ErrNoProgress is returned by some clients of a Reader when many calls to Read have failed to return any data or error, usually the sign of a broken Reader implementation. ``` var ErrNoProgress = errors.New("multiple Read calls return no data or error") ``` ErrShortBuffer means that a read required a longer buffer than was provided. ``` var ErrShortBuffer = errors.New("short buffer") ``` ErrShortWrite means that a write accepted fewer bytes than requested but failed to return an explicit error. ``` var ErrShortWrite = errors.New("short write") ``` ErrUnexpectedEOF means that EOF was encountered in the middle of reading a fixed-size block or data structure. ``` var ErrUnexpectedEOF = errors.New("unexpected EOF") ``` func Copy --------- ``` func Copy(dst Writer, src Reader) (written int64, err error) ``` Copy copies from src to dst until either EOF is reached on src or an error occurs. It returns the number of bytes copied and the first error encountered while copying, if any. A successful Copy returns err == nil, not err == EOF. Because Copy is defined to read from src until EOF, it does not treat an EOF from Read as an error to be reported. If src implements the WriterTo interface, the copy is implemented by calling src.WriteTo(dst). Otherwise, if dst implements the ReaderFrom interface, the copy is implemented by calling dst.ReadFrom(src). 
#### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") if _, err := io.Copy(os.Stdout, r); err != nil { log.Fatal(err) } ``` Output: ``` some io.Reader stream to be read ``` func CopyBuffer 1.5 ------------------- ``` func CopyBuffer(dst Writer, src Reader, buf []byte) (written int64, err error) ``` CopyBuffer is identical to Copy except that it stages through the provided buffer (if one is required) rather than allocating a temporary one. If buf is nil, one is allocated; otherwise if it has zero length, CopyBuffer panics. If either src implements WriterTo or dst implements ReaderFrom, buf will not be used to perform the copy. #### Example Code: ``` r1 := strings.NewReader("first reader\n") r2 := strings.NewReader("second reader\n") buf := make([]byte, 8) // buf is used here... if _, err := io.CopyBuffer(os.Stdout, r1, buf); err != nil { log.Fatal(err) } // ... reused here also. No need to allocate an extra buffer. if _, err := io.CopyBuffer(os.Stdout, r2, buf); err != nil { log.Fatal(err) } ``` Output: ``` first reader second reader ``` func CopyN ---------- ``` func CopyN(dst Writer, src Reader, n int64) (written int64, err error) ``` CopyN copies n bytes (or until an error) from src to dst. It returns the number of bytes copied and the earliest error encountered while copying. On return, written == n if and only if err == nil. If dst implements the ReaderFrom interface, the copy is implemented using it. #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read") if _, err := io.CopyN(os.Stdout, r, 4); err != nil { log.Fatal(err) } ``` Output: ``` some ``` func Pipe --------- ``` func Pipe() (*PipeReader, *PipeWriter) ``` Pipe creates a synchronous in-memory pipe. It can be used to connect code expecting an io.Reader with code expecting an io.Writer. Reads and Writes on the pipe are matched one to one except when multiple Reads are needed to consume a single Write. That is, each Write to the PipeWriter blocks until it has satisfied one or more Reads from the PipeReader that fully consume the written data. The data is copied directly from the Write to the corresponding Read (or Reads); there is no internal buffering. It is safe to call Read and Write in parallel with each other or with Close. Parallel calls to Read and parallel calls to Write are also safe: the individual calls will be gated sequentially. #### Example Code: ``` r, w := io.Pipe() go func() { fmt.Fprint(w, "some io.Reader stream to be read\n") w.Close() }() if _, err := io.Copy(os.Stdout, r); err != nil { log.Fatal(err) } ``` Output: ``` some io.Reader stream to be read ``` func ReadAll 1.16 ----------------- ``` func ReadAll(r Reader) ([]byte, error) ``` ReadAll reads from r until an error or EOF and returns the data it read. A successful call returns err == nil, not err == EOF. Because ReadAll is defined to read from src until EOF, it does not treat an EOF from Read as an error to be reported. #### Example Code: ``` r := strings.NewReader("Go is a general-purpose language designed with systems programming in mind.") b, err := io.ReadAll(r) if err != nil { log.Fatal(err) } fmt.Printf("%s", b) ``` Output: ``` Go is a general-purpose language designed with systems programming in mind. ``` func ReadAtLeast ---------------- ``` func ReadAtLeast(r Reader, buf []byte, min int) (n int, err error) ``` ReadAtLeast reads from r into buf until it has read at least min bytes. It returns the number of bytes copied and an error if fewer bytes were read. 
The error is EOF only if no bytes were read. If an EOF happens after reading fewer than min bytes, ReadAtLeast returns ErrUnexpectedEOF. If min is greater than the length of buf, ReadAtLeast returns ErrShortBuffer. On return, n >= min if and only if err == nil. If r returns an error having read at least min bytes, the error is dropped. #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") buf := make([]byte, 14) if _, err := io.ReadAtLeast(r, buf, 4); err != nil { log.Fatal(err) } fmt.Printf("%s\n", buf) // buffer smaller than minimal read size. shortBuf := make([]byte, 3) if _, err := io.ReadAtLeast(r, shortBuf, 4); err != nil { fmt.Println("error:", err) } // minimal read size bigger than io.Reader stream longBuf := make([]byte, 64) if _, err := io.ReadAtLeast(r, longBuf, 64); err != nil { fmt.Println("error:", err) } ``` Output: ``` some io.Reader error: short buffer error: unexpected EOF ``` func ReadFull ------------- ``` func ReadFull(r Reader, buf []byte) (n int, err error) ``` ReadFull reads exactly len(buf) bytes from r into buf. It returns the number of bytes copied and an error if fewer bytes were read. The error is EOF only if no bytes were read. If an EOF happens after reading some but not all the bytes, ReadFull returns ErrUnexpectedEOF. On return, n == len(buf) if and only if err == nil. If r returns an error having read at least len(buf) bytes, the error is dropped. #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") buf := make([]byte, 4) if _, err := io.ReadFull(r, buf); err != nil { log.Fatal(err) } fmt.Printf("%s\n", buf) // minimal read size bigger than io.Reader stream longBuf := make([]byte, 64) if _, err := io.ReadFull(r, longBuf); err != nil { fmt.Println("error:", err) } ``` Output: ``` some error: unexpected EOF ``` func WriteString ---------------- ``` func WriteString(w Writer, s string) (n int, err error) ``` WriteString writes the contents of the string s to w, which accepts a slice of bytes. If w implements StringWriter, its WriteString method is invoked directly. Otherwise, w.Write is called exactly once. #### Example Code: ``` if _, err := io.WriteString(os.Stdout, "Hello World"); err != nil { log.Fatal(err) } ``` Output: ``` Hello World ``` type ByteReader --------------- ByteReader is the interface that wraps the ReadByte method. ReadByte reads and returns the next byte from the input or any error encountered. If ReadByte returns an error, no input byte was consumed, and the returned byte value is undefined. ReadByte provides an efficient interface for byte-at-time processing. A Reader that does not implement ByteReader can be wrapped using bufio.NewReader to add this method. ``` type ByteReader interface { ReadByte() (byte, error) } ``` type ByteScanner ---------------- ByteScanner is the interface that adds the UnreadByte method to the basic ReadByte method. UnreadByte causes the next call to ReadByte to return the last byte read. If the last operation was not a successful call to ReadByte, UnreadByte may return an error, unread the last byte read (or the byte prior to the last-unread byte), or (in implementations that support the Seeker interface) seek to one byte before the current offset. ``` type ByteScanner interface { ByteReader UnreadByte() error } ``` type ByteWriter 1.1 ------------------- ByteWriter is the interface that wraps the WriteByte method. 
``` type ByteWriter interface { WriteByte(c byte) error } ``` type Closer ----------- Closer is the interface that wraps the basic Close method. The behavior of Close after the first call is undefined. Specific implementations may document their own behavior. ``` type Closer interface { Close() error } ``` type LimitedReader ------------------ A LimitedReader reads from R but limits the amount of data returned to just N bytes. Each call to Read updates N to reflect the new amount remaining. Read returns EOF when N <= 0 or when the underlying R returns EOF. ``` type LimitedReader struct { R Reader // underlying reader N int64 // max bytes remaining } ``` ### func (\*LimitedReader) Read ``` func (l *LimitedReader) Read(p []byte) (n int, err error) ``` type OffsetWriter 1.20 ---------------------- An OffsetWriter maps writes at offset base to offset base+off in the underlying writer. ``` type OffsetWriter struct { // contains filtered or unexported fields } ``` ### func NewOffsetWriter 1.20 ``` func NewOffsetWriter(w WriterAt, off int64) *OffsetWriter ``` NewOffsetWriter returns an OffsetWriter that writes to w starting at offset off. ### func (\*OffsetWriter) Seek 1.20 ``` func (o *OffsetWriter) Seek(offset int64, whence int) (int64, error) ``` ### func (\*OffsetWriter) Write 1.20 ``` func (o *OffsetWriter) Write(p []byte) (n int, err error) ``` ### func (\*OffsetWriter) WriteAt 1.20 ``` func (o *OffsetWriter) WriteAt(p []byte, off int64) (n int, err error) ``` type PipeReader --------------- A PipeReader is the read half of a pipe. ``` type PipeReader struct { // contains filtered or unexported fields } ``` ### func (\*PipeReader) Close ``` func (r *PipeReader) Close() error ``` Close closes the reader; subsequent writes to the write half of the pipe will return the error ErrClosedPipe. ### func (\*PipeReader) CloseWithError ``` func (r *PipeReader) CloseWithError(err error) error ``` CloseWithError closes the reader; subsequent writes to the write half of the pipe will return the error err. CloseWithError never overwrites the previous error if it exists and always returns nil. ### func (\*PipeReader) Read ``` func (r *PipeReader) Read(data []byte) (n int, err error) ``` Read implements the standard Read interface: it reads data from the pipe, blocking until a writer arrives or the write end is closed. If the write end is closed with an error, that error is returned as err; otherwise err is EOF. type PipeWriter --------------- A PipeWriter is the write half of a pipe. ``` type PipeWriter struct { // contains filtered or unexported fields } ``` ### func (\*PipeWriter) Close ``` func (w *PipeWriter) Close() error ``` Close closes the writer; subsequent reads from the read half of the pipe will return no bytes and EOF. ### func (\*PipeWriter) CloseWithError ``` func (w *PipeWriter) CloseWithError(err error) error ``` CloseWithError closes the writer; subsequent reads from the read half of the pipe will return no bytes and the error err, or EOF if err is nil. CloseWithError never overwrites the previous error if it exists and always returns nil. ### func (\*PipeWriter) Write ``` func (w *PipeWriter) Write(data []byte) (n int, err error) ``` Write implements the standard Write interface: it writes data to the pipe, blocking until one or more readers have consumed all the data or the read end is closed. If the read end is closed with an error, that err is returned as err; otherwise err is ErrClosedPipe. 
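As a hedged sketch of the CloseWithError behavior described above (the error text here is invented for the example), closing the write half with an error causes the reader to see that error once the written data has been consumed:

```
package main

import (
	"errors"
	"fmt"
	"io"
)

func main() {
	r, w := io.Pipe()
	go func() {
		fmt.Fprint(w, "partial data")
		// Closing the write half with an error makes the reader's next
		// Read return that error instead of io.EOF.
		w.CloseWithError(errors.New("upstream failed"))
	}()
	b, err := io.ReadAll(r)
	fmt.Printf("%q %v\n", b, err) // "partial data" upstream failed
}
```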
type ReadCloser --------------- ReadCloser is the interface that groups the basic Read and Close methods. ``` type ReadCloser interface { Reader Closer } ``` ### func NopCloser 1.16 ``` func NopCloser(r Reader) ReadCloser ``` NopCloser returns a ReadCloser with a no-op Close method wrapping the provided Reader r. If r implements WriterTo, the returned ReadCloser will implement WriterTo by forwarding calls to r. type ReadSeekCloser 1.16 ------------------------ ReadSeekCloser is the interface that groups the basic Read, Seek and Close methods. ``` type ReadSeekCloser interface { Reader Seeker Closer } ``` type ReadSeeker --------------- ReadSeeker is the interface that groups the basic Read and Seek methods. ``` type ReadSeeker interface { Reader Seeker } ``` type ReadWriteCloser -------------------- ReadWriteCloser is the interface that groups the basic Read, Write and Close methods. ``` type ReadWriteCloser interface { Reader Writer Closer } ``` type ReadWriteSeeker -------------------- ReadWriteSeeker is the interface that groups the basic Read, Write and Seek methods. ``` type ReadWriteSeeker interface { Reader Writer Seeker } ``` type ReadWriter --------------- ReadWriter is the interface that groups the basic Read and Write methods. ``` type ReadWriter interface { Reader Writer } ``` type Reader ----------- Reader is the interface that wraps the basic Read method. Read reads up to len(p) bytes into p. It returns the number of bytes read (0 <= n <= len(p)) and any error encountered. Even if Read returns n < len(p), it may use all of p as scratch space during the call. If some data is available but not len(p) bytes, Read conventionally returns what is available instead of waiting for more. When Read encounters an error or end-of-file condition after successfully reading n > 0 bytes, it returns the number of bytes read. It may return the (non-nil) error from the same call or return the error (and n == 0) from a subsequent call. An instance of this general case is that a Reader returning a non-zero number of bytes at the end of the input stream may return either err == EOF or err == nil. The next Read should return 0, EOF. Callers should always process the n > 0 bytes returned before considering the error err. Doing so correctly handles I/O errors that happen after reading some bytes and also both of the allowed EOF behaviors. Implementations of Read are discouraged from returning a zero byte count with a nil error, except when len(p) == 0. Callers should treat a return of 0 and nil as indicating that nothing happened; in particular it does not indicate EOF. Implementations must not retain p. ``` type Reader interface { Read(p []byte) (n int, err error) } ``` ### func LimitReader ``` func LimitReader(r Reader, n int64) Reader ``` LimitReader returns a Reader that reads from r but stops with EOF after n bytes. The underlying implementation is a \*LimitedReader. #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") lr := io.LimitReader(r, 4) if _, err := io.Copy(os.Stdout, lr); err != nil { log.Fatal(err) } ``` Output: ``` some ``` ### func MultiReader ``` func MultiReader(readers ...Reader) Reader ``` MultiReader returns a Reader that's the logical concatenation of the provided input readers. They're read sequentially. Once all inputs have returned EOF, Read will return EOF. If any of the readers return a non-nil, non-EOF error, Read will return that error. 
#### Example Code: ``` r1 := strings.NewReader("first reader ") r2 := strings.NewReader("second reader ") r3 := strings.NewReader("third reader\n") r := io.MultiReader(r1, r2, r3) if _, err := io.Copy(os.Stdout, r); err != nil { log.Fatal(err) } ``` Output: ``` first reader second reader third reader ``` ### func TeeReader ``` func TeeReader(r Reader, w Writer) Reader ``` TeeReader returns a Reader that writes to w what it reads from r. All reads from r performed through it are matched with corresponding writes to w. There is no internal buffering - the write must complete before the read completes. Any error encountered while writing is reported as a read error. #### Example Code: ``` var r io.Reader = strings.NewReader("some io.Reader stream to be read\n") r = io.TeeReader(r, os.Stdout) // Everything read from r will be copied to stdout. if _, err := io.ReadAll(r); err != nil { log.Fatal(err) } ``` Output: ``` some io.Reader stream to be read ``` type ReaderAt ------------- ReaderAt is the interface that wraps the basic ReadAt method. ReadAt reads len(p) bytes into p starting at offset off in the underlying input source. It returns the number of bytes read (0 <= n <= len(p)) and any error encountered. When ReadAt returns n < len(p), it returns a non-nil error explaining why more bytes were not returned. In this respect, ReadAt is stricter than Read. Even if ReadAt returns n < len(p), it may use all of p as scratch space during the call. If some data is available but not len(p) bytes, ReadAt blocks until either all the data is available or an error occurs. In this respect ReadAt is different from Read. If the n = len(p) bytes returned by ReadAt are at the end of the input source, ReadAt may return either err == EOF or err == nil. If ReadAt is reading from an input source with a seek offset, ReadAt should not affect nor be affected by the underlying seek offset. Clients of ReadAt can execute parallel ReadAt calls on the same input source. Implementations must not retain p. ``` type ReaderAt interface { ReadAt(p []byte, off int64) (n int, err error) } ``` type ReaderFrom --------------- ReaderFrom is the interface that wraps the ReadFrom method. ReadFrom reads data from r until EOF or error. The return value n is the number of bytes read. Any error except EOF encountered during the read is also returned. The Copy function uses ReaderFrom if available. ``` type ReaderFrom interface { ReadFrom(r Reader) (n int64, err error) } ``` type RuneReader --------------- RuneReader is the interface that wraps the ReadRune method. ReadRune reads a single encoded Unicode character and returns the rune and its size in bytes. If no character is available, err will be set. ``` type RuneReader interface { ReadRune() (r rune, size int, err error) } ``` type RuneScanner ---------------- RuneScanner is the interface that adds the UnreadRune method to the basic ReadRune method. UnreadRune causes the next call to ReadRune to return the last rune read. If the last operation was not a successful call to ReadRune, UnreadRune may return an error, unread the last rune read (or the rune prior to the last-unread rune), or (in implementations that support the Seeker interface) seek to the start of the rune before the current offset. ``` type RuneScanner interface { RuneReader UnreadRune() error } ``` type SectionReader ------------------ SectionReader implements Read, Seek, and ReadAt on a section of an underlying ReaderAt. 
``` type SectionReader struct { // contains filtered or unexported fields } ``` #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") s := io.NewSectionReader(r, 5, 17) if _, err := io.Copy(os.Stdout, s); err != nil { log.Fatal(err) } ``` Output: ``` io.Reader stream ``` ### func NewSectionReader ``` func NewSectionReader(r ReaderAt, off int64, n int64) *SectionReader ``` NewSectionReader returns a SectionReader that reads from r starting at offset off and stops with EOF after n bytes. ### func (\*SectionReader) Read ``` func (s *SectionReader) Read(p []byte) (n int, err error) ``` #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") s := io.NewSectionReader(r, 5, 17) buf := make([]byte, 9) if _, err := s.Read(buf); err != nil { log.Fatal(err) } fmt.Printf("%s\n", buf) ``` Output: ``` io.Reader ``` ### func (\*SectionReader) ReadAt ``` func (s *SectionReader) ReadAt(p []byte, off int64) (n int, err error) ``` #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") s := io.NewSectionReader(r, 5, 17) buf := make([]byte, 6) if _, err := s.ReadAt(buf, 10); err != nil { log.Fatal(err) } fmt.Printf("%s\n", buf) ``` Output: ``` stream ``` ### func (\*SectionReader) Seek ``` func (s *SectionReader) Seek(offset int64, whence int) (int64, error) ``` #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") s := io.NewSectionReader(r, 5, 17) if _, err := s.Seek(10, io.SeekStart); err != nil { log.Fatal(err) } if _, err := io.Copy(os.Stdout, s); err != nil { log.Fatal(err) } ``` Output: ``` stream ``` ### func (\*SectionReader) Size ``` func (s *SectionReader) Size() int64 ``` Size returns the size of the section in bytes. #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") s := io.NewSectionReader(r, 5, 17) fmt.Println(s.Size()) ``` Output: ``` 17 ``` type Seeker ----------- Seeker is the interface that wraps the basic Seek method. Seek sets the offset for the next Read or Write to offset, interpreted according to whence: SeekStart means relative to the start of the file, SeekCurrent means relative to the current offset, and SeekEnd means relative to the end (for example, offset = -2 specifies the penultimate byte of the file). Seek returns the new offset relative to the start of the file or an error, if any. Seeking to an offset before the start of the file is an error. Seeking to any positive offset may be allowed, but if the new offset exceeds the size of the underlying object the behavior of subsequent I/O operations is implementation-dependent. ``` type Seeker interface { Seek(offset int64, whence int) (int64, error) } ``` type StringWriter 1.12 ---------------------- StringWriter is the interface that wraps the WriteString method. ``` type StringWriter interface { WriteString(s string) (n int, err error) } ``` type WriteCloser ---------------- WriteCloser is the interface that groups the basic Write and Close methods. ``` type WriteCloser interface { Writer Closer } ``` type WriteSeeker ---------------- WriteSeeker is the interface that groups the basic Write and Seek methods. ``` type WriteSeeker interface { Writer Seeker } ``` type Writer ----------- Writer is the interface that wraps the basic Write method. Write writes len(p) bytes from p to the underlying data stream. It returns the number of bytes written from p (0 <= n <= len(p)) and any error encountered that caused the write to stop early. 
Write must return a non-nil error if it returns n < len(p). Write must not modify the slice data, even temporarily. Implementations must not retain p. ``` type Writer interface { Write(p []byte) (n int, err error) } ``` Discard is a Writer on which all Write calls succeed without doing anything. ``` var Discard Writer = discard{} ``` ### func MultiWriter ``` func MultiWriter(writers ...Writer) Writer ``` MultiWriter creates a writer that duplicates its writes to all the provided writers, similar to the Unix tee(1) command. Each write is written to each listed writer, one at a time. If a listed writer returns an error, that overall write operation stops and returns the error; it does not continue down the list. #### Example Code: ``` r := strings.NewReader("some io.Reader stream to be read\n") var buf1, buf2 strings.Builder w := io.MultiWriter(&buf1, &buf2) if _, err := io.Copy(w, r); err != nil { log.Fatal(err) } fmt.Print(buf1.String()) fmt.Print(buf2.String()) ``` Output: ``` some io.Reader stream to be read some io.Reader stream to be read ``` type WriterAt ------------- WriterAt is the interface that wraps the basic WriteAt method. WriteAt writes len(p) bytes from p to the underlying data stream at offset off. It returns the number of bytes written from p (0 <= n <= len(p)) and any error encountered that caused the write to stop early. WriteAt must return a non-nil error if it returns n < len(p). If WriteAt is writing to a destination with a seek offset, WriteAt should not affect nor be affected by the underlying seek offset. Clients of WriteAt can execute parallel WriteAt calls on the same destination if the ranges do not overlap. Implementations must not retain p. ``` type WriterAt interface { WriteAt(p []byte, off int64) (n int, err error) } ``` type WriterTo ------------- WriterTo is the interface that wraps the WriteTo method. WriteTo writes data to w until there's no more data to write or when an error occurs. The return value n is the number of bytes written. Any error encountered during the write is also returned. The Copy function uses WriterTo if available. ``` type WriterTo interface { WriteTo(w Writer) (n int64, err error) } ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [fs](fs/index) | Package fs defines basic interfaces to a file system. | | [ioutil](ioutil/index) | Package ioutil implements some I/O utility functions. |
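The Writer and WriterAt contracts above require that a short write (n < len(p)) be accompanied by a non-nil error. As a hedged illustration, here is a minimal sketch of a conforming Writer; the cappedWriter type and its 10-byte cap are invented for the example.

```
package main

import (
	"fmt"
	"io"
)

// cappedWriter accepts at most max bytes in total; once the cap is reached
// it returns the short count together with io.ErrShortWrite, satisfying the
// Writer contract that n < len(p) implies a non-nil error.
type cappedWriter struct {
	buf []byte
	max int
}

func (w *cappedWriter) Write(p []byte) (int, error) {
	room := w.max - len(w.buf)
	if room >= len(p) {
		w.buf = append(w.buf, p...)
		return len(p), nil
	}
	w.buf = append(w.buf, p[:room]...)
	return room, io.ErrShortWrite
}

func main() {
	w := &cappedWriter{max: 10}
	n, err := io.WriteString(w, "hello, gophers")
	fmt.Println(n, err, string(w.buf)) // 10 short write hello, gop
}
```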
go Package ioutil Package ioutil =============== * `import "io/ioutil"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package ioutil implements some I/O utility functions. Deprecated: As of Go 1.16, the same functionality is now provided by package io or package os, and those implementations should be preferred in new code. See the specific function documentation for details. Index ----- * [Variables](#pkg-variables) * [func NopCloser(r io.Reader) io.ReadCloser](#NopCloser) * [func ReadAll(r io.Reader) ([]byte, error)](#ReadAll) * [func ReadDir(dirname string) ([]fs.FileInfo, error)](#ReadDir) * [func ReadFile(filename string) ([]byte, error)](#ReadFile) * [func TempDir(dir, pattern string) (name string, err error)](#TempDir) * [func TempFile(dir, pattern string) (f \*os.File, err error)](#TempFile) * [func WriteFile(filename string, data []byte, perm fs.FileMode) error](#WriteFile) ### Examples [ReadAll](#example_ReadAll) [ReadDir](#example_ReadDir) [ReadFile](#example_ReadFile) [TempDir](#example_TempDir) [TempDir (Suffix)](#example_TempDir_suffix) [TempFile](#example_TempFile) [TempFile (Suffix)](#example_TempFile_suffix) [WriteFile](#example_WriteFile) ### Package files ioutil.go tempfile.go Variables --------- Discard is an io.Writer on which all Write calls succeed without doing anything. Deprecated: As of Go 1.16, this value is simply io.Discard. ``` var Discard io.Writer = io.Discard ``` func NopCloser -------------- ``` func NopCloser(r io.Reader) io.ReadCloser ``` NopCloser returns a ReadCloser with a no-op Close method wrapping the provided Reader r. Deprecated: As of Go 1.16, this function simply calls io.NopCloser. func ReadAll ------------ ``` func ReadAll(r io.Reader) ([]byte, error) ``` ReadAll reads from r until an error or EOF and returns the data it read. A successful call returns err == nil, not err == EOF. Because ReadAll is defined to read from src until EOF, it does not treat an EOF from Read as an error to be reported. Deprecated: As of Go 1.16, this function simply calls io.ReadAll. #### Example Code: ``` r := strings.NewReader("Go is a general-purpose language designed with systems programming in mind.") b, err := ioutil.ReadAll(r) if err != nil { log.Fatal(err) } fmt.Printf("%s", b) ``` Output: ``` Go is a general-purpose language designed with systems programming in mind. ``` func ReadDir ------------ ``` func ReadDir(dirname string) ([]fs.FileInfo, error) ``` ReadDir reads the directory named by dirname and returns a list of fs.FileInfo for the directory's contents, sorted by filename. If an error occurs reading the directory, ReadDir returns no directory entries along with the error. Deprecated: As of Go 1.16, os.ReadDir is a more efficient and correct choice: it returns a list of fs.DirEntry instead of fs.FileInfo, and it returns partial results in the case of an error midway through reading a directory. If you must continue obtaining a list of fs.FileInfo, you still can: ``` entries, err := os.ReadDir(dirname) if err != nil { ... } infos := make([]fs.FileInfo, 0, len(entries)) for _, entry := range entries { info, err := entry.Info() if err != nil { ... } infos = append(infos, info) } ``` #### Example Code: ``` files, err := ioutil.ReadDir(".") if err != nil { log.Fatal(err) } for _, file := range files { fmt.Println(file.Name()) } ``` func ReadFile ------------- ``` func ReadFile(filename string) ([]byte, error) ``` ReadFile reads the file named by filename and returns the contents. 
A successful call returns err == nil, not err == EOF. Because ReadFile reads the whole file, it does not treat an EOF from Read as an error to be reported. Deprecated: As of Go 1.16, this function simply calls os.ReadFile. #### Example Code: ``` content, err := ioutil.ReadFile("testdata/hello") if err != nil { log.Fatal(err) } fmt.Printf("File contents: %s", content) ``` Output: ``` File contents: Hello, Gophers! ``` func TempDir ------------ ``` func TempDir(dir, pattern string) (name string, err error) ``` TempDir creates a new temporary directory in the directory dir. The directory name is generated by taking pattern and applying a random string to the end. If pattern includes a "\*", the random string replaces the last "\*". TempDir returns the name of the new directory. If dir is the empty string, TempDir uses the default directory for temporary files (see os.TempDir). Multiple programs calling TempDir simultaneously will not choose the same directory. It is the caller's responsibility to remove the directory when no longer needed. Deprecated: As of Go 1.17, this function simply calls os.MkdirTemp. #### Example Code: ``` content := []byte("temporary file's content") dir, err := ioutil.TempDir("", "example") if err != nil { log.Fatal(err) } defer os.RemoveAll(dir) // clean up tmpfn := filepath.Join(dir, "tmpfile") if err := ioutil.WriteFile(tmpfn, content, 0666); err != nil { log.Fatal(err) } ``` #### Example (Suffix) Code: ``` parentDir := os.TempDir() logsDir, err := ioutil.TempDir(parentDir, "*-logs") if err != nil { log.Fatal(err) } defer os.RemoveAll(logsDir) // clean up // Logs can be cleaned out earlier if needed by searching // for all directories whose suffix ends in *-logs. globPattern := filepath.Join(parentDir, "*-logs") matches, err := filepath.Glob(globPattern) if err != nil { log.Fatalf("Failed to match %q: %v", globPattern, err) } for _, match := range matches { if err := os.RemoveAll(match); err != nil { log.Printf("Failed to remove %q: %v", match, err) } } ``` func TempFile ------------- ``` func TempFile(dir, pattern string) (f *os.File, err error) ``` TempFile creates a new temporary file in the directory dir, opens the file for reading and writing, and returns the resulting \*os.File. The filename is generated by taking pattern and adding a random string to the end. If pattern includes a "\*", the random string replaces the last "\*". If dir is the empty string, TempFile uses the default directory for temporary files (see os.TempDir). Multiple programs calling TempFile simultaneously will not choose the same file. The caller can use f.Name() to find the pathname of the file. It is the caller's responsibility to remove the file when no longer needed. Deprecated: As of Go 1.17, this function simply calls os.CreateTemp. 
#### Example Code: ``` content := []byte("temporary file's content") tmpfile, err := ioutil.TempFile("", "example") if err != nil { log.Fatal(err) } defer os.Remove(tmpfile.Name()) // clean up if _, err := tmpfile.Write(content); err != nil { log.Fatal(err) } if err := tmpfile.Close(); err != nil { log.Fatal(err) } ``` #### Example (Suffix) Code: ``` content := []byte("temporary file's content") tmpfile, err := ioutil.TempFile("", "example.*.txt") if err != nil { log.Fatal(err) } defer os.Remove(tmpfile.Name()) // clean up if _, err := tmpfile.Write(content); err != nil { tmpfile.Close() log.Fatal(err) } if err := tmpfile.Close(); err != nil { log.Fatal(err) } ``` func WriteFile -------------- ``` func WriteFile(filename string, data []byte, perm fs.FileMode) error ``` WriteFile writes data to a file named by filename. If the file does not exist, WriteFile creates it with permissions perm (before umask); otherwise WriteFile truncates it before writing, without changing permissions. Deprecated: As of Go 1.16, this function simply calls os.WriteFile. #### Example Code: ``` message := []byte("Hello, Gophers!") err := ioutil.WriteFile("hello", message, 0644) if err != nil { log.Fatal(err) } ``` go Package fs Package fs =========== * `import "io/fs"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package fs defines basic interfaces to a file system. A file system can be provided by the host operating system but also by other packages. Index ----- * [Variables](#pkg-variables) * [func Glob(fsys FS, pattern string) (matches []string, err error)](#Glob) * [func ReadFile(fsys FS, name string) ([]byte, error)](#ReadFile) * [func ValidPath(name string) bool](#ValidPath) * [func WalkDir(fsys FS, root string, fn WalkDirFunc) error](#WalkDir) * [type DirEntry](#DirEntry) * [func FileInfoToDirEntry(info FileInfo) DirEntry](#FileInfoToDirEntry) * [func ReadDir(fsys FS, name string) ([]DirEntry, error)](#ReadDir) * [type FS](#FS) * [func Sub(fsys FS, dir string) (FS, error)](#Sub) * [type File](#File) * [type FileInfo](#FileInfo) * [func Stat(fsys FS, name string) (FileInfo, error)](#Stat) * [type FileMode](#FileMode) * [func (m FileMode) IsDir() bool](#FileMode.IsDir) * [func (m FileMode) IsRegular() bool](#FileMode.IsRegular) * [func (m FileMode) Perm() FileMode](#FileMode.Perm) * [func (m FileMode) String() string](#FileMode.String) * [func (m FileMode) Type() FileMode](#FileMode.Type) * [type GlobFS](#GlobFS) * [type PathError](#PathError) * [func (e \*PathError) Error() string](#PathError.Error) * [func (e \*PathError) Timeout() bool](#PathError.Timeout) * [func (e \*PathError) Unwrap() error](#PathError.Unwrap) * [type ReadDirFS](#ReadDirFS) * [type ReadDirFile](#ReadDirFile) * [type ReadFileFS](#ReadFileFS) * [type StatFS](#StatFS) * [type SubFS](#SubFS) * [type WalkDirFunc](#WalkDirFunc) ### Examples [WalkDir](#example_WalkDir) ### Package files fs.go glob.go readdir.go readfile.go stat.go sub.go walk.go Variables --------- Generic file system errors. Errors returned by file systems can be tested against these errors using errors.Is. ``` var ( ErrInvalid = errInvalid() // "invalid argument" ErrPermission = errPermission() // "permission denied" ErrExist = errExist() // "file already exists" ErrNotExist = errNotExist() // "file does not exist" ErrClosed = errClosed() // "file already closed" ) ``` SkipAll is used as a return value from WalkDirFuncs to indicate that all remaining files and directories are to be skipped. 
It is not returned as an error by any function. ``` var SkipAll = errors.New("skip everything and stop the walk") ``` SkipDir is used as a return value from WalkDirFuncs to indicate that the directory named in the call is to be skipped. It is not returned as an error by any function. ``` var SkipDir = errors.New("skip this directory") ``` func Glob 1.16 -------------- ``` func Glob(fsys FS, pattern string) (matches []string, err error) ``` Glob returns the names of all files matching pattern or nil if there is no matching file. The syntax of patterns is the same as in path.Match. The pattern may describe hierarchical names such as usr/\*/bin/ed. Glob ignores file system errors such as I/O errors reading directories. The only possible returned error is path.ErrBadPattern, reporting that the pattern is malformed. If fs implements GlobFS, Glob calls fs.Glob. Otherwise, Glob uses ReadDir to traverse the directory tree and look for matches for the pattern. func ReadFile 1.16 ------------------ ``` func ReadFile(fsys FS, name string) ([]byte, error) ``` ReadFile reads the named file from the file system fs and returns its contents. A successful call returns a nil error, not io.EOF. (Because ReadFile reads the whole file, the expected EOF from the final Read is not treated as an error to be reported.) If fs implements ReadFileFS, ReadFile calls fs.ReadFile. Otherwise ReadFile calls fs.Open and uses Read and Close on the returned file. func ValidPath 1.16 ------------------- ``` func ValidPath(name string) bool ``` ValidPath reports whether the given path name is valid for use in a call to Open. Path names passed to open are UTF-8-encoded, unrooted, slash-separated sequences of path elements, like “x/y/z”. Path names must not contain an element that is “.” or “..” or the empty string, except for the special case that the root directory is named “.”. Paths must not start or end with a slash: “/x” and “x/” are invalid. Note that paths are slash-separated on all systems, even Windows. Paths containing other characters such as backslash and colon are accepted as valid, but those characters must never be interpreted by an FS implementation as path element separators. func WalkDir 1.16 ----------------- ``` func WalkDir(fsys FS, root string, fn WalkDirFunc) error ``` WalkDir walks the file tree rooted at root, calling fn for each file or directory in the tree, including root. All errors that arise visiting files and directories are filtered by fn: see the fs.WalkDirFunc documentation for details. The files are walked in lexical order, which makes the output deterministic but requires WalkDir to read an entire directory into memory before proceeding to walk that directory. WalkDir does not follow symbolic links found in directories, but if root itself is a symbolic link, its target will be walked. #### Example Code: ``` root := "/usr/local/go/bin" fileSystem := os.DirFS(root) fs.WalkDir(fileSystem, ".", func(path string, d fs.DirEntry, err error) error { if err != nil { log.Fatal(err) } fmt.Println(path) return nil }) ``` type DirEntry 1.16 ------------------ A DirEntry is an entry read from a directory (using the ReadDir function or a ReadDirFile's ReadDir method). ``` type DirEntry interface { // Name returns the name of the file (or subdirectory) described by the entry. // This name is only the final element of the path (the base name), not the entire path. // For example, Name would return "hello.go" not "home/gopher/hello.go". Name() string // IsDir reports whether the entry describes a directory. 
IsDir() bool // Type returns the type bits for the entry. // The type bits are a subset of the usual FileMode bits, those returned by the FileMode.Type method. Type() FileMode // Info returns the FileInfo for the file or subdirectory described by the entry. // The returned FileInfo may be from the time of the original directory read // or from the time of the call to Info. If the file has been removed or renamed // since the directory read, Info may return an error satisfying errors.Is(err, ErrNotExist). // If the entry denotes a symbolic link, Info reports the information about the link itself, // not the link's target. Info() (FileInfo, error) } ``` ### func FileInfoToDirEntry 1.17 ``` func FileInfoToDirEntry(info FileInfo) DirEntry ``` FileInfoToDirEntry returns a DirEntry that returns information from info. If info is nil, FileInfoToDirEntry returns nil. ### func ReadDir 1.16 ``` func ReadDir(fsys FS, name string) ([]DirEntry, error) ``` ReadDir reads the named directory and returns a list of directory entries sorted by filename. If fs implements ReadDirFS, ReadDir calls fs.ReadDir. Otherwise ReadDir calls fs.Open and uses ReadDir and Close on the returned file. type FS 1.16 ------------ An FS provides access to a hierarchical file system. The FS interface is the minimum implementation required of the file system. A file system may implement additional interfaces, such as ReadFileFS, to provide additional or optimized functionality. ``` type FS interface { // Open opens the named file. // // When Open returns an error, it should be of type *PathError // with the Op field set to "open", the Path field set to name, // and the Err field describing the problem. // // Open should reject attempts to open names that do not satisfy // ValidPath(name), returning a *PathError with Err set to // ErrInvalid or ErrNotExist. Open(name string) (File, error) } ``` ### func Sub 1.16 ``` func Sub(fsys FS, dir string) (FS, error) ``` Sub returns an FS corresponding to the subtree rooted at fsys's dir. If dir is ".", Sub returns fsys unchanged. Otherwise, if fs implements SubFS, Sub returns fsys.Sub(dir). Otherwise, Sub returns a new FS implementation sub that, in effect, implements sub.Open(name) as fsys.Open(path.Join(dir, name)). The implementation also translates calls to ReadDir, ReadFile, and Glob appropriately. Note that Sub(os.DirFS("/"), "prefix") is equivalent to os.DirFS("/prefix") and that neither of them guarantees to avoid operating system accesses outside "/prefix", because the implementation of os.DirFS does not check for symbolic links inside "/prefix" that point to other directories. That is, os.DirFS is not a general substitute for a chroot-style security mechanism, and Sub does not change that fact. type File 1.16 -------------- A File provides access to a single file. The File interface is the minimum implementation required of the file. Directory files should also implement ReadDirFile. A file may implement io.ReaderAt or io.Seeker as optimizations. ``` type File interface { Stat() (FileInfo, error) Read([]byte) (int, error) Close() error } ``` type FileInfo 1.16 ------------------ A FileInfo describes a file and is returned by Stat. 
``` type FileInfo interface { Name() string // base name of the file Size() int64 // length in bytes for regular files; system-dependent for others Mode() FileMode // file mode bits ModTime() time.Time // modification time IsDir() bool // abbreviation for Mode().IsDir() Sys() any // underlying data source (can return nil) } ``` ### func Stat 1.16 ``` func Stat(fsys FS, name string) (FileInfo, error) ``` Stat returns a FileInfo describing the named file from the file system. If fs implements StatFS, Stat calls fs.Stat. Otherwise, Stat opens the file to stat it. type FileMode 1.16 ------------------ A FileMode represents a file's mode and permission bits. The bits have the same definition on all systems, so that information about files can be moved from one system to another portably. Not all bits apply to all systems. The only required bit is ModeDir for directories. ``` type FileMode uint32 ``` The defined file mode bits are the most significant bits of the FileMode. The nine least-significant bits are the standard Unix rwxrwxrwx permissions. The values of these bits should be considered part of the public API and may be used in wire protocols or disk representations: they must not be changed, although new bits might be added. ``` const ( // The single letters are the abbreviations // used by the String method's formatting. ModeDir FileMode = 1 << (32 - 1 - iota) // d: is a directory ModeAppend // a: append-only ModeExclusive // l: exclusive use ModeTemporary // T: temporary file; Plan 9 only ModeSymlink // L: symbolic link ModeDevice // D: device file ModeNamedPipe // p: named pipe (FIFO) ModeSocket // S: Unix domain socket ModeSetuid // u: setuid ModeSetgid // g: setgid ModeCharDevice // c: Unix character device, when ModeDevice is set ModeSticky // t: sticky ModeIrregular // ?: non-regular file; nothing else is known about this file // Mask for the type bits. For regular files, none will be set. ModeType = ModeDir | ModeSymlink | ModeNamedPipe | ModeSocket | ModeDevice | ModeCharDevice | ModeIrregular ModePerm FileMode = 0777 // Unix permission bits ) ``` ### func (FileMode) IsDir 1.16 ``` func (m FileMode) IsDir() bool ``` IsDir reports whether m describes a directory. That is, it tests for the ModeDir bit being set in m. ### func (FileMode) IsRegular 1.16 ``` func (m FileMode) IsRegular() bool ``` IsRegular reports whether m describes a regular file. That is, it tests that no mode type bits are set. ### func (FileMode) Perm 1.16 ``` func (m FileMode) Perm() FileMode ``` Perm returns the Unix permission bits in m (m & ModePerm). ### func (FileMode) String 1.16 ``` func (m FileMode) String() string ``` ### func (FileMode) Type 1.16 ``` func (m FileMode) Type() FileMode ``` Type returns type bits in m (m & ModeType). type GlobFS 1.16 ---------------- A GlobFS is a file system with a Glob method. ``` type GlobFS interface { FS // Glob returns the names of all files matching pattern, // providing an implementation of the top-level // Glob function. Glob(pattern string) ([]string, error) } ``` type PathError 1.16 ------------------- PathError records an error and the operation and file path that caused it. ``` type PathError struct { Op string Path string Err error } ``` ### func (\*PathError) Error 1.16 ``` func (e *PathError) Error() string ``` ### func (\*PathError) Timeout 1.16 ``` func (e *PathError) Timeout() bool ``` Timeout reports whether this error represents a timeout. 
### func (\*PathError) Unwrap 1.16 ``` func (e *PathError) Unwrap() error ``` type ReadDirFS 1.16 ------------------- ReadDirFS is the interface implemented by a file system that provides an optimized implementation of ReadDir. ``` type ReadDirFS interface { FS // ReadDir reads the named directory // and returns a list of directory entries sorted by filename. ReadDir(name string) ([]DirEntry, error) } ``` type ReadDirFile 1.16 --------------------- A ReadDirFile is a directory file whose entries can be read with the ReadDir method. Every directory file should implement this interface. (It is permissible for any file to implement this interface, but if so ReadDir should return an error for non-directories.) ``` type ReadDirFile interface { File // ReadDir reads the contents of the directory and returns // a slice of up to n DirEntry values in directory order. // Subsequent calls on the same file will yield further DirEntry values. // // If n > 0, ReadDir returns at most n DirEntry structures. // In this case, if ReadDir returns an empty slice, it will return // a non-nil error explaining why. // At the end of a directory, the error is io.EOF. // (ReadDir must return io.EOF itself, not an error wrapping io.EOF.) // // If n <= 0, ReadDir returns all the DirEntry values from the directory // in a single slice. In this case, if ReadDir succeeds (reads all the way // to the end of the directory), it returns the slice and a nil error. // If it encounters an error before the end of the directory, // ReadDir returns the DirEntry list read until that point and a non-nil error. ReadDir(n int) ([]DirEntry, error) } ``` type ReadFileFS 1.16 -------------------- ReadFileFS is the interface implemented by a file system that provides an optimized implementation of ReadFile. ``` type ReadFileFS interface { FS // ReadFile reads the named file and returns its contents. // A successful call returns a nil error, not io.EOF. // (Because ReadFile reads the whole file, the expected EOF // from the final Read is not treated as an error to be reported.) // // The caller is permitted to modify the returned byte slice. // This method should return a copy of the underlying data. ReadFile(name string) ([]byte, error) } ``` type StatFS 1.16 ---------------- A StatFS is a file system with a Stat method. ``` type StatFS interface { FS // Stat returns a FileInfo describing the file. // If there is an error, it should be of type *PathError. Stat(name string) (FileInfo, error) } ``` type SubFS 1.16 --------------- A SubFS is a file system with a Sub method. ``` type SubFS interface { FS // Sub returns an FS corresponding to the subtree rooted at dir. Sub(dir string) (FS, error) } ``` type WalkDirFunc 1.16 --------------------- WalkDirFunc is the type of the function called by WalkDir to visit each file or directory. The path argument contains the argument to WalkDir as a prefix. That is, if WalkDir is called with root argument "dir" and finds a file named "a" in that directory, the walk function will be called with argument "dir/a". The d argument is the fs.DirEntry for the named path. The error result returned by the function controls how WalkDir continues. If the function returns the special value SkipDir, WalkDir skips the current directory (path if d.IsDir() is true, otherwise path's parent directory). If the function returns the special value SkipAll, WalkDir skips all remaining files and directories. Otherwise, if the function returns a non-nil error, WalkDir stops entirely and returns that error. 
The err argument reports an error related to path, signaling that WalkDir will not walk into that directory. The function can decide how to handle that error; as described earlier, returning the error will cause WalkDir to stop walking the entire tree. WalkDir calls the function with a non-nil err argument in two cases. First, if the initial fs.Stat on the root directory fails, WalkDir calls the function with path set to root, d set to nil, and err set to the error from fs.Stat. Second, if a directory's ReadDir method fails, WalkDir calls the function with path set to the directory's path, d set to an fs.DirEntry describing the directory, and err set to the error from ReadDir. In this second case, the function is called twice with the path of the directory: the first call is before the directory read is attempted and has err set to nil, giving the function a chance to return SkipDir or SkipAll and avoid the ReadDir entirely. The second call is after a failed ReadDir and reports the error from ReadDir. (If ReadDir succeeds, there is no second call.) The differences between WalkDirFunc compared to filepath.WalkFunc are: * The second argument has type fs.DirEntry instead of fs.FileInfo. * The function is called before reading a directory, to allow SkipDir or SkipAll to bypass the directory read entirely or skip all remaining files and directories respectively. * If a directory read fails, the function is called a second time for that directory to report the error. ``` type WalkDirFunc func(path string, d DirEntry, err error) error ```
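As a hedged sketch of a WalkDirFunc that uses SkipDir and handles the err argument as described above (the root "." and the "testdata" directory name are invented for the example):

```
package main

import (
	"fmt"
	"io/fs"
	"log"
	"os"
)

func main() {
	fileSystem := os.DirFS(".")
	err := fs.WalkDir(fileSystem, ".", func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			// Returning err stops the walk; returning nil instead would
			// ignore the error and continue with the rest of the tree.
			return err
		}
		if d.IsDir() && d.Name() == "testdata" {
			// Skip this directory's contents but keep walking elsewhere.
			return fs.SkipDir
		}
		fmt.Println(path)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}
```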
go Package runtime Package runtime ================ * `import "runtime"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package runtime contains operations that interact with Go's runtime system, such as functions to control goroutines. It also includes the low-level type information used by the reflect package; see reflect's documentation for the programmable interface to the run-time type system. ### Environment Variables The following environment variables ($name or %name%, depending on the host operating system) control the run-time behavior of Go programs. The meanings and use may change from release to release. The GOGC variable sets the initial garbage collection target percentage. A collection is triggered when the ratio of freshly allocated data to live data remaining after the previous collection reaches this percentage. The default is GOGC=100. Setting GOGC=off disables the garbage collector entirely. runtime/debug.SetGCPercent allows changing this percentage at run time. The GOMEMLIMIT variable sets a soft memory limit for the runtime. This memory limit includes the Go heap and all other memory managed by the runtime, and excludes external memory sources such as mappings of the binary itself, memory managed in other languages, and memory held by the operating system on behalf of the Go program. GOMEMLIMIT is a numeric value in bytes with an optional unit suffix. The supported suffixes include B, KiB, MiB, GiB, and TiB. These suffixes represent quantities of bytes as defined by the IEC 80000-13 standard. That is, they are based on powers of two: KiB means 2^10 bytes, MiB means 2^20 bytes, and so on. The default setting is math.MaxInt64, which effectively disables the memory limit. runtime/debug.SetMemoryLimit allows changing this limit at run time. The GODEBUG variable controls debugging variables within the runtime. It is a comma-separated list of name=val pairs setting these named variables: ``` allocfreetrace: setting allocfreetrace=1 causes every allocation to be profiled and a stack trace printed on each object's allocation and free. clobberfree: setting clobberfree=1 causes the garbage collector to clobber the memory content of an object with bad content when it frees the object. cpu.*: cpu.all=off disables the use of all optional instruction set extensions. cpu.extension=off disables use of instructions from the specified instruction set extension. extension is the lower case name for the instruction set extension such as sse41 or avx as listed in internal/cpu package. As an example cpu.avx=off disables runtime detection and thereby use of AVX instructions. cgocheck: setting cgocheck=0 disables all checks for packages using cgo to incorrectly pass Go pointers to non-Go code. Setting cgocheck=1 (the default) enables relatively cheap checks that may miss some errors. Setting cgocheck=2 enables expensive checks that should not miss any errors, but will cause your program to run slower. efence: setting efence=1 causes the allocator to run in a mode where each object is allocated on a unique page and addresses are never recycled. gccheckmark: setting gccheckmark=1 enables verification of the garbage collector's concurrent mark phase by performing a second mark pass while the world is stopped. If the second pass finds a reachable object that was not found by concurrent mark, the garbage collector will panic. 
gcpacertrace: setting gcpacertrace=1 causes the garbage collector to print information about the internal state of the concurrent pacer. gcshrinkstackoff: setting gcshrinkstackoff=1 disables moving goroutines onto smaller stacks. In this mode, a goroutine's stack can only grow. gcstoptheworld: setting gcstoptheworld=1 disables concurrent garbage collection, making every garbage collection a stop-the-world event. Setting gcstoptheworld=2 also disables concurrent sweeping after the garbage collection finishes. gctrace: setting gctrace=1 causes the garbage collector to emit a single line to standard error at each collection, summarizing the amount of memory collected and the length of the pause. The format of this line is subject to change. Currently, it is: gc # @#s #%: #+#+# ms clock, #+#/#/#+# ms cpu, #->#-># MB, # MB goal, # MB stacks, #MB globals, # P where the fields are as follows: gc # the GC number, incremented at each GC @#s time in seconds since program start #% percentage of time spent in GC since program start #+...+# wall-clock/CPU times for the phases of the GC #->#-># MB heap size at GC start, at GC end, and live heap # MB goal goal heap size # MB stacks estimated scannable stack size # MB globals scannable global size # P number of processors used The phases are stop-the-world (STW) sweep termination, concurrent mark and scan, and STW mark termination. The CPU times for mark/scan are broken down in to assist time (GC performed in line with allocation), background GC time, and idle GC time. If the line ends with "(forced)", this GC was forced by a runtime.GC() call. harddecommit: setting harddecommit=1 causes memory that is returned to the OS to also have protections removed on it. This is the only mode of operation on Windows, but is helpful in debugging scavenger-related issues on other platforms. Currently, only supported on Linux. inittrace: setting inittrace=1 causes the runtime to emit a single line to standard error for each package with init work, summarizing the execution time and memory allocation. No information is printed for inits executed as part of plugin loading and for packages without both user defined and compiler generated init work. The format of this line is subject to change. Currently, it is: init # @#ms, # ms clock, # bytes, # allocs where the fields are as follows: init # the package name @# ms time in milliseconds when the init started since program start # clock wall-clock time for package initialization work # bytes memory allocated on the heap # allocs number of heap allocations madvdontneed: setting madvdontneed=0 will use MADV_FREE instead of MADV_DONTNEED on Linux when returning memory to the kernel. This is more efficient, but means RSS numbers will drop only when the OS is under memory pressure. On the BSDs and Illumos/Solaris, setting madvdontneed=1 will use MADV_DONTNEED instead of MADV_FREE. This is less efficient, but causes RSS numbers to drop more quickly. memprofilerate: setting memprofilerate=X will update the value of runtime.MemProfileRate. When set to 0 memory profiling is disabled. Refer to the description of MemProfileRate for the default value. pagetrace: setting pagetrace=/path/to/file will write out a trace of page events that can be viewed, analyzed, and visualized using the x/debug/cmd/pagetrace tool. Build your program with GOEXPERIMENT=pagetrace to enable this functionality. Do not enable this functionality if your program is a setuid binary as it introduces a security risk in that scenario. 
Currently not supported on Windows, plan9 or js/wasm. Setting this option for some applications can produce large traces, so use with care. invalidptr: invalidptr=1 (the default) causes the garbage collector and stack copier to crash the program if an invalid pointer value (for example, 1) is found in a pointer-typed location. Setting invalidptr=0 disables this check. This should only be used as a temporary workaround to diagnose buggy code. The real fix is to not store integers in pointer-typed locations. sbrk: setting sbrk=1 replaces the memory allocator and garbage collector with a trivial allocator that obtains memory from the operating system and never reclaims any memory. scavtrace: setting scavtrace=1 causes the runtime to emit a single line to standard error, roughly once per GC cycle, summarizing the amount of work done by the scavenger as well as the total amount of memory returned to the operating system and an estimate of physical memory utilization. The format of this line is subject to change, but currently it is: scav # KiB work, # KiB total, #% util where the fields are as follows: # KiB work the amount of memory returned to the OS since the last line # KiB total the total amount of memory returned to the OS #% util the fraction of all unscavenged memory which is in-use If the line ends with "(forced)", then scavenging was forced by a debug.FreeOSMemory() call. scheddetail: setting schedtrace=X and scheddetail=1 causes the scheduler to emit detailed multiline info every X milliseconds, describing state of the scheduler, processors, threads and goroutines. schedtrace: setting schedtrace=X causes the scheduler to emit a single line to standard error every X milliseconds, summarizing the scheduler state. tracebackancestors: setting tracebackancestors=N extends tracebacks with the stacks at which goroutines were created, where N limits the number of ancestor goroutines to report. This also extends the information returned by runtime.Stack. Ancestor's goroutine IDs will refer to the ID of the goroutine at the time of creation; it's possible for this ID to be reused for another goroutine. Setting N to 0 will report no ancestry information. asyncpreemptoff: asyncpreemptoff=1 disables signal-based asynchronous goroutine preemption. This makes some loops non-preemptible for long periods, which may delay GC and goroutine scheduling. This is useful for debugging GC issues because it also disables the conservative stack scanning used for asynchronously preempted goroutines. ``` The net and net/http packages also refer to debugging variables in GODEBUG. See the documentation for those packages for details. The GOMAXPROCS variable limits the number of operating system threads that can execute user-level Go code simultaneously. There is no limit to the number of threads that can be blocked in system calls on behalf of Go code; those do not count against the GOMAXPROCS limit. This package's GOMAXPROCS function queries and changes the limit. The GORACE variable configures the race detector, for programs built using -race. See <https://golang.org/doc/articles/race_detector.html> for details. The GOTRACEBACK variable controls the amount of output generated when a Go program fails due to an unrecovered panic or an unexpected runtime condition. By default, a failure prints a stack trace for the current goroutine, eliding functions internal to the run-time system, and then exits with exit code 2. 
The failure prints stack traces for all goroutines if there is no current goroutine or the failure is internal to the run-time. GOTRACEBACK=none omits the goroutine stack traces entirely. GOTRACEBACK=single (the default) behaves as described above. GOTRACEBACK=all adds stack traces for all user-created goroutines. GOTRACEBACK=system is like “all” but adds stack frames for run-time functions and shows goroutines created internally by the run-time. GOTRACEBACK=crash is like “system” but crashes in an operating system-specific manner instead of exiting. For example, on Unix systems, the crash raises SIGABRT to trigger a core dump. For historical reasons, the GOTRACEBACK settings 0, 1, and 2 are synonyms for none, all, and system, respectively. The runtime/debug package's SetTraceback function allows increasing the amount of output at run time, but it cannot reduce the amount below that specified by the environment variable. See <https://golang.org/pkg/runtime/debug/#SetTraceback>. The GOARCH, GOOS, GOPATH, and GOROOT environment variables complete the set of Go environment variables. They influence the building of Go programs (see <https://golang.org/cmd/go> and <https://golang.org/pkg/go/build>). GOARCH, GOOS, and GOROOT are recorded at compile time and made available by constants or functions in this package, but they do not influence the execution of the run-time system. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func BlockProfile(p []BlockProfileRecord) (n int, ok bool)](#BlockProfile) * [func Breakpoint()](#Breakpoint) * [func CPUProfile() []byte](#CPUProfile) * [func Caller(skip int) (pc uintptr, file string, line int, ok bool)](#Caller) * [func Callers(skip int, pc []uintptr) int](#Callers) * [func GC()](#GC) * [func GOMAXPROCS(n int) int](#GOMAXPROCS) * [func GOROOT() string](#GOROOT) * [func Goexit()](#Goexit) * [func GoroutineProfile(p []StackRecord) (n int, ok bool)](#GoroutineProfile) * [func Gosched()](#Gosched) * [func KeepAlive(x any)](#KeepAlive) * [func LockOSThread()](#LockOSThread) * [func MemProfile(p []MemProfileRecord, inuseZero bool) (n int, ok bool)](#MemProfile) * [func MutexProfile(p []BlockProfileRecord) (n int, ok bool)](#MutexProfile) * [func NumCPU() int](#NumCPU) * [func NumCgoCall() int64](#NumCgoCall) * [func NumGoroutine() int](#NumGoroutine) * [func ReadMemStats(m \*MemStats)](#ReadMemStats) * [func ReadTrace() []byte](#ReadTrace) * [func SetBlockProfileRate(rate int)](#SetBlockProfileRate) * [func SetCPUProfileRate(hz int)](#SetCPUProfileRate) * [func SetCgoTraceback(version int, traceback, context, symbolizer unsafe.Pointer)](#SetCgoTraceback) * [func SetFinalizer(obj any, finalizer any)](#SetFinalizer) * [func SetMutexProfileFraction(rate int) int](#SetMutexProfileFraction) * [func Stack(buf []byte, all bool) int](#Stack) * [func StartTrace() error](#StartTrace) * [func StopTrace()](#StopTrace) * [func ThreadCreateProfile(p []StackRecord) (n int, ok bool)](#ThreadCreateProfile) * [func UnlockOSThread()](#UnlockOSThread) * [func Version() string](#Version) * [type BlockProfileRecord](#BlockProfileRecord) * [type Error](#Error) * [type Frame](#Frame) * [type Frames](#Frames) * [func CallersFrames(callers []uintptr) \*Frames](#CallersFrames) * [func (ci \*Frames) Next() (frame Frame, more bool)](#Frames.Next) * [type Func](#Func) * [func FuncForPC(pc uintptr) \*Func](#FuncForPC) * [func (f \*Func) Entry() uintptr](#Func.Entry) * [func (f \*Func) FileLine(pc uintptr) (file string, line int)](#Func.FileLine) * [func (f \*Func) 
Name() string](#Func.Name) * [type MemProfileRecord](#MemProfileRecord) * [func (r \*MemProfileRecord) InUseBytes() int64](#MemProfileRecord.InUseBytes) * [func (r \*MemProfileRecord) InUseObjects() int64](#MemProfileRecord.InUseObjects) * [func (r \*MemProfileRecord) Stack() []uintptr](#MemProfileRecord.Stack) * [type MemStats](#MemStats) * [type StackRecord](#StackRecord) * [func (r \*StackRecord) Stack() []uintptr](#StackRecord.Stack) * [type TypeAssertionError](#TypeAssertionError) * [func (e \*TypeAssertionError) Error() string](#TypeAssertionError.Error) * [func (\*TypeAssertionError) RuntimeError()](#TypeAssertionError.RuntimeError) ### Examples [Frames](#example_Frames) ### Package files alg.go arena.go asan0.go atomic\_pointer.go cgo.go cgo\_mmap.go cgo\_sigaction.go cgocall.go cgocallback.go cgocheck.go chan.go checkptr.go compiler.go complex.go covercounter.go covermeta.go cpuflags.go cpuflags\_amd64.go cpuprof.go cputicks.go create\_file\_unix.go debug.go debugcall.go debuglog.go debuglog\_off.go defs\_linux\_amd64.go env\_posix.go error.go exithook.go extern.go fastlog2.go fastlog2table.go float.go hash64.go heapdump.go histogram.go iface.go lfstack.go lfstack\_64bit.go lock\_futex.go lockrank.go lockrank\_off.go malloc.go map.go map\_fast32.go map\_fast64.go map\_faststr.go mbarrier.go mbitmap.go mcache.go mcentral.go mcheckmark.go mem.go mem\_linux.go metrics.go mfinal.go mfixalloc.go mgc.go mgclimit.go mgcmark.go mgcpacer.go mgcscavenge.go mgcstack.go mgcsweep.go mgcwork.go mheap.go mpagealloc.go mpagealloc\_64bit.go mpagecache.go mpallocbits.go mprof.go mranges.go msan0.go msize.go mspanset.go mstats.go mwbbuf.go nbpipe\_pipe2.go netpoll.go netpoll\_epoll.go os\_linux.go os\_linux\_generic.go os\_linux\_noauxv.go os\_linux\_x86.go os\_nonopenbsd.go pagetrace\_off.go panic.go plugin.go preempt.go preempt\_nonwindows.go print.go proc.go profbuf.go proflabel.go race0.go rdebug.go relax\_stub.go retry.go runtime.go runtime1.go runtime2.go runtime\_boring.go rwmutex.go select.go sema.go signal\_amd64.go signal\_linux\_amd64.go signal\_unix.go sigqueue.go sigqueue\_note.go sigtab\_linux\_generic.go sizeclasses.go slice.go softfloat64.go stack.go stkframe.go string.go stubs.go stubs2.go stubs3.go stubs\_amd64.go stubs\_linux.go symtab.go sys\_nonppc64x.go sys\_x86.go time.go time\_nofake.go timeasm.go tls\_stub.go trace.go traceback.go type.go typekind.go unsafe.go utf8.go vdso\_elf64.go vdso\_linux.go vdso\_linux\_amd64.go write\_err.go Constants --------- Compiler is the name of the compiler toolchain that built the running binary. Known toolchains are: ``` gc Also known as cmd/compile. gccgo The gccgo front end, part of the GCC compiler suite. ``` ``` const Compiler = "gc" ``` GOARCH is the running program's architecture target: one of 386, amd64, arm, s390x, and so on. ``` const GOARCH string = goarch.GOARCH ``` GOOS is the running program's operating system target: one of darwin, freebsd, linux, and so on. To view possible combinations of GOOS and GOARCH, run "go tool dist list". ``` const GOOS string = goos.GOOS ``` Variables --------- MemProfileRate controls the fraction of memory allocations that are recorded and reported in the memory profile. The profiler aims to sample an average of one allocation per MemProfileRate bytes allocated. To include every allocated block in the profile, set MemProfileRate to 1. To turn off profiling entirely, set MemProfileRate to 0. 
The tools that process the memory profiles assume that the profile rate is constant across the lifetime of the program and equal to the current value. Programs that change the memory profiling rate should do so just once, as early as possible in the execution of the program (for example, at the beginning of main). ``` var MemProfileRate int = 512 * 1024 ``` func BlockProfile 1.1 --------------------- ``` func BlockProfile(p []BlockProfileRecord) (n int, ok bool) ``` BlockProfile returns n, the number of records in the current blocking profile. If len(p) >= n, BlockProfile copies the profile into p and returns n, true. If len(p) < n, BlockProfile does not change p and returns n, false. Most clients should use the runtime/pprof package or the testing package's -test.blockprofile flag instead of calling BlockProfile directly. func Breakpoint --------------- ``` func Breakpoint() ``` Breakpoint executes a breakpoint trap. func CPUProfile --------------- ``` func CPUProfile() []byte ``` CPUProfile panics. It formerly provided raw access to chunks of a pprof-format profile generated by the runtime. The details of generating that format have changed, so this functionality has been removed. Deprecated: Use the runtime/pprof package, or the handlers in the net/http/pprof package, or the testing package's -test.cpuprofile flag instead. func Caller ----------- ``` func Caller(skip int) (pc uintptr, file string, line int, ok bool) ``` Caller reports file and line number information about function invocations on the calling goroutine's stack. The argument skip is the number of stack frames to ascend, with 0 identifying the caller of Caller. (For historical reasons the meaning of skip differs between Caller and Callers.) The return values report the program counter, file name, and line number within the file of the corresponding call. The boolean ok is false if it was not possible to recover the information. func Callers ------------ ``` func Callers(skip int, pc []uintptr) int ``` Callers fills the slice pc with the return program counters of function invocations on the calling goroutine's stack. The argument skip is the number of stack frames to skip before recording in pc, with 0 identifying the frame for Callers itself and 1 identifying the caller of Callers. It returns the number of entries written to pc. To translate these PCs into symbolic information such as function names and line numbers, use CallersFrames. CallersFrames accounts for inlined functions and adjusts the return program counters into call program counters. Iterating over the returned slice of PCs directly is discouraged, as is using FuncForPC on any of the returned PCs, since these cannot account for inlining or return program counter adjustment. func GC ------- ``` func GC() ``` GC runs a garbage collection and blocks the caller until the garbage collection is complete. It may also block the entire program. func GOMAXPROCS --------------- ``` func GOMAXPROCS(n int) int ``` GOMAXPROCS sets the maximum number of CPUs that can be executing simultaneously and returns the previous setting. It defaults to the value of runtime.NumCPU. If n < 1, it does not change the current setting. This call will go away when the scheduler improves. func GOROOT ----------- ``` func GOROOT() string ``` GOROOT returns the root of the Go tree. It uses the GOROOT environment variable, if set at process start, or else the root used during the Go build. func Goexit ----------- ``` func Goexit() ``` Goexit terminates the goroutine that calls it. 
No other goroutine is affected. Goexit runs all deferred calls before terminating the goroutine. Because Goexit is not a panic, any recover calls in those deferred functions will return nil. Calling Goexit from the main goroutine terminates that goroutine without func main returning. Since func main has not returned, the program continues execution of other goroutines. If all other goroutines exit, the program crashes. func GoroutineProfile --------------------- ``` func GoroutineProfile(p []StackRecord) (n int, ok bool) ``` GoroutineProfile returns n, the number of records in the active goroutine stack profile. If len(p) >= n, GoroutineProfile copies the profile into p and returns n, true. If len(p) < n, GoroutineProfile does not change p and returns n, false. Most clients should use the runtime/pprof package instead of calling GoroutineProfile directly. func Gosched ------------ ``` func Gosched() ``` Gosched yields the processor, allowing other goroutines to run. It does not suspend the current goroutine, so execution resumes automatically. func KeepAlive 1.7 ------------------ ``` func KeepAlive(x any) ``` KeepAlive marks its argument as currently reachable. This ensures that the object is not freed, and its finalizer is not run, before the point in the program where KeepAlive is called. A very simplified example showing where KeepAlive is required: ``` type File struct { d int } d, err := syscall.Open("/file/path", syscall.O_RDONLY, 0) // ... do something if err != nil ... p := &File{d} runtime.SetFinalizer(p, func(p *File) { syscall.Close(p.d) }) var buf [10]byte n, err := syscall.Read(p.d, buf[:]) // Ensure p is not finalized until Read returns. runtime.KeepAlive(p) // No more uses of p after this point. ``` Without the KeepAlive call, the finalizer could run at the start of syscall.Read, closing the file descriptor before syscall.Read makes the actual system call. Note: KeepAlive should only be used to prevent finalizers from running prematurely. In particular, when used with unsafe.Pointer, the rules for valid uses of unsafe.Pointer still apply. func LockOSThread ----------------- ``` func LockOSThread() ``` LockOSThread wires the calling goroutine to its current operating system thread. The calling goroutine will always execute in that thread, and no other goroutine will execute in it, until the calling goroutine has made as many calls to UnlockOSThread as to LockOSThread. If the calling goroutine exits without unlocking the thread, the thread will be terminated. All init functions are run on the startup thread. Calling LockOSThread from an init function will cause the main function to be invoked on that thread. A goroutine should call LockOSThread before calling OS services or non-Go library functions that depend on per-thread state. func MemProfile --------------- ``` func MemProfile(p []MemProfileRecord, inuseZero bool) (n int, ok bool) ``` MemProfile returns a profile of memory allocated and freed per allocation site. MemProfile returns n, the number of records in the current memory profile. If len(p) >= n, MemProfile copies the profile into p and returns n, true. If len(p) < n, MemProfile does not change p and returns n, false. If inuseZero is true, the profile includes allocation records where r.AllocBytes > 0 but r.AllocBytes == r.FreeBytes. These are sites where memory was allocated, but it has all been released back to the runtime. The returned profile may be up to two garbage collection cycles old. 
This is to avoid skewing the profile toward allocations; because allocations happen in real time but frees are delayed until the garbage collector performs sweeping, the profile only accounts for allocations that have had a chance to be freed by the garbage collector. Most clients should use the runtime/pprof package or the testing package's -test.memprofile flag instead of calling MemProfile directly. func MutexProfile 1.8 --------------------- ``` func MutexProfile(p []BlockProfileRecord) (n int, ok bool) ``` MutexProfile returns n, the number of records in the current mutex profile. If len(p) >= n, MutexProfile copies the profile into p and returns n, true. Otherwise, MutexProfile does not change p, and returns n, false. Most clients should use the runtime/pprof package instead of calling MutexProfile directly. func NumCPU ----------- ``` func NumCPU() int ``` NumCPU returns the number of logical CPUs usable by the current process. The set of available CPUs is checked by querying the operating system at process startup. Changes to operating system CPU allocation after process startup are not reflected. func NumCgoCall --------------- ``` func NumCgoCall() int64 ``` NumCgoCall returns the number of cgo calls made by the current process. func NumGoroutine ----------------- ``` func NumGoroutine() int ``` NumGoroutine returns the number of goroutines that currently exist. func ReadMemStats ----------------- ``` func ReadMemStats(m *MemStats) ``` ReadMemStats populates m with memory allocator statistics. The returned memory allocator statistics are up to date as of the call to ReadMemStats. This is in contrast with a heap profile, which is a snapshot as of the most recently completed garbage collection cycle. func ReadTrace 1.5 ------------------ ``` func ReadTrace() []byte ``` ReadTrace returns the next chunk of binary tracing data, blocking until data is available. If tracing is turned off and all the data accumulated while it was on has been returned, ReadTrace returns nil. The caller must copy the returned data before calling ReadTrace again. ReadTrace must be called from one goroutine at a time. func SetBlockProfileRate 1.1 ---------------------------- ``` func SetBlockProfileRate(rate int) ``` SetBlockProfileRate controls the fraction of goroutine blocking events that are reported in the blocking profile. The profiler aims to sample an average of one blocking event per rate nanoseconds spent blocked. To include every blocking event in the profile, pass rate = 1. To turn off profiling entirely, pass rate <= 0. func SetCPUProfileRate ---------------------- ``` func SetCPUProfileRate(hz int) ``` SetCPUProfileRate sets the CPU profiling rate to hz samples per second. If hz <= 0, SetCPUProfileRate turns off profiling. If the profiler is on, the rate cannot be changed without first turning it off. Most clients should use the runtime/pprof package or the testing package's -test.cpuprofile flag instead of calling SetCPUProfileRate directly. func SetCgoTraceback 1.7 ------------------------ ``` func SetCgoTraceback(version int, traceback, context, symbolizer unsafe.Pointer) ``` SetCgoTraceback records three C functions to use to gather traceback information from C code and to convert that traceback information into symbolic information. These are used when printing stack traces for a program that uses cgo. The traceback and context functions may be called from a signal handler, and must therefore use only async-signal safe functions. 
The symbolizer function may be called while the program is crashing, and so must be cautious about using memory. None of the functions may call back into Go. The context function will be called with a single argument, a pointer to a struct: ``` struct { Context uintptr } ``` In C syntax, this struct will be ``` struct { uintptr_t Context; }; ``` If the Context field is 0, the context function is being called to record the current traceback context. It should record in the Context field whatever information is needed about the current point of execution to later produce a stack trace, probably the stack pointer and PC. In this case the context function will be called from C code. If the Context field is not 0, then it is a value returned by a previous call to the context function. This case is called when the context is no longer needed; that is, when the Go code is returning to its C code caller. This permits the context function to release any associated resources. While it would be correct for the context function to record a complete a stack trace whenever it is called, and simply copy that out in the traceback function, in a typical program the context function will be called many times without ever recording a traceback for that context. Recording a complete stack trace in a call to the context function is likely to be inefficient. The traceback function will be called with a single argument, a pointer to a struct: ``` struct { Context uintptr SigContext uintptr Buf *uintptr Max uintptr } ``` In C syntax, this struct will be ``` struct { uintptr_t Context; uintptr_t SigContext; uintptr_t* Buf; uintptr_t Max; }; ``` The Context field will be zero to gather a traceback from the current program execution point. In this case, the traceback function will be called from C code. Otherwise Context will be a value previously returned by a call to the context function. The traceback function should gather a stack trace from that saved point in the program execution. The traceback function may be called from an execution thread other than the one that recorded the context, but only when the context is known to be valid and unchanging. The traceback function may also be called deeper in the call stack on the same thread that recorded the context. The traceback function may be called multiple times with the same Context value; it will usually be appropriate to cache the result, if possible, the first time this is called for a specific context value. If the traceback function is called from a signal handler on a Unix system, SigContext will be the signal context argument passed to the signal handler (a C ucontext\_t\* cast to uintptr\_t). This may be used to start tracing at the point where the signal occurred. If the traceback function is not called from a signal handler, SigContext will be zero. Buf is where the traceback information should be stored. It should be PC values, such that Buf[0] is the PC of the caller, Buf[1] is the PC of that function's caller, and so on. Max is the maximum number of entries to store. The function should store a zero to indicate the top of the stack, or that the caller is on a different stack, presumably a Go stack. Unlike runtime.Callers, the PC values returned should, when passed to the symbolizer function, return the file/line of the call instruction. No additional subtraction is required or appropriate. On all platforms, the traceback function is invoked when a call from Go to C to Go requests a stack trace. 
On linux/amd64, linux/ppc64le, linux/arm64, and freebsd/amd64, the traceback function is also invoked when a signal is received by a thread that is executing a cgo call. The traceback function should not make assumptions about when it is called, as future versions of Go may make additional calls. The symbolizer function will be called with a single argument, a pointer to a struct: ``` struct { PC uintptr // program counter to fetch information for File *byte // file name (NUL terminated) Lineno uintptr // line number Func *byte // function name (NUL terminated) Entry uintptr // function entry point More uintptr // set non-zero if more info for this PC Data uintptr // unused by runtime, available for function } ``` In C syntax, this struct will be ``` struct { uintptr_t PC; char* File; uintptr_t Lineno; char* Func; uintptr_t Entry; uintptr_t More; uintptr_t Data; }; ``` The PC field will be a value returned by a call to the traceback function. The first time the function is called for a particular traceback, all the fields except PC will be 0. The function should fill in the other fields if possible, setting them to 0/nil if the information is not available. The Data field may be used to store any useful information across calls. The More field should be set to non-zero if there is more information for this PC, zero otherwise. If More is set non-zero, the function will be called again with the same PC, and may return different information (this is intended for use with inlined functions). If More is zero, the function will be called with the next PC value in the traceback. When the traceback is complete, the function will be called once more with PC set to zero; this may be used to free any information. Each call will leave the fields of the struct set to the same values they had upon return, except for the PC field when the More field is zero. The function must not keep a copy of the struct pointer between calls. When calling SetCgoTraceback, the version argument is the version number of the structs that the functions expect to receive. Currently this must be zero. The symbolizer function may be nil, in which case the results of the traceback function will be displayed as numbers. If the traceback function is nil, the symbolizer function will never be called. The context function may be nil, in which case the traceback function will only be called with the context field set to zero. If the context function is nil, then calls from Go to C to Go will not show a traceback for the C portion of the call stack. SetCgoTraceback should be called only once, ideally from an init function. func SetFinalizer ----------------- ``` func SetFinalizer(obj any, finalizer any) ``` SetFinalizer sets the finalizer associated with obj to the provided finalizer function. When the garbage collector finds an unreachable block with an associated finalizer, it clears the association and runs finalizer(obj) in a separate goroutine. This makes obj reachable again, but now without an associated finalizer. Assuming that SetFinalizer is not called again, the next time the garbage collector sees that obj is unreachable, it will free obj. SetFinalizer(obj, nil) clears any finalizer associated with obj. The argument obj must be a pointer to an object allocated by calling new, by taking the address of a composite literal, or by taking the address of a local variable. The argument finalizer must be a function that takes a single argument to which obj's type can be assigned, and can have arbitrary ignored return values. 
If either of these is not true, SetFinalizer may abort the program. Finalizers are run in dependency order: if A points at B, both have finalizers, and they are otherwise unreachable, only the finalizer for A runs; once A is freed, the finalizer for B can run. If a cyclic structure includes a block with a finalizer, that cycle is not guaranteed to be garbage collected and the finalizer is not guaranteed to run, because there is no ordering that respects the dependencies. The finalizer is scheduled to run at some arbitrary time after the program can no longer reach the object to which obj points. There is no guarantee that finalizers will run before a program exits, so typically they are useful only for releasing non-memory resources associated with an object during a long-running program. For example, an os.File object could use a finalizer to close the associated operating system file descriptor when a program discards an os.File without calling Close, but it would be a mistake to depend on a finalizer to flush an in-memory I/O buffer such as a bufio.Writer, because the buffer would not be flushed at program exit. It is not guaranteed that a finalizer will run if the size of \*obj is zero bytes, because it may share same address with other zero-size objects in memory. See <https://go.dev/ref/spec#Size_and_alignment_guarantees>. It is not guaranteed that a finalizer will run for objects allocated in initializers for package-level variables. Such objects may be linker-allocated, not heap-allocated. Note that because finalizers may execute arbitrarily far into the future after an object is no longer referenced, the runtime is allowed to perform a space-saving optimization that batches objects together in a single allocation slot. The finalizer for an unreferenced object in such an allocation may never run if it always exists in the same batch as a referenced object. Typically, this batching only happens for tiny (on the order of 16 bytes or less) and pointer-free objects. A finalizer may run as soon as an object becomes unreachable. In order to use finalizers correctly, the program must ensure that the object is reachable until it is no longer required. Objects stored in global variables, or that can be found by tracing pointers from a global variable, are reachable. For other objects, pass the object to a call of the KeepAlive function to mark the last point in the function where the object must be reachable. For example, if p points to a struct, such as os.File, that contains a file descriptor d, and p has a finalizer that closes that file descriptor, and if the last use of p in a function is a call to syscall.Write(p.d, buf, size), then p may be unreachable as soon as the program enters syscall.Write. The finalizer may run at that moment, closing p.d, causing syscall.Write to fail because it is writing to a closed file descriptor (or, worse, to an entirely different file descriptor opened by a different goroutine). To avoid this problem, call KeepAlive(p) after the call to syscall.Write. A single goroutine runs all finalizers for a program, sequentially. If a finalizer must run for a long time, it should do so by starting a new goroutine. In the terminology of the Go memory model, a call SetFinalizer(x, f) “synchronizes before” the finalization call f(x). However, there is no guarantee that KeepAlive(x) or any other use of x “synchronizes before” f(x), so in general a finalizer should use a mutex or other synchronization mechanism if it needs to access mutable state in x. 
For example, consider a finalizer that inspects a mutable field in x that is modified from time to time in the main program before x becomes unreachable and the finalizer is invoked. The modifications in the main program and the inspection in the finalizer need to use appropriate synchronization, such as mutexes or atomic updates, to avoid read-write races. func SetMutexProfileFraction 1.8 -------------------------------- ``` func SetMutexProfileFraction(rate int) int ``` SetMutexProfileFraction controls the fraction of mutex contention events that are reported in the mutex profile. On average 1/rate events are reported. The previous rate is returned. To turn off profiling entirely, pass rate 0. To just read the current rate, pass rate < 0. (For n>1 the details of sampling may change.) func Stack ---------- ``` func Stack(buf []byte, all bool) int ``` Stack formats a stack trace of the calling goroutine into buf and returns the number of bytes written to buf. If all is true, Stack formats stack traces of all other goroutines into buf after the trace for the current goroutine. func StartTrace 1.5 ------------------- ``` func StartTrace() error ``` StartTrace enables tracing for the current process. While tracing, the data will be buffered and available via ReadTrace. StartTrace returns an error if tracing is already enabled. Most clients should use the runtime/trace package or the testing package's -test.trace flag instead of calling StartTrace directly. func StopTrace 1.5 ------------------ ``` func StopTrace() ``` StopTrace stops tracing, if it was previously enabled. StopTrace only returns after all the reads for the trace have completed. func ThreadCreateProfile ------------------------ ``` func ThreadCreateProfile(p []StackRecord) (n int, ok bool) ``` ThreadCreateProfile returns n, the number of records in the thread creation profile. If len(p) >= n, ThreadCreateProfile copies the profile into p and returns n, true. If len(p) < n, ThreadCreateProfile does not change p and returns n, false. Most clients should use the runtime/pprof package instead of calling ThreadCreateProfile directly. func UnlockOSThread ------------------- ``` func UnlockOSThread() ``` UnlockOSThread undoes an earlier call to LockOSThread. If this drops the number of active LockOSThread calls on the calling goroutine to zero, it unwires the calling goroutine from its fixed operating system thread. If there are no active LockOSThread calls, this is a no-op. Before calling UnlockOSThread, the caller must ensure that the OS thread is suitable for running other goroutines. If the caller made any permanent changes to the state of the thread that would affect other goroutines, it should not call this function and thus leave the goroutine locked to the OS thread until the goroutine (and hence the thread) exits. func Version ------------ ``` func Version() string ``` Version returns the Go tree's version string. It is either the commit hash and date at the time of the build or, when possible, a release tag like "go1.3". type BlockProfileRecord 1.1 --------------------------- BlockProfileRecord describes blocking events originated at a particular call sequence (stack trace). ``` type BlockProfileRecord struct { Count int64 Cycles int64 StackRecord } ``` type Error ---------- The Error interface identifies a run time error. 
``` type Error interface { error // RuntimeError is a no-op function but // serves to distinguish types that are run time // errors from ordinary errors: a type is a // run time error if it has a RuntimeError method. RuntimeError() } ``` type Frame 1.7 -------------- Frame is the information returned by Frames for each call frame. ``` type Frame struct { // PC is the program counter for the location in this frame. // For a frame that calls another frame, this will be the // program counter of a call instruction. Because of inlining, // multiple frames may have the same PC value, but different // symbolic information. PC uintptr // Func is the Func value of this call frame. This may be nil // for non-Go code or fully inlined functions. Func *Func // Function is the package path-qualified function name of // this call frame. If non-empty, this string uniquely // identifies a single function in the program. // This may be the empty string if not known. // If Func is not nil then Function == Func.Name(). Function string // File and Line are the file name and line number of the // location in this frame. For non-leaf frames, this will be // the location of a call. These may be the empty string and // zero, respectively, if not known. File string Line int // Entry point program counter for the function; may be zero // if not known. If Func is not nil then Entry == // Func.Entry(). Entry uintptr // contains filtered or unexported fields } ``` type Frames 1.7 --------------- Frames may be used to get function/file/line information for a slice of PC values returned by Callers. ``` type Frames struct { // contains filtered or unexported fields } ``` #### Example Code: ``` c := func() { // Ask runtime.Callers for up to 10 PCs, including runtime.Callers itself. pc := make([]uintptr, 10) n := runtime.Callers(0, pc) if n == 0 { // No PCs available. This can happen if the first argument to // runtime.Callers is large. // // Return now to avoid processing the zero Frame that would // otherwise be returned by frames.Next below. return } pc = pc[:n] // pass only valid pcs to runtime.CallersFrames frames := runtime.CallersFrames(pc) // Loop to get frames. // A fixed number of PCs can expand to an indefinite number of Frames. for { frame, more := frames.Next() // Process this frame. // // To keep this example's output stable // even if there are changes in the testing package, // stop unwinding when we leave package runtime. if !strings.Contains(frame.File, "runtime/") { break } fmt.Printf("- more:%v | %s\n", more, frame.Function) // Check whether there are more frames to process after this one. if !more { break } } } b := func() { c() } a := func() { b() } a() ``` Output: ``` - more:true | runtime.Callers - more:true | runtime_test.ExampleFrames.func1 - more:true | runtime_test.ExampleFrames.func2 - more:true | runtime_test.ExampleFrames.func3 - more:true | runtime_test.ExampleFrames ``` ### func CallersFrames 1.7 ``` func CallersFrames(callers []uintptr) *Frames ``` CallersFrames takes a slice of PC values returned by Callers and prepares to return function/file/line information. Do not change the slice until you are done with the Frames. ### func (\*Frames) Next 1.7 ``` func (ci *Frames) Next() (frame Frame, more bool) ``` Next returns a Frame representing the next call frame in the slice of PC values. If it has already returned all call frames, Next returns a zero Frame. The more result indicates whether the next call to Next will return a valid Frame. 
It does not necessarily indicate whether this call returned one. See the Frames example for idiomatic usage. type Func --------- A Func represents a Go function in the running binary. ``` type Func struct { // contains filtered or unexported fields } ``` ### func FuncForPC ``` func FuncForPC(pc uintptr) *Func ``` FuncForPC returns a \*Func describing the function that contains the given program counter address, or else nil. If pc represents multiple functions because of inlining, it returns the \*Func describing the innermost function, but with an entry of the outermost function. ### func (\*Func) Entry ``` func (f *Func) Entry() uintptr ``` Entry returns the entry address of the function. ### func (\*Func) FileLine ``` func (f *Func) FileLine(pc uintptr) (file string, line int) ``` FileLine returns the file name and line number of the source code corresponding to the program counter pc. The result will not be accurate if pc is not a program counter within f. ### func (\*Func) Name ``` func (f *Func) Name() string ``` Name returns the name of the function. type MemProfileRecord --------------------- A MemProfileRecord describes the live objects allocated by a particular call sequence (stack trace). ``` type MemProfileRecord struct { AllocBytes, FreeBytes int64 // number of bytes allocated, freed AllocObjects, FreeObjects int64 // number of objects allocated, freed Stack0 [32]uintptr // stack trace for this record; ends at first 0 entry } ``` ### func (\*MemProfileRecord) InUseBytes ``` func (r *MemProfileRecord) InUseBytes() int64 ``` InUseBytes returns the number of bytes in use (AllocBytes - FreeBytes). ### func (\*MemProfileRecord) InUseObjects ``` func (r *MemProfileRecord) InUseObjects() int64 ``` InUseObjects returns the number of objects in use (AllocObjects - FreeObjects). ### func (\*MemProfileRecord) Stack ``` func (r *MemProfileRecord) Stack() []uintptr ``` Stack returns the stack trace associated with the record, a prefix of r.Stack0. type MemStats ------------- A MemStats records statistics about the memory allocator. ``` type MemStats struct { // Alloc is bytes of allocated heap objects. // // This is the same as HeapAlloc (see below). Alloc uint64 // TotalAlloc is cumulative bytes allocated for heap objects. // // TotalAlloc increases as heap objects are allocated, but // unlike Alloc and HeapAlloc, it does not decrease when // objects are freed. TotalAlloc uint64 // Sys is the total bytes of memory obtained from the OS. // // Sys is the sum of the XSys fields below. Sys measures the // virtual address space reserved by the Go runtime for the // heap, stacks, and other internal data structures. It's // likely that not all of the virtual address space is backed // by physical memory at any given moment, though in general // it all was at some point. Sys uint64 // Lookups is the number of pointer lookups performed by the // runtime. // // This is primarily useful for debugging runtime internals. Lookups uint64 // Mallocs is the cumulative count of heap objects allocated. // The number of live objects is Mallocs - Frees. Mallocs uint64 // Frees is the cumulative count of heap objects freed. Frees uint64 // HeapAlloc is bytes of allocated heap objects. // // "Allocated" heap objects include all reachable objects, as // well as unreachable objects that the garbage collector has // not yet freed. Specifically, HeapAlloc increases as heap // objects are allocated and decreases as the heap is swept // and unreachable objects are freed. 
Sweeping occurs // incrementally between GC cycles, so these two processes // occur simultaneously, and as a result HeapAlloc tends to // change smoothly (in contrast with the sawtooth that is // typical of stop-the-world garbage collectors). HeapAlloc uint64 // HeapSys is bytes of heap memory obtained from the OS. // // HeapSys measures the amount of virtual address space // reserved for the heap. This includes virtual address space // that has been reserved but not yet used, which consumes no // physical memory, but tends to be small, as well as virtual // address space for which the physical memory has been // returned to the OS after it became unused (see HeapReleased // for a measure of the latter). // // HeapSys estimates the largest size the heap has had. HeapSys uint64 // HeapIdle is bytes in idle (unused) spans. // // Idle spans have no objects in them. These spans could be // (and may already have been) returned to the OS, or they can // be reused for heap allocations, or they can be reused as // stack memory. // // HeapIdle minus HeapReleased estimates the amount of memory // that could be returned to the OS, but is being retained by // the runtime so it can grow the heap without requesting more // memory from the OS. If this difference is significantly // larger than the heap size, it indicates there was a recent // transient spike in live heap size. HeapIdle uint64 // HeapInuse is bytes in in-use spans. // // In-use spans have at least one object in them. These spans // can only be used for other objects of roughly the same // size. // // HeapInuse minus HeapAlloc estimates the amount of memory // that has been dedicated to particular size classes, but is // not currently being used. This is an upper bound on // fragmentation, but in general this memory can be reused // efficiently. HeapInuse uint64 // HeapReleased is bytes of physical memory returned to the OS. // // This counts heap memory from idle spans that was returned // to the OS and has not yet been reacquired for the heap. HeapReleased uint64 // HeapObjects is the number of allocated heap objects. // // Like HeapAlloc, this increases as objects are allocated and // decreases as the heap is swept and unreachable objects are // freed. HeapObjects uint64 // StackInuse is bytes in stack spans. // // In-use stack spans have at least one stack in them. These // spans can only be used for other stacks of the same size. // // There is no StackIdle because unused stack spans are // returned to the heap (and hence counted toward HeapIdle). StackInuse uint64 // StackSys is bytes of stack memory obtained from the OS. // // StackSys is StackInuse, plus any memory obtained directly // from the OS for OS thread stacks (which should be minimal). StackSys uint64 // MSpanInuse is bytes of allocated mspan structures. MSpanInuse uint64 // MSpanSys is bytes of memory obtained from the OS for mspan // structures. MSpanSys uint64 // MCacheInuse is bytes of allocated mcache structures. MCacheInuse uint64 // MCacheSys is bytes of memory obtained from the OS for // mcache structures. MCacheSys uint64 // BuckHashSys is bytes of memory in profiling bucket hash tables. BuckHashSys uint64 // GCSys is bytes of memory in garbage collection metadata. GCSys uint64 // Go 1.2 // OtherSys is bytes of memory in miscellaneous off-heap // runtime allocations. OtherSys uint64 // Go 1.2 // NextGC is the target heap size of the next GC cycle. // // The garbage collector's goal is to keep HeapAlloc ≤ NextGC. 
// At the end of each GC cycle, the target for the next cycle // is computed based on the amount of reachable data and the // value of GOGC. NextGC uint64 // LastGC is the time the last garbage collection finished, as // nanoseconds since 1970 (the UNIX epoch). LastGC uint64 // PauseTotalNs is the cumulative nanoseconds in GC // stop-the-world pauses since the program started. // // During a stop-the-world pause, all goroutines are paused // and only the garbage collector can run. PauseTotalNs uint64 // PauseNs is a circular buffer of recent GC stop-the-world // pause times in nanoseconds. // // The most recent pause is at PauseNs[(NumGC+255)%256]. In // general, PauseNs[N%256] records the time paused in the most // recent N%256th GC cycle. There may be multiple pauses per // GC cycle; this is the sum of all pauses during a cycle. PauseNs [256]uint64 // PauseEnd is a circular buffer of recent GC pause end times, // as nanoseconds since 1970 (the UNIX epoch). // // This buffer is filled the same way as PauseNs. There may be // multiple pauses per GC cycle; this records the end of the // last pause in a cycle. PauseEnd [256]uint64 // Go 1.4 // NumGC is the number of completed GC cycles. NumGC uint32 // NumForcedGC is the number of GC cycles that were forced by // the application calling the GC function. NumForcedGC uint32 // Go 1.8 // GCCPUFraction is the fraction of this program's available // CPU time used by the GC since the program started. // // GCCPUFraction is expressed as a number between 0 and 1, // where 0 means GC has consumed none of this program's CPU. A // program's available CPU time is defined as the integral of // GOMAXPROCS since the program started. That is, if // GOMAXPROCS is 2 and a program has been running for 10 // seconds, its "available CPU" is 20 seconds. GCCPUFraction // does not include CPU time used for write barrier activity. // // This is the same as the fraction of CPU reported by // GODEBUG=gctrace=1. GCCPUFraction float64 // Go 1.5 // EnableGC indicates that GC is enabled. It is always true, // even if GOGC=off. EnableGC bool // DebugGC is currently unused. DebugGC bool // BySize reports per-size class allocation statistics. // // BySize[N] gives statistics for allocations of size S where // BySize[N-1].Size < S ≤ BySize[N].Size. // // This does not report allocations larger than BySize[60].Size. BySize [61]struct { // Size is the maximum byte size of an object in this // size class. Size uint32 // Mallocs is the cumulative count of heap objects // allocated in this size class. The cumulative bytes // of allocation is Size*Mallocs. The number of live // objects in this size class is Mallocs - Frees. Mallocs uint64 // Frees is the cumulative count of heap objects freed // in this size class. Frees uint64 } } ``` type StackRecord ---------------- A StackRecord describes a single execution stack. ``` type StackRecord struct { Stack0 [32]uintptr // stack trace for this record; ends at first 0 entry } ``` ### func (\*StackRecord) Stack ``` func (r *StackRecord) Stack() []uintptr ``` Stack returns the stack trace associated with the record, a prefix of r.Stack0. type TypeAssertionError ----------------------- A TypeAssertionError explains a failed type assertion. 
``` type TypeAssertionError struct { // contains filtered or unexported fields } ``` ### func (\*TypeAssertionError) Error ``` func (e *TypeAssertionError) Error() string ``` ### func (\*TypeAssertionError) RuntimeError ``` func (*TypeAssertionError) RuntimeError() ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [asan](asan/index) | | | [cgo](cgo/index) | Package cgo contains runtime support for code generated by the cgo tool. | | [coverage](coverage/index) | | | [debug](debug/index) | Package debug contains facilities for programs to debug themselves while they are running. | | [metrics](metrics/index) | Package metrics provides a stable interface to access implementation-defined metrics exported by the Go runtime. | | msan | | | [pprof](pprof/index) | Package pprof writes runtime profiling data in the format expected by the pprof visualization tool. | | [race](race/index) | Package race implements data race detection logic. | | [trace](trace/index) | Package trace contains facilities for programs to generate traces for the Go execution tracer. |
go Package metrics Package metrics ================ * `import "runtime/metrics"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package metrics provides a stable interface to access implementation-defined metrics exported by the Go runtime. This package is similar to existing functions like runtime.ReadMemStats and debug.ReadGCStats, but significantly more general. The set of metrics defined by this package may evolve as the runtime itself evolves, and also enables variation across Go implementations, whose relevant metric sets may not intersect. ### Interface Metrics are designated by a string key, rather than, for example, a field name in a struct. The full list of supported metrics is always available in the slice of Descriptions returned by All. Each Description also includes useful information about the metric. Thus, users of this API are encouraged to sample supported metrics defined by the slice returned by All to remain compatible across Go versions. Of course, situations arise where reading specific metrics is critical. For these cases, users are encouraged to use build tags, and although metrics may be deprecated and removed, users should consider this to be an exceptional and rare event, coinciding with a very large change in a particular Go implementation. Each metric key also has a "kind" that describes the format of the metric's value. In the interest of not breaking users of this package, the "kind" for a given metric is guaranteed not to change. If it must change, then a new metric will be introduced with a new key and a new "kind." ### Metric key format As mentioned earlier, metric keys are strings. Their format is simple and well-defined, designed to be both human and machine readable. It is split into two components, separated by a colon: a rooted path and a unit. The choice to include the unit in the key is motivated by compatibility: if a metric's unit changes, its semantics likely did also, and a new key should be introduced. For more details on the precise definition of the metric key's path and unit formats, see the documentation of the Name field of the Description struct. ### A note about floats This package supports metrics whose values have a floating-point representation. In order to improve ease-of-use, this package promises to never produce the following classes of floating-point values: NaN, infinity. ### Supported metrics Below is the full list of supported metrics, ordered lexicographically. ``` /cgo/go-to-c-calls:calls Count of calls made from Go to C by the current process. /cpu/classes/gc/mark/assist:cpu-seconds Estimated total CPU time goroutines spent performing GC tasks to assist the GC and prevent it from falling behind the application. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /cpu/classes/gc/mark/dedicated:cpu-seconds Estimated total CPU time spent performing GC tasks on processors (as defined by GOMAXPROCS) dedicated to those tasks. This includes time spent with the world stopped due to the GC. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /cpu/classes/gc/mark/idle:cpu-seconds Estimated total CPU time spent performing GC tasks on spare CPU resources that the Go scheduler could not otherwise find a use for. This should be subtracted from the total GC CPU time to obtain a measure of compulsory GC CPU time. 
This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /cpu/classes/gc/pause:cpu-seconds Estimated total CPU time spent with the application paused by the GC. Even if only one thread is running during the pause, this is computed as GOMAXPROCS times the pause latency because nothing else can be executing. This is the exact sum of samples in /gc/pause:seconds if each sample is multiplied by GOMAXPROCS at the time it is taken. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /cpu/classes/gc/total:cpu-seconds Estimated total CPU time spent performing GC tasks. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. Sum of all metrics in /cpu/classes/gc. /cpu/classes/idle:cpu-seconds Estimated total available CPU time not spent executing any Go or Go runtime code. In other words, the part of /cpu/classes/total:cpu-seconds that was unused. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /cpu/classes/scavenge/assist:cpu-seconds Estimated total CPU time spent returning unused memory to the underlying platform eagerly in response to memory pressure. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /cpu/classes/scavenge/background:cpu-seconds Estimated total CPU time spent performing background tasks to return unused memory to the underlying platform. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /cpu/classes/scavenge/total:cpu-seconds Estimated total CPU time spent performing tasks that return unused memory to the underlying platform. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. Sum of all metrics in /cpu/classes/scavenge. /cpu/classes/total:cpu-seconds Estimated total available CPU time for user Go code or the Go runtime, as defined by GOMAXPROCS. In other words, GOMAXPROCS integrated over the wall-clock duration this process has been executing for. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. Sum of all metrics in /cpu/classes. /cpu/classes/user:cpu-seconds Estimated total CPU time spent running user Go code. This may also include some small amount of time spent in the Go runtime. This metric is an overestimate, and not directly comparable to system CPU time measurements. Compare only with other /cpu/classes metrics. /gc/cycles/automatic:gc-cycles Count of completed GC cycles generated by the Go runtime. /gc/cycles/forced:gc-cycles Count of completed GC cycles forced by the application. /gc/cycles/total:gc-cycles Count of all completed GC cycles. /gc/heap/allocs-by-size:bytes Distribution of heap allocations by approximate size. Note that this does not include tiny objects as defined by /gc/heap/tiny/allocs:objects, only tiny blocks. /gc/heap/allocs:bytes Cumulative sum of memory allocated to the heap by the application. /gc/heap/allocs:objects Cumulative count of heap allocations triggered by the application. 
Note that this does not include tiny objects as defined by /gc/heap/tiny/allocs:objects, only tiny blocks. /gc/heap/frees-by-size:bytes Distribution of freed heap allocations by approximate size. Note that this does not include tiny objects as defined by /gc/heap/tiny/allocs:objects, only tiny blocks. /gc/heap/frees:bytes Cumulative sum of heap memory freed by the garbage collector. /gc/heap/frees:objects Cumulative count of heap allocations whose storage was freed by the garbage collector. Note that this does not include tiny objects as defined by /gc/heap/tiny/allocs:objects, only tiny blocks. /gc/heap/goal:bytes Heap size target for the end of the GC cycle. /gc/heap/objects:objects Number of objects, live or unswept, occupying heap memory. /gc/heap/tiny/allocs:objects Count of small allocations that are packed together into blocks. These allocations are counted separately from other allocations because each individual allocation is not tracked by the runtime, only their block. Each block is already accounted for in allocs-by-size and frees-by-size. /gc/limiter/last-enabled:gc-cycle GC cycle the last time the GC CPU limiter was enabled. This metric is useful for diagnosing the root cause of an out-of-memory error, because the limiter trades memory for CPU time when the GC's CPU time gets too high. This is most likely to occur with use of SetMemoryLimit. The first GC cycle is cycle 1, so a value of 0 indicates that it was never enabled. /gc/pauses:seconds Distribution of individual GC-related stop-the-world pause latencies. /gc/stack/starting-size:bytes The stack size of new goroutines. /memory/classes/heap/free:bytes Memory that is completely free and eligible to be returned to the underlying system, but has not been. This metric is the runtime's estimate of free address space that is backed by physical memory. /memory/classes/heap/objects:bytes Memory occupied by live objects and dead objects that have not yet been marked free by the garbage collector. /memory/classes/heap/released:bytes Memory that is completely free and has been returned to the underlying system. This metric is the runtime's estimate of free address space that is still mapped into the process, but is not backed by physical memory. /memory/classes/heap/stacks:bytes Memory allocated from the heap that is reserved for stack space, whether or not it is currently in-use. /memory/classes/heap/unused:bytes Memory that is reserved for heap objects but is not currently used to hold heap objects. /memory/classes/metadata/mcache/free:bytes Memory that is reserved for runtime mcache structures, but not in-use. /memory/classes/metadata/mcache/inuse:bytes Memory that is occupied by runtime mcache structures that are currently being used. /memory/classes/metadata/mspan/free:bytes Memory that is reserved for runtime mspan structures, but not in-use. /memory/classes/metadata/mspan/inuse:bytes Memory that is occupied by runtime mspan structures that are currently being used. /memory/classes/metadata/other:bytes Memory that is reserved for or used to hold runtime metadata. /memory/classes/os-stacks:bytes Stack memory allocated by the underlying operating system. /memory/classes/other:bytes Memory used by execution trace buffers, structures for debugging the runtime, finalizer and profiler specials, and more. /memory/classes/profiling/buckets:bytes Memory that is used by the stack trace hash map used for profiling. /memory/classes/total:bytes All memory mapped by the Go runtime into the current process as read-write. 
Note that this does not include memory mapped by code called via cgo or via the syscall package. Sum of all metrics in /memory/classes. /sched/gomaxprocs:threads The current runtime.GOMAXPROCS setting, or the number of operating system threads that can execute user-level Go code simultaneously. /sched/goroutines:goroutines Count of live goroutines. /sched/latencies:seconds Distribution of the time goroutines have spent in the scheduler in a runnable state before actually running. /sync/mutex/wait/total:seconds Approximate cumulative time goroutines have spent blocked on a sync.Mutex or sync.RWMutex. This metric is useful for identifying global changes in lock contention. Collect a mutex or block profile using the runtime/pprof package for more detailed contention data. ``` Index ----- * [func Read(m []Sample)](#Read) * [type Description](#Description) * [func All() []Description](#All) * [type Float64Histogram](#Float64Histogram) * [type Sample](#Sample) * [type Value](#Value) * [func (v Value) Float64() float64](#Value.Float64) * [func (v Value) Float64Histogram() \*Float64Histogram](#Value.Float64Histogram) * [func (v Value) Kind() ValueKind](#Value.Kind) * [func (v Value) Uint64() uint64](#Value.Uint64) * [type ValueKind](#ValueKind) ### Examples [Read (ReadingAllMetrics)](#example_Read_readingAllMetrics) [Read (ReadingOneMetric)](#example_Read_readingOneMetric) ### Package files description.go doc.go histogram.go sample.go value.go func Read 1.16 -------------- ``` func Read(m []Sample) ``` Read populates each Value field in the given slice of metric samples. Desired metrics should be present in the slice with the appropriate name. The user of this API is encouraged to re-use the same slice between calls for efficiency, but is not required to do so. Note that re-use has some caveats. Notably, Values should not be read or manipulated while a Read with that value is outstanding; that is a data race. This property includes pointer-typed Values (for example, Float64Histogram) whose underlying storage will be reused by Read when possible. To safely use such values in a concurrent setting, all data must be deep-copied. It is safe to execute multiple Read calls concurrently, but their arguments must share no underlying memory. When in doubt, create a new []Sample from scratch, which is always safe, though may be inefficient. Sample values with names not appearing in All will have their Value populated as KindBad to indicate that the name is unknown. #### Example (ReadingAllMetrics) Code: ``` // Get descriptions for all supported metrics. descs := metrics.All() // Create a sample for each metric. samples := make([]metrics.Sample, len(descs)) for i := range samples { samples[i].Name = descs[i].Name } // Sample the metrics. Re-use the samples slice if you can! metrics.Read(samples) // Iterate over all results. for _, sample := range samples { // Pull out the name and value. name, value := sample.Name, sample.Value // Handle each sample. switch value.Kind() { case metrics.KindUint64: fmt.Printf("%s: %d\n", name, value.Uint64()) case metrics.KindFloat64: fmt.Printf("%s: %f\n", name, value.Float64()) case metrics.KindFloat64Histogram: // The histogram may be quite large, so let's just pull out // a crude estimate for the median for the sake of this example. fmt.Printf("%s: %f\n", name, medianBucket(value.Float64Histogram())) case metrics.KindBad: // This should never happen because all metrics are supported // by construction. 
panic("bug in runtime/metrics package!") default: // This may happen as new metrics get added. // // The safest thing to do here is to simply log it somewhere // as something to look into, but ignore it for now. // In the worst case, you might temporarily miss out on a new metric. fmt.Printf("%s: unexpected metric Kind: %v\n", name, value.Kind()) } } ``` #### Example (ReadingOneMetric) Code: ``` // Name of the metric we want to read. const myMetric = "/memory/classes/heap/free:bytes" // Create a sample for the metric. sample := make([]metrics.Sample, 1) sample[0].Name = myMetric // Sample the metric. metrics.Read(sample) // Check if the metric is actually supported. // If it's not, the resulting value will always have // kind KindBad. if sample[0].Value.Kind() == metrics.KindBad { panic(fmt.Sprintf("metric %q no longer supported", myMetric)) } // Handle the result. // // It's OK to assume a particular Kind for a metric; // they're guaranteed not to change. freeBytes := sample[0].Value.Uint64() fmt.Printf("free but not released memory: %d\n", freeBytes) ``` type Description 1.16 --------------------- Description describes a runtime metric. ``` type Description struct { // Name is the full name of the metric which includes the unit. // // The format of the metric may be described by the following regular expression. // // ^(?P<name>/[^:]+):(?P<unit>[^:*/]+(?:[*/][^:*/]+)*)$ // // The format splits the name into two components, separated by a colon: a path which always // starts with a /, and a machine-parseable unit. The name may contain any valid Unicode // codepoint in between / characters, but by convention will try to stick to lowercase // characters and hyphens. An example of such a path might be "/memory/heap/free". // // The unit is by convention a series of lowercase English unit names (singular or plural) // without prefixes delimited by '*' or '/'. The unit names may contain any valid Unicode // codepoint that is not a delimiter. // Examples of units might be "seconds", "bytes", "bytes/second", "cpu-seconds", // "byte*cpu-seconds", and "bytes/second/second". // // For histograms, multiple units may apply. For instance, the units of the buckets and // the count. By convention, for histograms, the units of the count are always "samples" // with the type of sample evident by the metric's name, while the unit in the name // specifies the buckets' unit. // // A complete name might look like "/memory/heap/free:bytes". Name string // Description is an English language sentence describing the metric. Description string // Kind is the kind of value for this metric. // // The purpose of this field is to allow users to filter out metrics whose values are // types which their application may not understand. Kind ValueKind // Cumulative is whether or not the metric is cumulative. If a cumulative metric is just // a single number, then it increases monotonically. If the metric is a distribution, // then each bucket count increases monotonically. // // This flag thus indicates whether or not it's useful to compute a rate from this value. Cumulative bool } ``` ### func All 1.16 ``` func All() []Description ``` All returns a slice containing metric descriptions for all supported metrics. type Float64Histogram 1.16 -------------------------- Float64Histogram represents a distribution of float64 values. ``` type Float64Histogram struct { // Counts contains the weights for each histogram bucket. // // Given N buckets, Count[n] is the weight of the range // [bucket[n], bucket[n+1]), for 0 <= n < N. 
Counts []uint64 // Buckets contains the boundaries of the histogram buckets, in increasing order. // // Buckets[0] is the inclusive lower bound of the minimum bucket while // Buckets[len(Buckets)-1] is the exclusive upper bound of the maximum bucket. // Hence, there are len(Buckets)-1 counts. Furthermore, len(Buckets) != 1, always, // since at least two boundaries are required to describe one bucket (and 0 // boundaries are used to describe 0 buckets). // // Buckets[0] is permitted to have value -Inf and Buckets[len(Buckets)-1] is // permitted to have value Inf. // // For a given metric name, the value of Buckets is guaranteed not to change // between calls until program exit. // // This slice value is permitted to alias with other Float64Histograms' Buckets // fields, so the values within should only ever be read. If they need to be // modified, the user must make a copy. Buckets []float64 } ``` type Sample 1.16 ---------------- Sample captures a single metric sample. ``` type Sample struct { // Name is the name of the metric sampled. // // It must correspond to a name in one of the metric descriptions // returned by All. Name string // Value is the value of the metric sample. Value Value } ``` type Value 1.16 --------------- Value represents a metric value returned by the runtime. ``` type Value struct { // contains filtered or unexported fields } ``` ### func (Value) Float64 1.16 ``` func (v Value) Float64() float64 ``` Float64 returns the internal float64 value for the metric. If v.Kind() != KindFloat64, this method panics. ### func (Value) Float64Histogram 1.16 ``` func (v Value) Float64Histogram() *Float64Histogram ``` Float64Histogram returns the internal \*Float64Histogram value for the metric. If v.Kind() != KindFloat64Histogram, this method panics. ### func (Value) Kind 1.16 ``` func (v Value) Kind() ValueKind ``` Kind returns the tag representing the kind of value this is. ### func (Value) Uint64 1.16 ``` func (v Value) Uint64() uint64 ``` Uint64 returns the internal uint64 value for the metric. If v.Kind() != KindUint64, this method panics. type ValueKind 1.16 ------------------- ValueKind is a tag for a metric Value which indicates its type. ``` type ValueKind int ``` ``` const ( // KindBad indicates that the Value has no type and should not be used. KindBad ValueKind = iota // KindUint64 indicates that the type of the Value is a uint64. KindUint64 // KindFloat64 indicates that the type of the Value is a float64. KindFloat64 // KindFloat64Histogram indicates that the type of the Value is a *Float64Histogram. KindFloat64Histogram ) ```
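The ReadingAllMetrics example above calls a medianBucket helper that is not reproduced in this excerpt. Below is a minimal sketch of such a helper, written only against the Counts and Buckets fields documented above; the helper name, the crude estimation strategy, and the choice of metric in main are illustrative assumptions rather than part of the package API.

```
package main

import (
	"fmt"
	"runtime/metrics"
)

// medianBucket returns a crude median estimate for a Float64Histogram:
// the lower bound of the first bucket at which the cumulative weight
// reaches half of the total weight. Note that Buckets[0] may be -Inf
// for some metrics, so this is only a rough indicator.
func medianBucket(h *metrics.Float64Histogram) float64 {
	total := uint64(0)
	for _, count := range h.Counts {
		total += count
	}
	thresh := total / 2
	total = 0
	for i, count := range h.Counts {
		total += count
		if total >= thresh {
			return h.Buckets[i]
		}
	}
	panic("should not happen")
}

func main() {
	// Sample a single histogram-valued metric from the list above.
	s := []metrics.Sample{{Name: "/sched/latencies:seconds"}}
	metrics.Read(s)
	if s[0].Value.Kind() == metrics.KindFloat64Histogram {
		fmt.Printf("approx. median scheduler latency: %fs\n",
			medianBucket(s[0].Value.Float64Histogram()))
	}
}
```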
go Package trace Package trace ============== * `import "runtime/trace"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package trace contains facilities for programs to generate traces for the Go execution tracer. ### Tracing runtime activities The execution trace captures a wide range of execution events such as goroutine creation/blocking/unblocking, syscall enter/exit/block, GC-related events, changes of heap size, processor start/stop, etc. When CPU profiling is active, the execution tracer makes an effort to include those samples as well. A precise nanosecond-precision timestamp and a stack trace is captured for most events. The generated trace can be interpreted using `go tool trace`. Support for tracing tests and benchmarks built with the standard testing package is built into `go test`. For example, the following command runs the test in the current directory and writes the trace file (trace.out). ``` go test -trace=trace.out ``` This runtime/trace package provides APIs to add equivalent tracing support to a standalone program. See the Example that demonstrates how to use this API to enable tracing. There is also a standard HTTP interface to trace data. Adding the following line will install a handler under the /debug/pprof/trace URL to download a live trace: ``` import _ "net/http/pprof" ``` See the net/http/pprof package for more details about all of the debug endpoints installed by this import. ### User annotation Package trace provides user annotation APIs that can be used to log interesting events during execution. There are three types of user annotations: log messages, regions, and tasks. Log emits a timestamped message to the execution trace along with additional information such as the category of the message and which goroutine called Log. The execution tracer provides UIs to filter and group goroutines using the log category and the message supplied in Log. A region is for logging a time interval during a goroutine's execution. By definition, a region starts and ends in the same goroutine. Regions can be nested to represent subintervals. For example, the following code records four regions in the execution trace to trace the durations of sequential steps in a cappuccino making operation. ``` trace.WithRegion(ctx, "makeCappuccino", func() { // orderID allows to identify a specific order // among many cappuccino order region records. trace.Log(ctx, "orderID", orderID) trace.WithRegion(ctx, "steamMilk", steamMilk) trace.WithRegion(ctx, "extractCoffee", extractCoffee) trace.WithRegion(ctx, "mixMilkCoffee", mixMilkCoffee) }) ``` A task is a higher-level component that aids tracing of logical operations such as an RPC request, an HTTP request, or an interesting local operation which may require multiple goroutines working together. Since tasks can involve multiple goroutines, they are tracked via a context.Context object. NewTask creates a new task and embeds it in the returned context.Context object. Log messages and regions are attached to the task, if any, in the Context passed to Log and WithRegion. For example, assume that we decided to froth milk, extract coffee, and mix milk and coffee in separate goroutines. With a task, the trace tool can identify the goroutines involved in a specific cappuccino order. 
``` ctx, task := trace.NewTask(ctx, "makeCappuccino") trace.Log(ctx, "orderID", orderID) milk := make(chan bool) espresso := make(chan bool) go func() { trace.WithRegion(ctx, "steamMilk", steamMilk) milk <- true }() go func() { trace.WithRegion(ctx, "extractCoffee", extractCoffee) espresso <- true }() go func() { defer task.End() // When assemble is done, the order is complete. <-espresso <-milk trace.WithRegion(ctx, "mixMilkCoffee", mixMilkCoffee) }() ``` The trace tool computes the latency of a task by measuring the time between the task creation and the task end and provides latency distributions for each task type found in the trace. #### Example Example demonstrates the use of the trace package to trace the execution of a Go program. The trace output will be written to the file trace.out Code: ``` package trace_test import ( "fmt" "log" "os" "runtime/trace" ) // Example demonstrates the use of the trace package to trace // the execution of a Go program. The trace output will be // written to the file trace.out func Example() { f, err := os.Create("trace.out") if err != nil { log.Fatalf("failed to create trace output file: %v", err) } defer func() { if err := f.Close(); err != nil { log.Fatalf("failed to close trace file: %v", err) } }() if err := trace.Start(f); err != nil { log.Fatalf("failed to start trace: %v", err) } defer trace.Stop() // your program here RunMyProgram() } func RunMyProgram() { fmt.Printf("this function will be traced") } ``` Index ----- * [func IsEnabled() bool](#IsEnabled) * [func Log(ctx context.Context, category, message string)](#Log) * [func Logf(ctx context.Context, category, format string, args ...any)](#Logf) * [func Start(w io.Writer) error](#Start) * [func Stop()](#Stop) * [func WithRegion(ctx context.Context, regionType string, fn func())](#WithRegion) * [type Region](#Region) * [func StartRegion(ctx context.Context, regionType string) \*Region](#StartRegion) * [func (r \*Region) End()](#Region.End) * [type Task](#Task) * [func NewTask(pctx context.Context, taskType string) (ctx context.Context, task \*Task)](#NewTask) * [func (t \*Task) End()](#Task.End) ### Examples [Package](#example_) ### Package files annotation.go trace.go func IsEnabled 1.11 ------------------- ``` func IsEnabled() bool ``` IsEnabled reports whether tracing is enabled. The information is advisory only. The tracing status may have changed by the time this function returns. func Log 1.11 ------------- ``` func Log(ctx context.Context, category, message string) ``` Log emits a one-off event with the given category and message. Category can be empty and the API assumes there are only a handful of unique categories in the system. func Logf 1.11 -------------- ``` func Logf(ctx context.Context, category, format string, args ...any) ``` Logf is like Log, but the value is formatted using the specified format spec. func Start 1.5 -------------- ``` func Start(w io.Writer) error ``` Start enables tracing for the current program. While tracing, the trace will be buffered and written to w. Start returns an error if tracing is already enabled. func Stop 1.5 ------------- ``` func Stop() ``` Stop stops the current tracing, if any. Stop only returns after all the writes for the trace have completed. func WithRegion 1.11 -------------------- ``` func WithRegion(ctx context.Context, regionType string, fn func()) ``` WithRegion starts a region associated with its calling goroutine, runs fn, and then ends the region. If the context carries a task, the region is associated with the task. 
Otherwise, the region is attached to the background task. The regionType is used to classify regions, so there should be only a handful of unique region types. type Region 1.11 ---------------- Region is a region of code whose execution time interval is traced. ``` type Region struct { // contains filtered or unexported fields } ``` ### func StartRegion 1.11 ``` func StartRegion(ctx context.Context, regionType string) *Region ``` StartRegion starts a region and returns a function for marking the end of the region. The returned Region's End function must be called from the same goroutine where the region was started. Within each goroutine, regions must nest. That is, regions started after this region must be ended before this region can be ended. Recommended usage is ``` defer trace.StartRegion(ctx, "myTracedRegion").End() ``` ### func (\*Region) End 1.11 ``` func (r *Region) End() ``` End marks the end of the traced code region. type Task 1.11 -------------- Task is a data type for tracing a user-defined, logical operation. ``` type Task struct { // contains filtered or unexported fields } ``` ### func NewTask 1.11 ``` func NewTask(pctx context.Context, taskType string) (ctx context.Context, task *Task) ``` NewTask creates a task instance with the type taskType and returns it along with a Context that carries the task. If the input context contains a task, the new task is its subtask. The taskType is used to classify task instances. Analysis tools like the Go execution tracer may assume there are only a bounded number of unique task types in the system. The returned end function is used to mark the task's end. The trace tool measures task latency as the time between task creation and when the end function is called, and provides the latency distribution per task type. If the end function is called multiple times, only the first call is used in the latency measurement. ``` ctx, task := trace.NewTask(ctx, "awesomeTask") trace.WithRegion(ctx, "preparation", prepWork) // preparation of the task go func() { // continue processing the task in a separate goroutine. defer task.End() trace.WithRegion(ctx, "remainingWork", remainingWork) }() ``` ### func (\*Task) End 1.11 ``` func (t *Task) End() ``` End marks the end of the operation represented by the Task. go Package pprof Package pprof ============== * `import "runtime/pprof"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package pprof writes runtime profiling data in the format expected by the pprof visualization tool. ### Profiling a Go program The first step to profiling a Go program is to enable profiling. Support for profiling benchmarks built with the standard testing package is built into go test. For example, the following command runs benchmarks in the current directory and writes the CPU and memory profiles to cpu.prof and mem.prof: ``` go test -cpuprofile cpu.prof -memprofile mem.prof -bench . ``` To add equivalent profiling support to a standalone program, add code like the following to your main function: ``` var cpuprofile = flag.String("cpuprofile", "", "write cpu profile to `file`") var memprofile = flag.String("memprofile", "", "write memory profile to `file`") func main() { flag.Parse() if *cpuprofile != "" { f, err := os.Create(*cpuprofile) if err != nil { log.Fatal("could not create CPU profile: ", err) } defer f.Close() // error handling omitted for example if err := pprof.StartCPUProfile(f); err != nil { log.Fatal("could not start CPU profile: ", err) } defer pprof.StopCPUProfile() } // ... 
rest of the program ... if *memprofile != "" { f, err := os.Create(*memprofile) if err != nil { log.Fatal("could not create memory profile: ", err) } defer f.Close() // error handling omitted for example runtime.GC() // get up-to-date statistics if err := pprof.WriteHeapProfile(f); err != nil { log.Fatal("could not write memory profile: ", err) } } } ``` There is also a standard HTTP interface to profiling data. Adding the following line will install handlers under the /debug/pprof/ URL to download live profiles: ``` import _ "net/http/pprof" ``` See the net/http/pprof package for more details. Profiles can then be visualized with the pprof tool: ``` go tool pprof cpu.prof ``` There are many commands available from the pprof command line. Commonly used commands include "top", which prints a summary of the top program hot-spots, and "web", which opens an interactive graph of hot-spots and their call graphs. Use "help" for information on all pprof commands. For more information about pprof, see <https://github.com/google/pprof/blob/master/doc/README.md>. Index ----- * [func Do(ctx context.Context, labels LabelSet, f func(context.Context))](#Do) * [func ForLabels(ctx context.Context, f func(key, value string) bool)](#ForLabels) * [func Label(ctx context.Context, key string) (string, bool)](#Label) * [func SetGoroutineLabels(ctx context.Context)](#SetGoroutineLabels) * [func StartCPUProfile(w io.Writer) error](#StartCPUProfile) * [func StopCPUProfile()](#StopCPUProfile) * [func WithLabels(ctx context.Context, labels LabelSet) context.Context](#WithLabels) * [func WriteHeapProfile(w io.Writer) error](#WriteHeapProfile) * [type LabelSet](#LabelSet) * [func Labels(args ...string) LabelSet](#Labels) * [type Profile](#Profile) * [func Lookup(name string) \*Profile](#Lookup) * [func NewProfile(name string) \*Profile](#NewProfile) * [func Profiles() []\*Profile](#Profiles) * [func (p \*Profile) Add(value any, skip int)](#Profile.Add) * [func (p \*Profile) Count() int](#Profile.Count) * [func (p \*Profile) Name() string](#Profile.Name) * [func (p \*Profile) Remove(value any)](#Profile.Remove) * [func (p \*Profile) WriteTo(w io.Writer, debug int) error](#Profile.WriteTo) * [Bugs](#pkg-note-BUG) ### Package files elf.go label.go map.go pe.go pprof.go pprof\_rusage.go proto.go proto\_other.go protobuf.go protomem.go runtime.go func Do 1.9 ----------- ``` func Do(ctx context.Context, labels LabelSet, f func(context.Context)) ``` Do calls f with a copy of the parent context with the given labels added to the parent's label map. Goroutines spawned while executing f will inherit the augmented label-set. Each key/value pair in labels is inserted into the label map in the order provided, overriding any previous value for the same key. The augmented label map will be set for the duration of the call to f and restored once f returns. func ForLabels 1.9 ------------------ ``` func ForLabels(ctx context.Context, f func(key, value string) bool) ``` ForLabels invokes f with each label set on the context. The function f should return true to continue iteration or false to stop iteration early. func Label 1.9 -------------- ``` func Label(ctx context.Context, key string) (string, bool) ``` Label returns the value of the label with the given key on ctx, and a boolean indicating whether that label exists. func SetGoroutineLabels 1.9 --------------------------- ``` func SetGoroutineLabels(ctx context.Context) ``` SetGoroutineLabels sets the current goroutine's labels to match ctx. 
A new goroutine inherits the labels of the goroutine that created it. This is a lower-level API than Do, which should be used instead when possible. func StartCPUProfile -------------------- ``` func StartCPUProfile(w io.Writer) error ``` StartCPUProfile enables CPU profiling for the current process. While profiling, the profile will be buffered and written to w. StartCPUProfile returns an error if profiling is already enabled. On Unix-like systems, StartCPUProfile does not work by default for Go code built with -buildmode=c-archive or -buildmode=c-shared. StartCPUProfile relies on the SIGPROF signal, but that signal will be delivered to the main program's SIGPROF signal handler (if any) not to the one used by Go. To make it work, call os/signal.Notify for syscall.SIGPROF, but note that doing so may break any profiling being done by the main program. func StopCPUProfile ------------------- ``` func StopCPUProfile() ``` StopCPUProfile stops the current CPU profile, if any. StopCPUProfile only returns after all the writes for the profile have completed. func WithLabels 1.9 ------------------- ``` func WithLabels(ctx context.Context, labels LabelSet) context.Context ``` WithLabels returns a new context.Context with the given labels added. A label overwrites a prior label with the same key. func WriteHeapProfile --------------------- ``` func WriteHeapProfile(w io.Writer) error ``` WriteHeapProfile is shorthand for Lookup("heap").WriteTo(w, 0). It is preserved for backwards compatibility. type LabelSet 1.9 ----------------- LabelSet is a set of labels. ``` type LabelSet struct { // contains filtered or unexported fields } ``` ### func Labels 1.9 ``` func Labels(args ...string) LabelSet ``` Labels takes an even number of strings representing key-value pairs and makes a LabelSet containing them. A label overwrites a prior label with the same key. Currently only the CPU and goroutine profiles utilize any labels information. See <https://golang.org/issue/23458> for details. type Profile ------------ A Profile is a collection of stack traces showing the call sequences that led to instances of a particular event, such as allocation. Packages can create and maintain their own profiles; the most common use is for tracking resources that must be explicitly closed, such as files or network connections. A Profile's methods can be called from multiple goroutines simultaneously. Each Profile has a unique name. A few profiles are predefined: ``` goroutine - stack traces of all current goroutines heap - a sampling of memory allocations of live objects allocs - a sampling of all past memory allocations threadcreate - stack traces that led to the creation of new OS threads block - stack traces that led to blocking on synchronization primitives mutex - stack traces of holders of contended mutexes ``` These predefined profiles maintain themselves and panic on an explicit Add or Remove method call. The heap profile reports statistics as of the most recently completed garbage collection; it elides more recent allocation to avoid skewing the profile away from live data and toward garbage. If there has been no garbage collection at all, the heap profile reports all known allocations. This exception helps mainly in programs running without garbage collection enabled, usually for debugging purposes. The heap profile tracks both the allocation sites for all live objects in the application memory and for all objects allocated since the program start. 
Pprof's -inuse\_space, -inuse\_objects, -alloc\_space, and -alloc\_objects flags select which to display, defaulting to -inuse\_space (live objects, scaled by size). The allocs profile is the same as the heap profile but changes the default pprof display to -alloc\_space, the total number of bytes allocated since the program began (including garbage-collected bytes). The CPU profile is not available as a Profile. It has a special API, the StartCPUProfile and StopCPUProfile functions, because it streams output to a writer during profiling. ``` type Profile struct { // contains filtered or unexported fields } ``` ### func Lookup ``` func Lookup(name string) *Profile ``` Lookup returns the profile with the given name, or nil if no such profile exists. ### func NewProfile ``` func NewProfile(name string) *Profile ``` NewProfile creates a new profile with the given name. If a profile with that name already exists, NewProfile panics. The convention is to use a 'import/path.' prefix to create separate name spaces for each package. For compatibility with various tools that read pprof data, profile names should not contain spaces. ### func Profiles ``` func Profiles() []*Profile ``` Profiles returns a slice of all the known profiles, sorted by name. ### func (\*Profile) Add ``` func (p *Profile) Add(value any, skip int) ``` Add adds the current execution stack to the profile, associated with value. Add stores value in an internal map, so value must be suitable for use as a map key and will not be garbage collected until the corresponding call to Remove. Add panics if the profile already contains a stack for value. The skip parameter has the same meaning as runtime.Caller's skip and controls where the stack trace begins. Passing skip=0 begins the trace in the function calling Add. For example, given this execution stack: ``` Add called from rpc.NewClient called from mypkg.Run called from main.main ``` Passing skip=0 begins the stack trace at the call to Add inside rpc.NewClient. Passing skip=1 begins the stack trace at the call to NewClient inside mypkg.Run. ### func (\*Profile) Count ``` func (p *Profile) Count() int ``` Count returns the number of execution stacks currently in the profile. ### func (\*Profile) Name ``` func (p *Profile) Name() string ``` Name returns this profile's name, which can be passed to Lookup to reobtain the profile. ### func (\*Profile) Remove ``` func (p *Profile) Remove(value any) ``` Remove removes the execution stack associated with value from the profile. It is a no-op if the value is not in the profile. ### func (\*Profile) WriteTo ``` func (p *Profile) WriteTo(w io.Writer, debug int) error ``` WriteTo writes a pprof-formatted snapshot of the profile to w. If a write to w returns an error, WriteTo returns that error. Otherwise, WriteTo returns nil. The debug parameter enables additional output. Passing debug=0 writes the gzip-compressed protocol buffer described in <https://github.com/google/pprof/tree/master/proto#overview>. Passing debug=1 writes the legacy text format with comments translating addresses to function names and line numbers, so that a programmer can read the profile without tools. The predefined profiles may assign meaning to other debug values; for example, when printing the "goroutine" profile, debug=2 means to print the goroutine stacks in the same form that a Go program uses when dying due to an unrecovered panic. Bugs ---- * ☞ Profiles are only as good as the kernel support used to generate them. 
See <https://golang.org/issue/13841> for details about known problems.
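As a concrete illustration of the custom-profile use case described above (tracking resources that must be explicitly closed), here is a minimal sketch tying NewProfile, Add, Remove, and WriteTo together; the profile name, the conn type, and the open/close helpers are hypothetical and exist only for this example.

```
package main

import (
	"os"
	"runtime/pprof"
)

// connProfile records the stacks that opened connections which have not
// yet been closed. The name follows the 'import/path.' convention noted
// above; "example.com/mypkg" is a placeholder.
var connProfile = pprof.NewProfile("example.com/mypkg.connections")

type conn struct{ fd int }

func open() *conn {
	c := &conn{}
	// skip=1 starts the recorded stack at the caller of open.
	connProfile.Add(c, 1)
	return c
}

func (c *conn) close() {
	connProfile.Remove(c)
}

func main() {
	c := open()
	defer c.close()

	// debug=1 writes the legacy text format, readable without tools.
	_ = connProfile.WriteTo(os.Stderr, 1)
}
```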
go Package cgo Package cgo ============ * `import "runtime/cgo"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package cgo contains runtime support for code generated by the cgo tool. See the documentation for the cgo command for details on using cgo. Index ----- * [type Handle](#Handle) * [func NewHandle(v any) Handle](#NewHandle) * [func (h Handle) Delete()](#Handle.Delete) * [func (h Handle) Value() any](#Handle.Value) * [type Incomplete](#Incomplete) ### Package files callbacks.go callbacks\_traceback.go cgo.go handle.go iscgo.go linux.go mmap.go setenv.go sigaction.go type Handle ----------- Handle provides a way to pass values that contain Go pointers (pointers to memory allocated by Go) between Go and C without breaking the cgo pointer passing rules. A Handle is an integer value that can represent any Go value. A Handle can be passed through C and back to Go, and Go code can use the Handle to retrieve the original Go value. The underlying type of Handle is guaranteed to fit in an integer type that is large enough to hold the bit pattern of any pointer. The zero value of a Handle is not valid, and thus is safe to use as a sentinel in C APIs. For instance, on the Go side: ``` package main /* #include <stdint.h> // for uintptr_t extern void MyGoPrint(uintptr_t handle); void myprint(uintptr_t handle); */ import "C" import "runtime/cgo" //export MyGoPrint func MyGoPrint(handle C.uintptr_t) { h := cgo.Handle(handle) val := h.Value().(string) println(val) h.Delete() } func main() { val := "hello Go" C.myprint(C.uintptr_t(cgo.NewHandle(val))) // Output: hello Go } ``` and on the C side: ``` #include <stdint.h> // for uintptr_t // A Go function extern void MyGoPrint(uintptr_t handle); // A C function void myprint(uintptr_t handle) { MyGoPrint(handle); } ``` Some C functions accept a void\* argument that points to an arbitrary data value supplied by the caller. It is not safe to coerce a cgo.Handle (an integer) to a Go unsafe.Pointer, but instead we can pass the address of the cgo.Handle to the void\* parameter, as in this variant of the previous example: ``` package main /* extern void MyGoPrint(void *context); static inline void myprint(void *context) { MyGoPrint(context); } */ import "C" import ( "runtime/cgo" "unsafe" ) //export MyGoPrint func MyGoPrint(context unsafe.Pointer) { h := *(*cgo.Handle)(context) val := h.Value().(string) println(val) h.Delete() } func main() { val := "hello Go" h := cgo.NewHandle(val) C.myprint(unsafe.Pointer(&h)) // Output: hello Go } ``` ``` type Handle uintptr ``` ### func NewHandle ``` func NewHandle(v any) Handle ``` NewHandle returns a handle for a given value. The handle is valid until the program calls Delete on it. The handle uses resources, and this package assumes that C code may hold on to the handle, so a program must explicitly call Delete when the handle is no longer needed. The intended use is to pass the returned handle to C code, which passes it back to Go, which calls Value. ### func (Handle) Delete ``` func (h Handle) Delete() ``` Delete invalidates a handle. This method should only be called once the program no longer needs to pass the handle to C and the C code no longer has a copy of the handle value. The method panics if the handle is invalid. ### func (Handle) Value ``` func (h Handle) Value() any ``` Value returns the associated Go value for a valid handle. The method panics if the handle is invalid. type Incomplete --------------- Incomplete is used specifically for the semantics of incomplete C types. 
``` type Incomplete struct { // contains filtered or unexported fields } ``` go Package race Package race ============= * `import "runtime/race"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Subdirectories](#pkg-subdirectories) Overview -------- Package race implements data race detection logic. No public interface is provided. For details about the race detector see <https://golang.org/doc/articles/race_detector.html> Index ----- ### Package files doc.go race\_v1\_amd64.go Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | go Package coverage Package coverage ================= * `import "runtime/coverage"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Index ----- * [func ClearCounters() error](#ClearCounters) * [func WriteCounters(w io.Writer) error](#WriteCounters) * [func WriteCountersDir(dir string) error](#WriteCountersDir) * [func WriteMeta(w io.Writer) error](#WriteMeta) * [func WriteMetaDir(dir string) error](#WriteMetaDir) ### Package files apis.go emit.go hooks.go testsupport.go func ClearCounters 1.20 ----------------------- ``` func ClearCounters() error ``` ClearCounters clears/resets all coverage counter variables in the currently running program. It returns an error if the program in question was not built with the "-cover" flag. Clearing of coverage counters is also not supported for programs not using atomic counter mode (see more detailed comments below for the rationale here). func WriteCounters 1.20 ----------------------- ``` func WriteCounters(w io.Writer) error ``` WriteCounters writes coverage counter-data content for the currently running program to the writer 'w'. An error will be returned if the operation can't be completed successfully (for example, if the currently running program was not built with "-cover", or if a write fails). The counter data written will be a snapshot taken at the point of the invocation. func WriteCountersDir 1.20 -------------------------- ``` func WriteCountersDir(dir string) error ``` WriteCountersDir writes a coverage counter-data file for the currently running program to the directory specified in 'dir'. An error will be returned if the operation can't be completed successfully (for example, if the currently running program was not built with "-cover", or if the directory does not exist). The counter data written will be a snapshot taken at the point of the call. func WriteMeta 1.20 ------------------- ``` func WriteMeta(w io.Writer) error ``` WriteMeta writes the meta-data content (the payload that would normally be emitted to a meta-data file) for the currently running program to the writer 'w'. An error will be returned if the operation can't be completed successfully (for example, if the currently running program was not built with "-cover", or if a write fails). func WriteMetaDir 1.20 ---------------------- ``` func WriteMetaDir(dir string) error ``` WriteMetaDir writes a coverage meta-data file for the currently running program to the directory specified in 'dir'. An error will be returned if the operation can't be completed successfully (for example, if the currently running program was not built with "-cover", or if the directory does not exist). go Package debug Package debug ============== * `import "runtime/debug"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package debug contains facilities for programs to debug themselves while they are running. 
Index ----- * [func FreeOSMemory()](#FreeOSMemory) * [func PrintStack()](#PrintStack) * [func ReadGCStats(stats \*GCStats)](#ReadGCStats) * [func SetGCPercent(percent int) int](#SetGCPercent) * [func SetMaxStack(bytes int) int](#SetMaxStack) * [func SetMaxThreads(threads int) int](#SetMaxThreads) * [func SetMemoryLimit(limit int64) int64](#SetMemoryLimit) * [func SetPanicOnFault(enabled bool) bool](#SetPanicOnFault) * [func SetTraceback(level string)](#SetTraceback) * [func Stack() []byte](#Stack) * [func WriteHeapDump(fd uintptr)](#WriteHeapDump) * [type BuildInfo](#BuildInfo) * [func ParseBuildInfo(data string) (bi \*BuildInfo, err error)](#ParseBuildInfo) * [func ReadBuildInfo() (info \*BuildInfo, ok bool)](#ReadBuildInfo) * [func (bi \*BuildInfo) String() string](#BuildInfo.String) * [type BuildSetting](#BuildSetting) * [type GCStats](#GCStats) * [type Module](#Module) ### Package files garbage.go mod.go stack.go stubs.go func FreeOSMemory 1.1 --------------------- ``` func FreeOSMemory() ``` FreeOSMemory forces a garbage collection followed by an attempt to return as much memory to the operating system as possible. (Even if this is not called, the runtime gradually returns memory to the operating system in a background task.) func PrintStack --------------- ``` func PrintStack() ``` PrintStack prints to standard error the stack trace returned by runtime.Stack. func ReadGCStats 1.1 -------------------- ``` func ReadGCStats(stats *GCStats) ``` ReadGCStats reads statistics about garbage collection into stats. The number of entries in the pause history is system-dependent; stats.Pause slice will be reused if large enough, reallocated otherwise. ReadGCStats may use the full capacity of the stats.Pause slice. If stats.PauseQuantiles is non-empty, ReadGCStats fills it with quantiles summarizing the distribution of pause time. For example, if len(stats.PauseQuantiles) is 5, it will be filled with the minimum, 25%, 50%, 75%, and maximum pause times. func SetGCPercent 1.1 --------------------- ``` func SetGCPercent(percent int) int ``` SetGCPercent sets the garbage collection target percentage: a collection is triggered when the ratio of freshly allocated data to live data remaining after the previous collection reaches this percentage. SetGCPercent returns the previous setting. The initial setting is the value of the GOGC environment variable at startup, or 100 if the variable is not set. This setting may be effectively reduced in order to maintain a memory limit. A negative percentage effectively disables garbage collection, unless the memory limit is reached. See SetMemoryLimit for more details. func SetMaxStack 1.2 -------------------- ``` func SetMaxStack(bytes int) int ``` SetMaxStack sets the maximum amount of memory that can be used by a single goroutine stack. If any goroutine exceeds this limit while growing its stack, the program crashes. SetMaxStack returns the previous setting. The initial setting is 1 GB on 64-bit systems, 250 MB on 32-bit systems. There may be a system-imposed maximum stack limit regardless of the value provided to SetMaxStack. SetMaxStack is useful mainly for limiting the damage done by goroutines that enter an infinite recursion. It only limits future stack growth. func SetMaxThreads 1.2 ---------------------- ``` func SetMaxThreads(threads int) int ``` SetMaxThreads sets the maximum number of operating system threads that the Go program can use. If it attempts to use more than this many, the program crashes. SetMaxThreads returns the previous setting. 
The initial setting is 10,000 threads. The limit controls the number of operating system threads, not the number of goroutines. A Go program creates a new thread only when a goroutine is ready to run but all the existing threads are blocked in system calls, cgo calls, or are locked to other goroutines due to use of runtime.LockOSThread. SetMaxThreads is useful mainly for limiting the damage done by programs that create an unbounded number of threads. The idea is to take down the program before it takes down the operating system. func SetMemoryLimit 1.19 ------------------------ ``` func SetMemoryLimit(limit int64) int64 ``` SetMemoryLimit provides the runtime with a soft memory limit. The runtime undertakes several processes to try to respect this memory limit, including adjustments to the frequency of garbage collections and returning memory to the underlying system more aggressively. This limit will be respected even if GOGC=off (or, if SetGCPercent(-1) is executed). The input limit is provided as bytes, and includes all memory mapped, managed, and not released by the Go runtime. Notably, it does not account for space used by the Go binary and memory external to Go, such as memory managed by the underlying system on behalf of the process, or memory managed by non-Go code inside the same process. Examples of excluded memory sources include: OS kernel memory held on behalf of the process, memory allocated by C code, and memory mapped by syscall.Mmap (because it is not managed by the Go runtime). More specifically, the following expression accurately reflects the value the runtime attempts to maintain as the limit: ``` runtime.MemStats.Sys - runtime.MemStats.HeapReleased ``` or in terms of the runtime/metrics package: ``` /memory/classes/total:bytes - /memory/classes/heap/released:bytes ``` A zero limit or a limit that's lower than the amount of memory used by the Go runtime may cause the garbage collector to run nearly continuously. However, the application may still make progress. The memory limit is always respected by the Go runtime, so to effectively disable this behavior, set the limit very high. math.MaxInt64 is the canonical value for disabling the limit, but values much greater than the available memory on the underlying system work just as well. See <https://go.dev/doc/gc-guide> for a detailed guide explaining the soft memory limit in more detail, as well as a variety of common use-cases and scenarios. The initial setting is math.MaxInt64 unless the GOMEMLIMIT environment variable is set, in which case it provides the initial setting. GOMEMLIMIT is a numeric value in bytes with an optional unit suffix. The supported suffixes include B, KiB, MiB, GiB, and TiB. These suffixes represent quantities of bytes as defined by the IEC 80000-13 standard. That is, they are based on powers of two: KiB means 2^10 bytes, MiB means 2^20 bytes, and so on. SetMemoryLimit returns the previously set memory limit. A negative input does not adjust the limit, and allows for retrieval of the currently set memory limit. func SetPanicOnFault 1.3 ------------------------ ``` func SetPanicOnFault(enabled bool) bool ``` SetPanicOnFault controls the runtime's behavior when a program faults at an unexpected (non-nil) address. Such faults are typically caused by bugs such as runtime memory corruption, so the default response is to crash the program. 
Programs working with memory-mapped files or unsafe manipulation of memory may cause faults at non-nil addresses in less dramatic situations; SetPanicOnFault allows such programs to request that the runtime trigger only a panic, not a crash. The runtime.Error that the runtime panics with may have an additional method: ``` Addr() uintptr ``` If that method exists, it returns the memory address which triggered the fault. The results of Addr are best-effort and the veracity of the result may depend on the platform. SetPanicOnFault applies only to the current goroutine. It returns the previous setting. func SetTraceback 1.6 --------------------- ``` func SetTraceback(level string) ``` SetTraceback sets the amount of detail printed by the runtime in the traceback it prints before exiting due to an unrecovered panic or an internal runtime error. The level argument takes the same values as the GOTRACEBACK environment variable. For example, SetTraceback("all") ensures that the program prints all goroutines when it crashes. See the package runtime documentation for details. If SetTraceback is called with a level lower than that of the environment variable, the call is ignored. func Stack ---------- ``` func Stack() []byte ``` Stack returns a formatted stack trace of the goroutine that calls it. It calls runtime.Stack with a large enough buffer to capture the entire trace. func WriteHeapDump 1.3 ---------------------- ``` func WriteHeapDump(fd uintptr) ``` WriteHeapDump writes a description of the heap and the objects in it to the given file descriptor. WriteHeapDump suspends the execution of all goroutines until the heap dump is completely written. Thus, the file descriptor must not be connected to a pipe or socket whose other end is in the same Go process; instead, use a temporary file or network socket. The heap dump format is defined at <https://golang.org/s/go15heapdump>. type BuildInfo 1.12 ------------------- BuildInfo represents the build information read from a Go binary. ``` type BuildInfo struct { // GoVersion is the version of the Go toolchain that built the binary // (for example, "go1.19.2"). GoVersion string // Go 1.18 // Path is the package path of the main package for the binary // (for example, "golang.org/x/tools/cmd/stringer"). Path string // Main describes the module that contains the main package for the binary. Main Module // Deps describes all the dependency modules, both direct and indirect, // that contributed packages to the build of this binary. Deps []*Module // Settings describes the build settings used to build the binary. Settings []BuildSetting // Go 1.18 } ``` ### func ParseBuildInfo 1.18 ``` func ParseBuildInfo(data string) (bi *BuildInfo, err error) ``` ### func ReadBuildInfo 1.12 ``` func ReadBuildInfo() (info *BuildInfo, ok bool) ``` ReadBuildInfo returns the build information embedded in the running binary. The information is available only in binaries built with module support. ### func (\*BuildInfo) String 1.18 ``` func (bi *BuildInfo) String() string ``` type BuildSetting 1.18 ---------------------- A BuildSetting is a key-value pair describing one setting that influenced a build. 
Defined keys include: * -buildmode: the buildmode flag used (typically "exe") * -compiler: the compiler toolchain flag used (typically "gc") * CGO\_ENABLED: the effective CGO\_ENABLED environment variable * CGO\_CFLAGS: the effective CGO\_CFLAGS environment variable * CGO\_CPPFLAGS: the effective CGO\_CPPFLAGS environment variable * CGO\_CXXFLAGS: the effective CGO\_CXXFLAGS environment variable * CGO\_LDFLAGS: the effective CGO\_LDFLAGS environment variable * GOARCH: the architecture target * GOAMD64/GOARM64/GO386/etc: the architecture feature level for GOARCH * GOOS: the operating system target * vcs: the version control system for the source tree where the build ran * vcs.revision: the revision identifier for the current commit or checkout * vcs.time: the modification time associated with vcs.revision, in RFC3339 format * vcs.modified: true or false indicating whether the source tree had local modifications ``` type BuildSetting struct { // Key and Value describe the build setting. // Key must not contain an equals sign, space, tab, or newline. // Value must not contain newlines ('\n'). Key, Value string } ``` type GCStats 1.1 ---------------- GCStats collects information about recent garbage collections. ``` type GCStats struct { LastGC time.Time // time of last collection NumGC int64 // number of garbage collections PauseTotal time.Duration // total pause for all collections Pause []time.Duration // pause history, most recent first PauseEnd []time.Time // pause end times history, most recent first; added in Go 1.4 PauseQuantiles []time.Duration } ``` type Module 1.12 ---------------- A Module describes a single module included in a build. ``` type Module struct { Path string // module path Version string // module version Sum string // checksum Replace *Module // replaced by this module } ``` go Package plugin Package plugin =============== * `import "plugin"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package plugin implements loading and symbol resolution of Go plugins. A plugin is a Go main package with exported functions and variables that has been built with: ``` go build -buildmode=plugin ``` When a plugin is first opened, the init functions of all packages not already part of the program are called. The main function is not run. A plugin is only initialized once, and cannot be closed. ### Warnings The ability to dynamically load parts of an application during execution, perhaps based on user-defined configuration, may be a useful building block in some designs. In particular, because applications and dynamically loaded functions can share data structures directly, plugins may enable very high-performance integration of separate parts. However, the plugin mechanism has many significant drawbacks that should be considered carefully during the design. For example: * Plugins are currently supported only on Linux, FreeBSD, and macOS, making them unsuitable for applications intended to be portable. * Applications that use plugins may require careful configuration to ensure that the various parts of the program be made available in the correct location in the file system (or container image). By contrast, deploying an application consisting of a single static executable is straightforward. * Reasoning about program initialization is more difficult when some packages may not be initialized until long after the application has started running. * Bugs in applications that load plugins could be exploited by an attacker to load dangerous or untrusted libraries. 
* Runtime crashes are likely to occur unless all parts of the program (the application and all its plugins) are compiled using exactly the same version of the toolchain, the same build tags, and the same values of certain flags and environment variables. * Similar crashing problems are likely to arise unless all common dependencies of the application and its plugins are built from exactly the same source code. * Together, these restrictions mean that, in practice, the application and its plugins must all be built together by a single person or component of a system. In that case, it may be simpler for that person or component to generate Go source files that blank-import the desired set of plugins and then compile a static executable in the usual way. For these reasons, many users decide that traditional interprocess communication (IPC) mechanisms such as sockets, pipes, remote procedure call (RPC), shared memory mappings, or file system operations may be more suitable despite the performance overheads. Index ----- * [type Plugin](#Plugin) * [func Open(path string) (\*Plugin, error)](#Open) * [func (p \*Plugin) Lookup(symName string) (Symbol, error)](#Plugin.Lookup) * [type Symbol](#Symbol) ### Package files plugin.go plugin\_dlopen.go type Plugin 1.8 --------------- Plugin is a loaded Go plugin. ``` type Plugin struct { // contains filtered or unexported fields } ``` ### func Open 1.8 ``` func Open(path string) (*Plugin, error) ``` Open opens a Go plugin. If a path has already been opened, then the existing \*Plugin is returned. It is safe for concurrent use by multiple goroutines. ### func (\*Plugin) Lookup 1.8 ``` func (p *Plugin) Lookup(symName string) (Symbol, error) ``` Lookup searches for a symbol named symName in plugin p. A symbol is any exported variable or function. It reports an error if the symbol is not found. It is safe for concurrent use by multiple goroutines. type Symbol 1.8 --------------- A Symbol is a pointer to a variable or function. For example, a plugin defined as ``` package main import "fmt" var V int func F() { fmt.Printf("Hello, number %d\n", V) } ``` may be loaded with the Open function and then the exported package symbols V and F can be accessed ``` p, err := plugin.Open("plugin_name.so") if err != nil { panic(err) } v, err := p.Lookup("V") if err != nil { panic(err) } f, err := p.Lookup("F") if err != nil { panic(err) } *v.(*int) = 7 f.(func())() // prints "Hello, number 7" ``` ``` type Symbol any ```
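The Symbol example above uses unchecked type assertions, which panic if a looked-up symbol does not have the expected type. Below is a minimal sketch of the same lookup with error handling; the plugin path and the symbol name F are taken from the example above, and the expected signature func() is an assumption for illustration.

```
package main

import (
	"log"
	"plugin"
)

func main() {
	p, err := plugin.Open("plugin_name.so")
	if err != nil {
		log.Fatal(err)
	}

	sym, err := p.Lookup("F")
	if err != nil {
		log.Fatal(err)
	}

	// A Symbol is an interface value; a comma-ok assertion turns an
	// unexpected type into an error instead of a panic.
	f, ok := sym.(func())
	if !ok {
		log.Fatalf("unexpected type for F: %T", sym)
	}
	f()
}
```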
go Package goos Package goos ============= * `import "internal/goos"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- package goos contains GOOS-specific constants. Index ----- * [Constants](#pkg-constants) ### Package files goos.go unix.go zgoos\_linux.go Constants --------- ``` const GOOS = `linux` ``` ``` const IsAix = 0 ``` ``` const IsAndroid = 0 ``` ``` const IsDarwin = 0 ``` ``` const IsDragonfly = 0 ``` ``` const IsFreebsd = 0 ``` ``` const IsHurd = 0 ``` ``` const IsIllumos = 0 ``` ``` const IsIos = 0 ``` ``` const IsJs = 0 ``` ``` const IsLinux = 1 ``` ``` const IsNacl = 0 ``` ``` const IsNetbsd = 0 ``` ``` const IsOpenbsd = 0 ``` ``` const IsPlan9 = 0 ``` ``` const IsSolaris = 0 ``` ``` const IsUnix = true ``` ``` const IsWindows = 0 ``` ``` const IsZos = 0 ``` go Package goarch Package goarch =============== * `import "internal/goarch"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- package goarch contains GOARCH-specific constants. Index ----- * [Constants](#pkg-constants) * [type ArchFamilyType](#ArchFamilyType) ### Package files goarch.go goarch\_amd64.go zgoarch\_amd64.go Constants --------- BigEndian reports whether the architecture is big-endian. ``` const BigEndian = IsArmbe|IsArm64be|IsMips|IsMips64|IsPpc|IsPpc64|IsS390|IsS390x|IsSparc|IsSparc64 == 1 ``` DefaultPhysPageSize is the default physical page size. ``` const DefaultPhysPageSize = _DefaultPhysPageSize ``` ``` const GOARCH = `amd64` ``` Int64Align is the required alignment for a 64-bit integer (4 on 32-bit systems, 8 on 64-bit). ``` const Int64Align = PtrSize ``` ``` const Is386 = 0 ``` ``` const IsAmd64 = 1 ``` ``` const IsAmd64p32 = 0 ``` ``` const IsArm = 0 ``` ``` const IsArm64 = 0 ``` ``` const IsArm64be = 0 ``` ``` const IsArmbe = 0 ``` ``` const IsLoong64 = 0 ``` ``` const IsMips = 0 ``` ``` const IsMips64 = 0 ``` ``` const IsMips64le = 0 ``` ``` const IsMips64p32 = 0 ``` ``` const IsMips64p32le = 0 ``` ``` const IsMipsle = 0 ``` ``` const IsPpc = 0 ``` ``` const IsPpc64 = 0 ``` ``` const IsPpc64le = 0 ``` ``` const IsRiscv = 0 ``` ``` const IsRiscv64 = 0 ``` ``` const IsS390 = 0 ``` ``` const IsS390x = 0 ``` ``` const IsSparc = 0 ``` ``` const IsSparc64 = 0 ``` ``` const IsWasm = 0 ``` MinFrameSize is the size of the system-reserved words at the bottom of a frame (just above the architectural stack pointer). It is zero on x86 and PtrSize on most non-x86 (LR-based) systems. On PowerPC it is larger, to cover three more reserved words: the compiler word, the link editor word, and the TOC save word. ``` const MinFrameSize = _MinFrameSize ``` PCQuantum is the minimal unit for a program counter (1 on x86, 4 on most other systems). The various PC tables record PC deltas pre-divided by PCQuantum. ``` const PCQuantum = _PCQuantum ``` PtrSize is the size of a pointer in bytes - unsafe.Sizeof(uintptr(0)) but as an ideal constant. It is also the size of the machine's native word size (that is, 4 on 32-bit systems, 8 on 64-bit). ``` const PtrSize = 4 << (^uintptr(0) >> 63) ``` StackAlign is the required alignment of the SP register. The stack must be at least word aligned, but some architectures require more. ``` const StackAlign = _StackAlign ``` type ArchFamilyType ------------------- ``` type ArchFamilyType int ``` ``` const ( AMD64 ArchFamilyType = iota ARM ARM64 I386 LOONG64 MIPS MIPS64 PPC64 RISCV64 S390X WASM ) ``` ArchFamily is the architecture family (AMD64, ARM, ...) 
``` const ArchFamily ArchFamilyType = _ArchFamily ``` go Package encoding Package encoding ================= * `import "encoding"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Subdirectories](#pkg-subdirectories) Overview -------- Package encoding defines interfaces shared by other packages that convert data to and from byte-level and textual representations. Packages that check for these interfaces include encoding/gob, encoding/json, and encoding/xml. As a result, implementing an interface once can make a type useful in multiple encodings. Standard types that implement these interfaces include time.Time and net.IP. The interfaces come in pairs that produce and consume encoded data. Index ----- * [type BinaryMarshaler](#BinaryMarshaler) * [type BinaryUnmarshaler](#BinaryUnmarshaler) * [type TextMarshaler](#TextMarshaler) * [type TextUnmarshaler](#TextUnmarshaler) ### Package files encoding.go type BinaryMarshaler 1.2 ------------------------ BinaryMarshaler is the interface implemented by an object that can marshal itself into a binary form. MarshalBinary encodes the receiver into a binary form and returns the result. ``` type BinaryMarshaler interface { MarshalBinary() (data []byte, err error) } ``` type BinaryUnmarshaler 1.2 -------------------------- BinaryUnmarshaler is the interface implemented by an object that can unmarshal a binary representation of itself. UnmarshalBinary must be able to decode the form generated by MarshalBinary. UnmarshalBinary must copy the data if it wishes to retain the data after returning. ``` type BinaryUnmarshaler interface { UnmarshalBinary(data []byte) error } ``` type TextMarshaler 1.2 ---------------------- TextMarshaler is the interface implemented by an object that can marshal itself into a textual form. MarshalText encodes the receiver into UTF-8-encoded text and returns the result. ``` type TextMarshaler interface { MarshalText() (text []byte, err error) } ``` type TextUnmarshaler 1.2 ------------------------ TextUnmarshaler is the interface implemented by an object that can unmarshal a textual representation of itself. UnmarshalText must be able to decode the form generated by MarshalText. UnmarshalText must copy the text if it wishes to retain the text after returning. ``` type TextUnmarshaler interface { UnmarshalText(text []byte) error } ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [ascii85](ascii85/index) | Package ascii85 implements the ascii85 data encoding as used in the btoa tool and Adobe's PostScript and PDF document formats. | | [asn1](asn1/index) | Package asn1 implements parsing of DER-encoded ASN.1 data structures, as defined in ITU-T Rec X.690. | | [base32](base32/index) | Package base32 implements base32 encoding as specified by RFC 4648. | | [base64](base64/index) | Package base64 implements base64 encoding as specified by RFC 4648. | | [binary](binary/index) | Package binary implements simple translation between numbers and byte sequences and encoding and decoding of varints. | | [csv](csv/index) | Package csv reads and writes comma-separated values (CSV) files. | | [gob](gob/index) | Package gob manages streams of gobs - binary values exchanged between an Encoder (transmitter) and a Decoder (receiver). | | [hex](hex/index) | Package hex implements hexadecimal encoding and decoding. | | [json](json/index) | Package json implements encoding and decoding of JSON as defined in RFC 7159. 
| | [pem](pem/index) | Package pem implements the PEM data encoding, which originated in Privacy Enhanced Mail. | | [xml](xml/index) | Package xml implements a simple XML 1.0 parser that understands XML name spaces. | go Package asn1 Package asn1 ============= * `import "encoding/asn1"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package asn1 implements parsing of DER-encoded ASN.1 data structures, as defined in ITU-T Rec X.690. See also “A Layman's Guide to a Subset of ASN.1, BER, and DER,” <http://luca.ntop.org/Teaching/Appunti/asn1.html>. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func Marshal(val any) ([]byte, error)](#Marshal) * [func MarshalWithParams(val any, params string) ([]byte, error)](#MarshalWithParams) * [func Unmarshal(b []byte, val any) (rest []byte, err error)](#Unmarshal) * [func UnmarshalWithParams(b []byte, val any, params string) (rest []byte, err error)](#UnmarshalWithParams) * [type BitString](#BitString) * [func (b BitString) At(i int) int](#BitString.At) * [func (b BitString) RightAlign() []byte](#BitString.RightAlign) * [type Enumerated](#Enumerated) * [type Flag](#Flag) * [type ObjectIdentifier](#ObjectIdentifier) * [func (oi ObjectIdentifier) Equal(other ObjectIdentifier) bool](#ObjectIdentifier.Equal) * [func (oi ObjectIdentifier) String() string](#ObjectIdentifier.String) * [type RawContent](#RawContent) * [type RawValue](#RawValue) * [type StructuralError](#StructuralError) * [func (e StructuralError) Error() string](#StructuralError.Error) * [type SyntaxError](#SyntaxError) * [func (e SyntaxError) Error() string](#SyntaxError.Error) ### Package files asn1.go common.go marshal.go Constants --------- ASN.1 tags represent the type of the following object. ``` const ( TagBoolean = 1 TagInteger = 2 TagBitString = 3 TagOctetString = 4 TagNull = 5 TagOID = 6 TagEnum = 10 TagUTF8String = 12 TagSequence = 16 TagSet = 17 TagNumericString = 18 TagPrintableString = 19 TagT61String = 20 TagIA5String = 22 TagUTCTime = 23 TagGeneralizedTime = 24 TagGeneralString = 27 TagBMPString = 30 ) ``` ASN.1 class types represent the namespace of the tag. ``` const ( ClassUniversal = 0 ClassApplication = 1 ClassContextSpecific = 2 ClassPrivate = 3 ) ``` Variables --------- NullBytes contains bytes representing the DER-encoded ASN.1 NULL type. ``` var NullBytes = []byte{TagNull, 0} ``` NullRawValue is a RawValue with its Tag set to the ASN.1 NULL type tag (5). ``` var NullRawValue = RawValue{Tag: TagNull} ``` func Marshal ------------ ``` func Marshal(val any) ([]byte, error) ``` Marshal returns the ASN.1 encoding of val. In addition to the struct tags recognised by Unmarshal, the following can be used: ``` ia5: causes strings to be marshaled as ASN.1, IA5String values omitempty: causes empty slices to be skipped printable: causes strings to be marshaled as ASN.1, PrintableString values utf8: causes strings to be marshaled as ASN.1, UTF8String values utc: causes time.Time to be marshaled as ASN.1, UTCTime values generalized: causes time.Time to be marshaled as ASN.1, GeneralizedTime values ``` func MarshalWithParams 1.10 --------------------------- ``` func MarshalWithParams(val any, params string) ([]byte, error) ``` MarshalWithParams allows field parameters to be specified for the top-level element. The form of the params is the same as the field tags. 
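Neither Marshal nor Unmarshal (described below) is accompanied by an example here, so the following is a minimal round-trip sketch; the Message type and its field values are made up for illustration.

```
package main

import (
	"encoding/asn1"
	"fmt"
	"log"
)

// Message is an illustrative type; its exported fields are encoded
// as an ASN.1 SEQUENCE, with the string marshaled as a UTF8String.
type Message struct {
	ID   int
	Name string `asn1:"utf8"`
}

func main() {
	der, err := asn1.Marshal(Message{ID: 7, Name: "gopher"})
	if err != nil {
		log.Fatal(err)
	}
	var m Message
	rest, err := asn1.Unmarshal(der, &m)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("decoded %+v, %d leftover bytes\n", m, len(rest))
}
```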
func Unmarshal -------------- ``` func Unmarshal(b []byte, val any) (rest []byte, err error) ``` Unmarshal parses the DER-encoded ASN.1 data structure b and uses the reflect package to fill in an arbitrary value pointed at by val. Because Unmarshal uses the reflect package, the structs being written to must use upper case field names. If val is nil or not a pointer, Unmarshal returns an error. After parsing b, any bytes that were leftover and not used to fill val will be returned in rest. When parsing a SEQUENCE into a struct, any trailing elements of the SEQUENCE that do not have matching fields in val will not be included in rest, as these are considered valid elements of the SEQUENCE and not trailing data. An ASN.1 INTEGER can be written to an int, int32, int64, or \*big.Int (from the math/big package). If the encoded value does not fit in the Go type, Unmarshal returns a parse error. An ASN.1 BIT STRING can be written to a BitString. An ASN.1 OCTET STRING can be written to a []byte. An ASN.1 OBJECT IDENTIFIER can be written to an ObjectIdentifier. An ASN.1 ENUMERATED can be written to an Enumerated. An ASN.1 UTCTIME or GENERALIZEDTIME can be written to a time.Time. An ASN.1 PrintableString, IA5String, or NumericString can be written to a string. Any of the above ASN.1 values can be written to an interface{}. The value stored in the interface has the corresponding Go type. For integers, that type is int64. An ASN.1 SEQUENCE OF x or SET OF x can be written to a slice if an x can be written to the slice's element type. An ASN.1 SEQUENCE or SET can be written to a struct if each of the elements in the sequence can be written to the corresponding element in the struct. The following tags on struct fields have special meaning to Unmarshal: ``` application specifies that an APPLICATION tag is used private specifies that a PRIVATE tag is used default:x sets the default value for optional integer fields (only used if optional is also present) explicit specifies that an additional, explicit tag wraps the implicit one optional marks the field as ASN.1 OPTIONAL set causes a SET, rather than a SEQUENCE type to be expected tag:x specifies the ASN.1 tag number; implies ASN.1 CONTEXT SPECIFIC ``` When decoding an ASN.1 value with an IMPLICIT tag into a string field, Unmarshal will default to a PrintableString, which doesn't support characters such as '@' and '&'. To force other encodings, use the following tags: ``` ia5 causes strings to be unmarshaled as ASN.1 IA5String values numeric causes strings to be unmarshaled as ASN.1 NumericString values utf8 causes strings to be unmarshaled as ASN.1 UTF8String values ``` If the type of the first field of a structure is RawContent then the raw ASN1 contents of the struct will be stored in it. If the name of a slice type ends with "SET" then it's treated as if the "set" tag was set on it. This results in interpreting the type as a SET OF x rather than a SEQUENCE OF x. This can be used with nested slices where a struct tag cannot be given. Other ASN.1 types are not supported; if it encounters them, Unmarshal returns a parse error. func UnmarshalWithParams ------------------------ ``` func UnmarshalWithParams(b []byte, val any, params string) (rest []byte, err error) ``` UnmarshalWithParams allows field parameters to be specified for the top-level element. The form of the params is the same as the field tags. type BitString -------------- BitString is the structure to use when you want an ASN.1 BIT STRING type. 
A bit string is padded up to the nearest byte in memory and the number of valid bits is recorded. Padding bits will be zero. ``` type BitString struct { Bytes []byte // bits packed into bytes. BitLength int // length in bits. } ``` ### func (BitString) At ``` func (b BitString) At(i int) int ``` At returns the bit at the given index. If the index is out of range it returns 0. ### func (BitString) RightAlign ``` func (b BitString) RightAlign() []byte ``` RightAlign returns a slice where the padding bits are at the beginning. The slice may share memory with the BitString. type Enumerated --------------- An Enumerated is represented as a plain int. ``` type Enumerated int ``` type Flag --------- A Flag accepts any data and is set to true if present. ``` type Flag bool ``` type ObjectIdentifier --------------------- An ObjectIdentifier represents an ASN.1 OBJECT IDENTIFIER. ``` type ObjectIdentifier []int ``` ### func (ObjectIdentifier) Equal ``` func (oi ObjectIdentifier) Equal(other ObjectIdentifier) bool ``` Equal reports whether oi and other represent the same identifier. ### func (ObjectIdentifier) String 1.3 ``` func (oi ObjectIdentifier) String() string ``` type RawContent --------------- RawContent is used to signal that the undecoded, DER data needs to be preserved for a struct. To use it, the first field of the struct must have this type. It's an error for any of the other fields to have this type. ``` type RawContent []byte ``` type RawValue ------------- A RawValue represents an undecoded ASN.1 object. ``` type RawValue struct { Class, Tag int IsCompound bool Bytes []byte FullBytes []byte // includes the tag and length } ``` type StructuralError -------------------- A StructuralError suggests that the ASN.1 data is valid, but the Go type which is receiving it doesn't match. ``` type StructuralError struct { Msg string } ``` ### func (StructuralError) Error ``` func (e StructuralError) Error() string ``` type SyntaxError ---------------- A SyntaxError suggests that the ASN.1 data is invalid. ``` type SyntaxError struct { Msg string } ``` ### func (SyntaxError) Error ``` func (e SyntaxError) Error() string ``` go Package pem Package pem ============ * `import "encoding/pem"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package pem implements the PEM data encoding, which originated in Privacy Enhanced Mail. The most common use of PEM encoding today is in TLS keys and certificates. See RFC 1421. Index ----- * [func Encode(out io.Writer, b \*Block) error](#Encode) * [func EncodeToMemory(b \*Block) []byte](#EncodeToMemory) * [type Block](#Block) * [func Decode(data []byte) (p \*Block, rest []byte)](#Decode) ### Examples [Decode](#example_Decode) [Encode](#example_Encode) ### Package files pem.go func Encode ----------- ``` func Encode(out io.Writer, b *Block) error ``` Encode writes the PEM encoding of b to out. #### Example Code: ``` block := &pem.Block{ Type: "MESSAGE", Headers: map[string]string{ "Animal": "Gopher", }, Bytes: []byte("test"), } if err := pem.Encode(os.Stdout, block); err != nil { log.Fatal(err) } ``` Output: ``` -----BEGIN MESSAGE----- Animal: Gopher dGVzdA== -----END MESSAGE----- ``` func EncodeToMemory ------------------- ``` func EncodeToMemory(b *Block) []byte ``` EncodeToMemory returns the PEM encoding of b. If b has invalid headers and cannot be encoded, EncodeToMemory returns nil. If it is important to report details about this error case, use Encode instead. 
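A minimal sketch of EncodeToMemory, mirroring the Encode example above but returning the encoding as a byte slice; the block contents are made up for illustration.

```
package main

import (
	"encoding/pem"
	"fmt"
)

func main() {
	// EncodeToMemory returns nil only if the block has invalid headers;
	// this block has none, so the result is a complete PEM encoding.
	data := pem.EncodeToMemory(&pem.Block{
		Type:  "MESSAGE",
		Bytes: []byte("test"),
	})
	fmt.Print(string(data))
}
```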
type Block ---------- A Block represents a PEM encoded structure. The encoded form is: ``` -----BEGIN Type----- Headers base64-encoded Bytes -----END Type----- ``` where Headers is a possibly empty sequence of Key: Value lines. ``` type Block struct { Type string // The type, taken from the preamble (i.e. "RSA PRIVATE KEY"). Headers map[string]string // Optional headers. Bytes []byte // The decoded bytes of the contents. Typically a DER encoded ASN.1 structure. } ``` ### func Decode ``` func Decode(data []byte) (p *Block, rest []byte) ``` Decode will find the next PEM formatted block (certificate, private key etc) in the input. It returns that block and the remainder of the input. If no PEM data is found, p is nil and the whole of the input is returned in rest. #### Example Code: ``` var pubPEMData = []byte(` -----BEGIN PUBLIC KEY----- MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAlRuRnThUjU8/prwYxbty WPT9pURI3lbsKMiB6Fn/VHOKE13p4D8xgOCADpdRagdT6n4etr9atzDKUSvpMtR3 CP5noNc97WiNCggBjVWhs7szEe8ugyqF23XwpHQ6uV1LKH50m92MbOWfCtjU9p/x qhNpQQ1AZhqNy5Gevap5k8XzRmjSldNAFZMY7Yv3Gi+nyCwGwpVtBUwhuLzgNFK/ yDtw2WcWmUU7NuC8Q6MWvPebxVtCfVp/iQU6q60yyt6aGOBkhAX0LpKAEhKidixY nP9PNVBvxgu3XZ4P36gZV6+ummKdBVnc3NqwBLu5+CcdRdusmHPHd5pHf4/38Z3/ 6qU2a/fPvWzceVTEgZ47QjFMTCTmCwNt29cvi7zZeQzjtwQgn4ipN9NibRH/Ax/q TbIzHfrJ1xa2RteWSdFjwtxi9C20HUkjXSeI4YlzQMH0fPX6KCE7aVePTOnB69I/ a9/q96DiXZajwlpq3wFctrs1oXqBp5DVrCIj8hU2wNgB7LtQ1mCtsYz//heai0K9 PhE4X6hiE0YmeAZjR0uHl8M/5aW9xCoJ72+12kKpWAa0SFRWLy6FejNYCYpkupVJ yecLk/4L1W0l6jQQZnWErXZYe0PNFcmwGXy1Rep83kfBRNKRy5tvocalLlwXLdUk AIU+2GKjyT3iMuzZxxFxPFMCAwEAAQ== -----END PUBLIC KEY----- and some more`) block, rest := pem.Decode(pubPEMData) if block == nil || block.Type != "PUBLIC KEY" { log.Fatal("failed to decode PEM block containing public key") } pub, err := x509.ParsePKIXPublicKey(block.Bytes) if err != nil { log.Fatal(err) } fmt.Printf("Got a %T, with remaining data: %q", pub, rest) ``` Output: ``` Got a *rsa.PublicKey, with remaining data: "and some more" ```
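Because Decode returns the unconsumed remainder of its input, repeated calls walk through every PEM block in a buffer. The following is a minimal sketch with made-up block contents, not an official example.

```
package main

import (
	"encoding/pem"
	"fmt"
)

func main() {
	// Two small blocks back to back, built with EncodeToMemory.
	data := pem.EncodeToMemory(&pem.Block{Type: "MESSAGE", Bytes: []byte("one")})
	data = append(data, pem.EncodeToMemory(&pem.Block{Type: "MESSAGE", Bytes: []byte("two")})...)

	// Decode returns a nil block once no further PEM data is found.
	for {
		var block *pem.Block
		block, data = pem.Decode(data)
		if block == nil {
			break
		}
		fmt.Printf("block %q: %s\n", block.Type, block.Bytes)
	}
}
```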
go Package hex Package hex ============ * `import "encoding/hex"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package hex implements hexadecimal encoding and decoding. Index ----- * [Variables](#pkg-variables) * [func Decode(dst, src []byte) (int, error)](#Decode) * [func DecodeString(s string) ([]byte, error)](#DecodeString) * [func DecodedLen(x int) int](#DecodedLen) * [func Dump(data []byte) string](#Dump) * [func Dumper(w io.Writer) io.WriteCloser](#Dumper) * [func Encode(dst, src []byte) int](#Encode) * [func EncodeToString(src []byte) string](#EncodeToString) * [func EncodedLen(n int) int](#EncodedLen) * [func NewDecoder(r io.Reader) io.Reader](#NewDecoder) * [func NewEncoder(w io.Writer) io.Writer](#NewEncoder) * [type InvalidByteError](#InvalidByteError) * [func (e InvalidByteError) Error() string](#InvalidByteError.Error) ### Examples [Decode](#example_Decode) [DecodeString](#example_DecodeString) [Dump](#example_Dump) [Dumper](#example_Dumper) [Encode](#example_Encode) [EncodeToString](#example_EncodeToString) ### Package files hex.go Variables --------- ErrLength reports an attempt to decode an odd-length input using Decode or DecodeString. The stream-based Decoder returns io.ErrUnexpectedEOF instead of ErrLength. ``` var ErrLength = errors.New("encoding/hex: odd length hex string") ``` func Decode ----------- ``` func Decode(dst, src []byte) (int, error) ``` Decode decodes src into DecodedLen(len(src)) bytes, returning the actual number of bytes written to dst. Decode expects that src contains only hexadecimal characters and that src has even length. If the input is malformed, Decode returns the number of bytes decoded before the error. #### Example Code: ``` src := []byte("48656c6c6f20476f7068657221") dst := make([]byte, hex.DecodedLen(len(src))) n, err := hex.Decode(dst, src) if err != nil { log.Fatal(err) } fmt.Printf("%s\n", dst[:n]) ``` Output: ``` Hello Gopher! ``` func DecodeString ----------------- ``` func DecodeString(s string) ([]byte, error) ``` DecodeString returns the bytes represented by the hexadecimal string s. DecodeString expects that src contains only hexadecimal characters and that src has even length. If the input is malformed, DecodeString returns the bytes decoded before the error. #### Example Code: ``` const s = "48656c6c6f20476f7068657221" decoded, err := hex.DecodeString(s) if err != nil { log.Fatal(err) } fmt.Printf("%s\n", decoded) ``` Output: ``` Hello Gopher! ``` func DecodedLen --------------- ``` func DecodedLen(x int) int ``` DecodedLen returns the length of a decoding of x source bytes. Specifically, it returns x / 2. func Dump --------- ``` func Dump(data []byte) string ``` Dump returns a string that contains a hex dump of the given data. The format of the hex dump matches the output of `hexdump -C` on the command line. #### Example Code: ``` content := []byte("Go is an open source programming language.") fmt.Printf("%s", hex.Dump(content)) ``` Output: ``` 00000000 47 6f 20 69 73 20 61 6e 20 6f 70 65 6e 20 73 6f |Go is an open so| 00000010 75 72 63 65 20 70 72 6f 67 72 61 6d 6d 69 6e 67 |urce programming| 00000020 20 6c 61 6e 67 75 61 67 65 2e | language.| ``` func Dumper ----------- ``` func Dumper(w io.Writer) io.WriteCloser ``` Dumper returns a WriteCloser that writes a hex dump of all written data to w. The format of the dump matches the output of `hexdump -C` on the command line. 
#### Example Code: ``` lines := []string{ "Go is an open source programming language.", "\n", "We encourage all Go users to subscribe to golang-announce.", } stdoutDumper := hex.Dumper(os.Stdout) defer stdoutDumper.Close() for _, line := range lines { stdoutDumper.Write([]byte(line)) } ``` Output: ``` 00000000 47 6f 20 69 73 20 61 6e 20 6f 70 65 6e 20 73 6f |Go is an open so| 00000010 75 72 63 65 20 70 72 6f 67 72 61 6d 6d 69 6e 67 |urce programming| 00000020 20 6c 61 6e 67 75 61 67 65 2e 0a 57 65 20 65 6e | language..We en| 00000030 63 6f 75 72 61 67 65 20 61 6c 6c 20 47 6f 20 75 |courage all Go u| 00000040 73 65 72 73 20 74 6f 20 73 75 62 73 63 72 69 62 |sers to subscrib| 00000050 65 20 74 6f 20 67 6f 6c 61 6e 67 2d 61 6e 6e 6f |e to golang-anno| 00000060 75 6e 63 65 2e |unce.| ``` func Encode ----------- ``` func Encode(dst, src []byte) int ``` Encode encodes src into EncodedLen(len(src)) bytes of dst. As a convenience, it returns the number of bytes written to dst, but this value is always EncodedLen(len(src)). Encode implements hexadecimal encoding. #### Example Code: ``` src := []byte("Hello Gopher!") dst := make([]byte, hex.EncodedLen(len(src))) hex.Encode(dst, src) fmt.Printf("%s\n", dst) ``` Output: ``` 48656c6c6f20476f7068657221 ``` func EncodeToString ------------------- ``` func EncodeToString(src []byte) string ``` EncodeToString returns the hexadecimal encoding of src. #### Example Code: ``` src := []byte("Hello") encodedStr := hex.EncodeToString(src) fmt.Printf("%s\n", encodedStr) ``` Output: ``` 48656c6c6f ``` func EncodedLen --------------- ``` func EncodedLen(n int) int ``` EncodedLen returns the length of an encoding of n source bytes. Specifically, it returns n \* 2. func NewDecoder 1.10 -------------------- ``` func NewDecoder(r io.Reader) io.Reader ``` NewDecoder returns an io.Reader that decodes hexadecimal characters from r. NewDecoder expects that r contain only an even number of hexadecimal characters. func NewEncoder 1.10 -------------------- ``` func NewEncoder(w io.Writer) io.Writer ``` NewEncoder returns an io.Writer that writes lowercase hexadecimal characters to w. type InvalidByteError --------------------- InvalidByteError values describe errors resulting from an invalid byte in a hex string. ``` type InvalidByteError byte ``` ### func (InvalidByteError) Error ``` func (e InvalidByteError) Error() string ``` go Package ascii85 Package ascii85 ================ * `import "encoding/ascii85"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package ascii85 implements the ascii85 data encoding as used in the btoa tool and Adobe's PostScript and PDF document formats. Index ----- * [func Decode(dst, src []byte, flush bool) (ndst, nsrc int, err error)](#Decode) * [func Encode(dst, src []byte) int](#Encode) * [func MaxEncodedLen(n int) int](#MaxEncodedLen) * [func NewDecoder(r io.Reader) io.Reader](#NewDecoder) * [func NewEncoder(w io.Writer) io.WriteCloser](#NewEncoder) * [type CorruptInputError](#CorruptInputError) * [func (e CorruptInputError) Error() string](#CorruptInputError.Error) ### Package files ascii85.go func Decode ----------- ``` func Decode(dst, src []byte, flush bool) (ndst, nsrc int, err error) ``` Decode decodes src into dst, returning both the number of bytes written to dst and the number consumed from src. If src contains invalid ascii85 data, Decode will return the number of bytes successfully written and a CorruptInputError. Decode ignores space and control characters in src. 
Often, ascii85-encoded data is wrapped in <~ and ~> symbols. Decode expects these to have been stripped by the caller. If flush is true, Decode assumes that src represents the end of the input stream and processes it completely rather than wait for the completion of another 32-bit block. NewDecoder wraps an io.Reader interface around Decode. func Encode ----------- ``` func Encode(dst, src []byte) int ``` Encode encodes src into at most MaxEncodedLen(len(src)) bytes of dst, returning the actual number of bytes written. The encoding handles 4-byte chunks, using a special encoding for the last fragment, so Encode is not appropriate for use on individual blocks of a large data stream. Use NewEncoder() instead. Often, ascii85-encoded data is wrapped in <~ and ~> symbols. Encode does not add these. func MaxEncodedLen ------------------ ``` func MaxEncodedLen(n int) int ``` MaxEncodedLen returns the maximum length of an encoding of n source bytes. func NewDecoder --------------- ``` func NewDecoder(r io.Reader) io.Reader ``` NewDecoder constructs a new ascii85 stream decoder. func NewEncoder --------------- ``` func NewEncoder(w io.Writer) io.WriteCloser ``` NewEncoder returns a new ascii85 stream encoder. Data written to the returned writer will be encoded and then written to w. Ascii85 encodings operate in 32-bit blocks; when finished writing, the caller must Close the returned encoder to flush any trailing partial block. type CorruptInputError ---------------------- ``` type CorruptInputError int64 ``` ### func (CorruptInputError) Error ``` func (e CorruptInputError) Error() string ``` go Package base64 Package base64 =============== * `import "encoding/base64"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package base64 implements base64 encoding as specified by RFC 4648. 
#### Example Code: ``` msg := "Hello, 世界" encoded := base64.StdEncoding.EncodeToString([]byte(msg)) fmt.Println(encoded) decoded, err := base64.StdEncoding.DecodeString(encoded) if err != nil { fmt.Println("decode error:", err) return } fmt.Println(string(decoded)) ``` Output: ``` SGVsbG8sIOS4lueVjA== Hello, 世界 ``` Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func NewDecoder(enc \*Encoding, r io.Reader) io.Reader](#NewDecoder) * [func NewEncoder(enc \*Encoding, w io.Writer) io.WriteCloser](#NewEncoder) * [type CorruptInputError](#CorruptInputError) * [func (e CorruptInputError) Error() string](#CorruptInputError.Error) * [type Encoding](#Encoding) * [func NewEncoding(encoder string) \*Encoding](#NewEncoding) * [func (enc \*Encoding) Decode(dst, src []byte) (n int, err error)](#Encoding.Decode) * [func (enc \*Encoding) DecodeString(s string) ([]byte, error)](#Encoding.DecodeString) * [func (enc \*Encoding) DecodedLen(n int) int](#Encoding.DecodedLen) * [func (enc \*Encoding) Encode(dst, src []byte)](#Encoding.Encode) * [func (enc \*Encoding) EncodeToString(src []byte) string](#Encoding.EncodeToString) * [func (enc \*Encoding) EncodedLen(n int) int](#Encoding.EncodedLen) * [func (enc Encoding) Strict() \*Encoding](#Encoding.Strict) * [func (enc Encoding) WithPadding(padding rune) \*Encoding](#Encoding.WithPadding) ### Examples [Package](#example_) [Encoding.Decode](#example_Encoding_Decode) [Encoding.DecodeString](#example_Encoding_DecodeString) [Encoding.Encode](#example_Encoding_Encode) [Encoding.EncodeToString](#example_Encoding_EncodeToString) [NewEncoder](#example_NewEncoder) ### Package files base64.go Constants --------- ``` const ( StdPadding rune = '=' // Standard padding character NoPadding rune = -1 // No padding ) ``` Variables --------- RawStdEncoding is the standard raw, unpadded base64 encoding, as defined in RFC 4648 section 3.2. This is the same as StdEncoding but omits padding characters. ``` var RawStdEncoding = StdEncoding.WithPadding(NoPadding) ``` RawURLEncoding is the unpadded alternate base64 encoding defined in RFC 4648. It is typically used in URLs and file names. This is the same as URLEncoding but omits padding characters. ``` var RawURLEncoding = URLEncoding.WithPadding(NoPadding) ``` StdEncoding is the standard base64 encoding, as defined in RFC 4648. ``` var StdEncoding = NewEncoding(encodeStd) ``` URLEncoding is the alternate base64 encoding defined in RFC 4648. It is typically used in URLs and file names. ``` var URLEncoding = NewEncoding(encodeURL) ``` func NewDecoder --------------- ``` func NewDecoder(enc *Encoding, r io.Reader) io.Reader ``` NewDecoder constructs a new base64 stream decoder. func NewEncoder --------------- ``` func NewEncoder(enc *Encoding, w io.Writer) io.WriteCloser ``` NewEncoder returns a new base64 stream encoder. Data written to the returned writer will be encoded using enc and then written to w. Base64 encodings operate in 4-byte blocks; when finished writing, the caller must Close the returned encoder to flush any partially written blocks. #### Example Code: ``` input := []byte("foo\x00bar") encoder := base64.NewEncoder(base64.StdEncoding, os.Stdout) encoder.Write(input) // Must close the encoder when finished to flush any partial blocks. // If you comment out the following line, the last partial block "r" // won't be encoded. 
encoder.Close() ``` Output: ``` Zm9vAGJhcg== ``` type CorruptInputError ---------------------- ``` type CorruptInputError int64 ``` ### func (CorruptInputError) Error ``` func (e CorruptInputError) Error() string ``` type Encoding ------------- An Encoding is a radix 64 encoding/decoding scheme, defined by a 64-character alphabet. The most common encoding is the "base64" encoding defined in RFC 4648 and used in MIME (RFC 2045) and PEM (RFC 1421). RFC 4648 also defines an alternate encoding, which is the standard encoding with - and \_ substituted for + and /. ``` type Encoding struct { // contains filtered or unexported fields } ``` ### func NewEncoding ``` func NewEncoding(encoder string) *Encoding ``` NewEncoding returns a new padded Encoding defined by the given alphabet, which must be a 64-byte string that does not contain the padding character or CR / LF ('\r', '\n'). The resulting Encoding uses the default padding character ('='), which may be changed or disabled via WithPadding. ### func (\*Encoding) Decode ``` func (enc *Encoding) Decode(dst, src []byte) (n int, err error) ``` Decode decodes src using the encoding enc. It writes at most DecodedLen(len(src)) bytes to dst and returns the number of bytes written. If src contains invalid base64 data, it will return the number of bytes successfully written and CorruptInputError. New line characters (\r and \n) are ignored. #### Example Code: ``` str := "SGVsbG8sIHdvcmxkIQ==" dst := make([]byte, base64.StdEncoding.DecodedLen(len(str))) n, err := base64.StdEncoding.Decode(dst, []byte(str)) if err != nil { fmt.Println("decode error:", err) return } dst = dst[:n] fmt.Printf("%q\n", dst) ``` Output: ``` "Hello, world!" ``` ### func (\*Encoding) DecodeString ``` func (enc *Encoding) DecodeString(s string) ([]byte, error) ``` DecodeString returns the bytes represented by the base64 string s. #### Example Code: ``` str := "c29tZSBkYXRhIHdpdGggACBhbmQg77u/" data, err := base64.StdEncoding.DecodeString(str) if err != nil { fmt.Println("error:", err) return } fmt.Printf("%q\n", data) ``` Output: ``` "some data with \x00 and \ufeff" ``` ### func (\*Encoding) DecodedLen ``` func (enc *Encoding) DecodedLen(n int) int ``` DecodedLen returns the maximum length in bytes of the decoded data corresponding to n bytes of base64-encoded data. ### func (\*Encoding) Encode ``` func (enc *Encoding) Encode(dst, src []byte) ``` Encode encodes src using the encoding enc, writing EncodedLen(len(src)) bytes to dst. The encoding pads the output to a multiple of 4 bytes, so Encode is not appropriate for use on individual blocks of a large data stream. Use NewEncoder() instead. #### Example Code: ``` data := []byte("Hello, world!") dst := make([]byte, base64.StdEncoding.EncodedLen(len(data))) base64.StdEncoding.Encode(dst, data) fmt.Println(string(dst)) ``` Output: ``` SGVsbG8sIHdvcmxkIQ== ``` ### func (\*Encoding) EncodeToString ``` func (enc *Encoding) EncodeToString(src []byte) string ``` EncodeToString returns the base64 encoding of src. #### Example Code: ``` data := []byte("any + old & data") str := base64.StdEncoding.EncodeToString(data) fmt.Println(str) ``` Output: ``` YW55ICsgb2xkICYgZGF0YQ== ``` ### func (\*Encoding) EncodedLen ``` func (enc *Encoding) EncodedLen(n int) int ``` EncodedLen returns the length in bytes of the base64 encoding of an input buffer of length n. ### func (Encoding) Strict 1.8 ``` func (enc Encoding) Strict() *Encoding ``` Strict creates a new encoding identical to enc except with strict decoding enabled. 
In this mode, the decoder requires that trailing padding bits are zero, as described in RFC 4648 section 3.5. Note that the input is still malleable, as new line characters (CR and LF) are still ignored. ### func (Encoding) WithPadding 1.5 ``` func (enc Encoding) WithPadding(padding rune) *Encoding ``` WithPadding creates a new encoding identical to enc except with a specified padding character, or NoPadding to disable padding. The padding character must not be '\r' or '\n', must not be contained in the encoding's alphabet and must be a rune equal or below '\xff'. go Package xml Package xml ============ * `import "encoding/xml"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package xml implements a simple XML 1.0 parser that understands XML name spaces. #### Example (CustomMarshalXML) Code: ``` package xml_test import ( "encoding/xml" "fmt" "log" "strings" ) type Animal int const ( Unknown Animal = iota Gopher Zebra ) func (a *Animal) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error { var s string if err := d.DecodeElement(&s, &start); err != nil { return err } switch strings.ToLower(s) { default: *a = Unknown case "gopher": *a = Gopher case "zebra": *a = Zebra } return nil } func (a Animal) MarshalXML(e *xml.Encoder, start xml.StartElement) error { var s string switch a { default: s = "unknown" case Gopher: s = "gopher" case Zebra: s = "zebra" } return e.EncodeElement(s, start) } func Example_customMarshalXML() { blob := ` <animals> <animal>gopher</animal> <animal>armadillo</animal> <animal>zebra</animal> <animal>unknown</animal> <animal>gopher</animal> <animal>bee</animal> <animal>gopher</animal> <animal>zebra</animal> </animals>` var zoo struct { Animals []Animal `xml:"animal"` } if err := xml.Unmarshal([]byte(blob), &zoo); err != nil { log.Fatal(err) } census := make(map[Animal]int) for _, animal := range zoo.Animals { census[animal] += 1 } fmt.Printf("Zoo Census:\n* Gophers: %d\n* Zebras: %d\n* Unknown: %d\n", census[Gopher], census[Zebra], census[Unknown]) // Output: // Zoo Census: // * Gophers: 3 // * Zebras: 2 // * Unknown: 3 } ``` #### Example (TextMarshalXML) Code: ``` package xml_test import ( "encoding/xml" "fmt" "log" "strings" ) type Size int const ( Unrecognized Size = iota Small Large ) func (s *Size) UnmarshalText(text []byte) error { switch strings.ToLower(string(text)) { default: *s = Unrecognized case "small": *s = Small case "large": *s = Large } return nil } func (s Size) MarshalText() ([]byte, error) { var name string switch s { default: name = "unrecognized" case Small: name = "small" case Large: name = "large" } return []byte(name), nil } func Example_textMarshalXML() { blob := ` <sizes> <size>small</size> <size>regular</size> <size>large</size> <size>unrecognized</size> <size>small</size> <size>normal</size> <size>small</size> <size>large</size> </sizes>` var inventory struct { Sizes []Size `xml:"size"` } if err := xml.Unmarshal([]byte(blob), &inventory); err != nil { log.Fatal(err) } counts := make(map[Size]int) for _, size := range inventory.Sizes { counts[size] += 1 } fmt.Printf("Inventory Counts:\n* Small: %d\n* Large: %d\n* Unrecognized: %d\n", counts[Small], counts[Large], counts[Unrecognized]) // Output: // Inventory Counts: // * Small: 3 // * Large: 2 // * Unrecognized: 3 } ``` Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func Escape(w io.Writer, s []byte)](#Escape) * [func EscapeText(w io.Writer, s []byte) error](#EscapeText) * [func Marshal(v any) ([]byte, 
error)](#Marshal) * [func MarshalIndent(v any, prefix, indent string) ([]byte, error)](#MarshalIndent) * [func Unmarshal(data []byte, v any) error](#Unmarshal) * [type Attr](#Attr) * [type CharData](#CharData) * [func (c CharData) Copy() CharData](#CharData.Copy) * [type Comment](#Comment) * [func (c Comment) Copy() Comment](#Comment.Copy) * [type Decoder](#Decoder) * [func NewDecoder(r io.Reader) \*Decoder](#NewDecoder) * [func NewTokenDecoder(t TokenReader) \*Decoder](#NewTokenDecoder) * [func (d \*Decoder) Decode(v any) error](#Decoder.Decode) * [func (d \*Decoder) DecodeElement(v any, start \*StartElement) error](#Decoder.DecodeElement) * [func (d \*Decoder) InputOffset() int64](#Decoder.InputOffset) * [func (d \*Decoder) InputPos() (line, column int)](#Decoder.InputPos) * [func (d \*Decoder) RawToken() (Token, error)](#Decoder.RawToken) * [func (d \*Decoder) Skip() error](#Decoder.Skip) * [func (d \*Decoder) Token() (Token, error)](#Decoder.Token) * [type Directive](#Directive) * [func (d Directive) Copy() Directive](#Directive.Copy) * [type Encoder](#Encoder) * [func NewEncoder(w io.Writer) \*Encoder](#NewEncoder) * [func (enc \*Encoder) Close() error](#Encoder.Close) * [func (enc \*Encoder) Encode(v any) error](#Encoder.Encode) * [func (enc \*Encoder) EncodeElement(v any, start StartElement) error](#Encoder.EncodeElement) * [func (enc \*Encoder) EncodeToken(t Token) error](#Encoder.EncodeToken) * [func (enc \*Encoder) Flush() error](#Encoder.Flush) * [func (enc \*Encoder) Indent(prefix, indent string)](#Encoder.Indent) * [type EndElement](#EndElement) * [type Marshaler](#Marshaler) * [type MarshalerAttr](#MarshalerAttr) * [type Name](#Name) * [type ProcInst](#ProcInst) * [func (p ProcInst) Copy() ProcInst](#ProcInst.Copy) * [type StartElement](#StartElement) * [func (e StartElement) Copy() StartElement](#StartElement.Copy) * [func (e StartElement) End() EndElement](#StartElement.End) * [type SyntaxError](#SyntaxError) * [func (e \*SyntaxError) Error() string](#SyntaxError.Error) * [type TagPathError](#TagPathError) * [func (e \*TagPathError) Error() string](#TagPathError.Error) * [type Token](#Token) * [func CopyToken(t Token) Token](#CopyToken) * [type TokenReader](#TokenReader) * [type UnmarshalError](#UnmarshalError) * [func (e UnmarshalError) Error() string](#UnmarshalError.Error) * [type Unmarshaler](#Unmarshaler) * [type UnmarshalerAttr](#UnmarshalerAttr) * [type UnsupportedTypeError](#UnsupportedTypeError) * [func (e \*UnsupportedTypeError) Error() string](#UnsupportedTypeError.Error) * [Bugs](#pkg-note-BUG) ### Examples [Encoder](#example_Encoder) [MarshalIndent](#example_MarshalIndent) [Unmarshal](#example_Unmarshal) [Package (CustomMarshalXML)](#example__customMarshalXML) [Package (TextMarshalXML)](#example__textMarshalXML) ### Package files marshal.go read.go typeinfo.go xml.go Constants --------- ``` const ( // Header is a generic XML header suitable for use with the output of Marshal. // This is not automatically added to any output of this package, // it is provided as a convenience. Header = `<?xml version="1.0" encoding="UTF-8"?>` + "\n" ) ``` Variables --------- HTMLAutoClose is the set of HTML elements that should be considered to close automatically. See the Decoder.Strict and Decoder.Entity fields' documentation. ``` var HTMLAutoClose []string = htmlAutoClose ``` HTMLEntity is an entity map containing translations for the standard HTML entity characters. See the Decoder.Strict and Decoder.Entity fields' documentation. 
``` var HTMLEntity map[string]string = htmlEntity ``` func Escape ----------- ``` func Escape(w io.Writer, s []byte) ``` Escape is like EscapeText but omits the error return value. It is provided for backwards compatibility with Go 1.0. Code targeting Go 1.1 or later should use EscapeText. func EscapeText 1.1 ------------------- ``` func EscapeText(w io.Writer, s []byte) error ``` EscapeText writes to w the properly escaped XML equivalent of the plain text data s. func Marshal ------------ ``` func Marshal(v any) ([]byte, error) ``` Marshal returns the XML encoding of v. Marshal handles an array or slice by marshaling each of the elements. Marshal handles a pointer by marshaling the value it points at or, if the pointer is nil, by writing nothing. Marshal handles an interface value by marshaling the value it contains or, if the interface value is nil, by writing nothing. Marshal handles all other data by writing one or more XML elements containing the data. The name for the XML elements is taken from, in order of preference: * the tag on the XMLName field, if the data is a struct * the value of the XMLName field of type Name * the tag of the struct field used to obtain the data * the name of the struct field used to obtain the data * the name of the marshaled type The XML element for a struct contains marshaled elements for each of the exported fields of the struct, with these exceptions: * the XMLName field, described above, is omitted. * a field with tag "-" is omitted. * a field with tag "name,attr" becomes an attribute with the given name in the XML element. * a field with tag ",attr" becomes an attribute with the field name in the XML element. * a field with tag ",chardata" is written as character data, not as an XML element. * a field with tag ",cdata" is written as character data wrapped in one or more <![CDATA[ ... ]]> tags, not as an XML element. * a field with tag ",innerxml" is written verbatim, not subject to the usual marshaling procedure. * a field with tag ",comment" is written as an XML comment, not subject to the usual marshaling procedure. It must not contain the "--" string within it. * a field with a tag including the "omitempty" option is omitted if the field value is empty. The empty values are false, 0, any nil pointer or interface value, and any array, slice, map, or string of length zero. * an anonymous struct field is handled as if the fields of its value were part of the outer struct. * a field implementing Marshaler is written by calling its MarshalXML method. * a field implementing encoding.TextMarshaler is written by encoding the result of its MarshalText method as text. If a field uses a tag "a>b>c", then the element c will be nested inside parent elements a and b. Fields that appear next to each other that name the same parent will be enclosed in one XML element. If the XML name for a struct field is defined by both the field tag and the struct's XMLName field, the names must match. See MarshalIndent for an example. Marshal will return an error if asked to marshal a channel, function, or map. func MarshalIndent ------------------ ``` func MarshalIndent(v any, prefix, indent string) ([]byte, error) ``` MarshalIndent works like Marshal, but each XML element begins on a new indented line that starts with prefix and is followed by one or more copies of indent according to the nesting depth. 
#### Example Code: ``` type Address struct { City, State string } type Person struct { XMLName xml.Name `xml:"person"` Id int `xml:"id,attr"` FirstName string `xml:"name>first"` LastName string `xml:"name>last"` Age int `xml:"age"` Height float32 `xml:"height,omitempty"` Married bool Address Comment string `xml:",comment"` } v := &Person{Id: 13, FirstName: "John", LastName: "Doe", Age: 42} v.Comment = " Need more details. " v.Address = Address{"Hanga Roa", "Easter Island"} output, err := xml.MarshalIndent(v, " ", " ") if err != nil { fmt.Printf("error: %v\n", err) } os.Stdout.Write(output) ``` Output: ``` <person id="13"> <name> <first>John</first> <last>Doe</last> </name> <age>42</age> <Married>false</Married> <City>Hanga Roa</City> <State>Easter Island</State> <!-- Need more details. --> </person> ``` func Unmarshal -------------- ``` func Unmarshal(data []byte, v any) error ``` Unmarshal parses the XML-encoded data and stores the result in the value pointed to by v, which must be an arbitrary struct, slice, or string. Well-formed data that does not fit into v is discarded. Because Unmarshal uses the reflect package, it can only assign to exported (upper case) fields. Unmarshal uses a case-sensitive comparison to match XML element names to tag values and struct field names. Unmarshal maps an XML element to a struct using the following rules. In the rules, the tag of a field refers to the value associated with the key 'xml' in the struct field's tag (see the example above). * If the struct has a field of type []byte or string with tag ",innerxml", Unmarshal accumulates the raw XML nested inside the element in that field. The rest of the rules still apply. * If the struct has a field named XMLName of type Name, Unmarshal records the element name in that field. * If the XMLName field has an associated tag of the form "name" or "namespace-URL name", the XML element must have the given name (and, optionally, name space) or else Unmarshal returns an error. * If the XML element has an attribute whose name matches a struct field name with an associated tag containing ",attr" or the explicit name in a struct field tag of the form "name,attr", Unmarshal records the attribute value in that field. * If the XML element has an attribute not handled by the previous rule and the struct has a field with an associated tag containing ",any,attr", Unmarshal records the attribute value in the first such field. * If the XML element contains character data, that data is accumulated in the first struct field that has tag ",chardata". The struct field may have type []byte or string. If there is no such field, the character data is discarded. * If the XML element contains comments, they are accumulated in the first struct field that has tag ",comment". The struct field may have type []byte or string. If there is no such field, the comments are discarded. * If the XML element contains a sub-element whose name matches the prefix of a tag formatted as "a" or "a>b>c", unmarshal will descend into the XML structure looking for elements with the given names, and will map the innermost elements to that struct field. A tag starting with ">" is equivalent to one starting with the field name followed by ">". * If the XML element contains a sub-element whose name matches a struct field's XMLName tag and the struct field has no explicit name tag as per the previous rule, unmarshal maps the sub-element to that struct field. 
* If the XML element contains a sub-element whose name matches a field without any mode flags (",attr", ",chardata", etc), Unmarshal maps the sub-element to that struct field. * If the XML element contains a sub-element that hasn't matched any of the above rules and the struct has a field with tag ",any", unmarshal maps the sub-element to that struct field. * An anonymous struct field is handled as if the fields of its value were part of the outer struct. * A struct field with tag "-" is never unmarshaled into. If Unmarshal encounters a field type that implements the Unmarshaler interface, Unmarshal calls its UnmarshalXML method to produce the value from the XML element. Otherwise, if the value implements encoding.TextUnmarshaler, Unmarshal calls that value's UnmarshalText method. Unmarshal maps an XML element to a string or []byte by saving the concatenation of that element's character data in the string or []byte. The saved []byte is never nil. Unmarshal maps an attribute value to a string or []byte by saving the value in the string or slice. Unmarshal maps an attribute value to an Attr by saving the attribute, including its name, in the Attr. Unmarshal maps an XML element or attribute value to a slice by extending the length of the slice and mapping the element or attribute to the newly created value. Unmarshal maps an XML element or attribute value to a bool by setting it to the boolean value represented by the string. Whitespace is trimmed and ignored. Unmarshal maps an XML element or attribute value to an integer or floating-point field by setting the field to the result of interpreting the string value in decimal. There is no check for overflow. Whitespace is trimmed and ignored. Unmarshal maps an XML element to a Name by recording the element name. Unmarshal maps an XML element to a pointer by setting the pointer to a freshly allocated value and then mapping the element to that value. A missing element or empty attribute value will be unmarshaled as a zero value. If the field is a slice, a zero value will be appended to the field. Otherwise, the field will be set to its zero value. #### Example This example demonstrates unmarshaling an XML excerpt into a value with some preset fields. Note that the Phone field isn't modified and that the XML <Company> element is ignored. Also, the Groups field is assigned considering the element path provided in its tag. Code: ``` type Email struct { Where string `xml:"where,attr"` Addr string } type Address struct { City, State string } type Result struct { XMLName xml.Name `xml:"Person"` Name string `xml:"FullName"` Phone string Email []Email Groups []string `xml:"Group>Value"` Address } v := Result{Name: "none", Phone: "none"} data := ` <Person> <FullName>Grace R. Emlin</FullName> <Company>Example Inc.</Company> <Email where="home"> <Addr>[email protected]</Addr> </Email> <Email where='work'> <Addr>[email protected]</Addr> </Email> <Group> <Value>Friends</Value> <Value>Squash</Value> </Group> <City>Hanga Roa</City> <State>Easter Island</State> </Person> ` err := xml.Unmarshal([]byte(data), &v) if err != nil { fmt.Printf("error: %v", err) return } fmt.Printf("XMLName: %#v\n", v.XMLName) fmt.Printf("Name: %q\n", v.Name) fmt.Printf("Phone: %q\n", v.Phone) fmt.Printf("Email: %v\n", v.Email) fmt.Printf("Groups: %v\n", v.Groups) fmt.Printf("Address: %v\n", v.Address) ``` Output: ``` XMLName: xml.Name{Space:"", Local:"Person"} Name: "Grace R. 
Emlin" Phone: "none" Email: [{home [email protected]} {work [email protected]}] Groups: [Friends Squash] Address: {Hanga Roa Easter Island} ``` type Attr --------- An Attr represents an attribute in an XML element (Name=Value). ``` type Attr struct { Name Name Value string } ``` type CharData ------------- A CharData represents XML character data (raw text), in which XML escape sequences have been replaced by the characters they represent. ``` type CharData []byte ``` ### func (CharData) Copy ``` func (c CharData) Copy() CharData ``` Copy creates a new copy of CharData. type Comment ------------ A Comment represents an XML comment of the form <!--comment-->. The bytes do not include the <!-- and --> comment markers. ``` type Comment []byte ``` ### func (Comment) Copy ``` func (c Comment) Copy() Comment ``` Copy creates a new copy of Comment. type Decoder ------------ A Decoder represents an XML parser reading a particular input stream. The parser assumes that its input is encoded in UTF-8. ``` type Decoder struct { // Strict defaults to true, enforcing the requirements // of the XML specification. // If set to false, the parser allows input containing common // mistakes: // * If an element is missing an end tag, the parser invents // end tags as necessary to keep the return values from Token // properly balanced. // * In attribute values and character data, unknown or malformed // character entities (sequences beginning with &) are left alone. // // Setting: // // d.Strict = false // d.AutoClose = xml.HTMLAutoClose // d.Entity = xml.HTMLEntity // // creates a parser that can handle typical HTML. // // Strict mode does not enforce the requirements of the XML name spaces TR. // In particular it does not reject name space tags using undefined prefixes. // Such tags are recorded with the unknown prefix as the name space URL. Strict bool // When Strict == false, AutoClose indicates a set of elements to // consider closed immediately after they are opened, regardless // of whether an end element is present. AutoClose []string // Entity can be used to map non-standard entity names to string replacements. // The parser behaves as if these standard mappings are present in the map, // regardless of the actual map content: // // "lt": "<", // "gt": ">", // "amp": "&", // "apos": "'", // "quot": `"`, Entity map[string]string // CharsetReader, if non-nil, defines a function to generate // charset-conversion readers, converting from the provided // non-UTF-8 charset into UTF-8. If CharsetReader is nil or // returns an error, parsing stops with an error. One of the // CharsetReader's result values must be non-nil. CharsetReader func(charset string, input io.Reader) (io.Reader, error) // DefaultSpace sets the default name space used for unadorned tags, // as if the entire XML stream were wrapped in an element containing // the attribute xmlns="DefaultSpace". DefaultSpace string // Go 1.1 // contains filtered or unexported fields } ``` ### func NewDecoder ``` func NewDecoder(r io.Reader) *Decoder ``` NewDecoder creates a new XML parser reading from r. If r does not implement io.ByteReader, NewDecoder will do its own buffering. ### func NewTokenDecoder 1.10 ``` func NewTokenDecoder(t TokenReader) *Decoder ``` NewTokenDecoder creates a new XML parser using an underlying token stream. ### func (\*Decoder) Decode ``` func (d *Decoder) Decode(v any) error ``` Decode works like Unmarshal, except it reads the decoder stream to find the start element. 
### func (\*Decoder) DecodeElement ``` func (d *Decoder) DecodeElement(v any, start *StartElement) error ``` DecodeElement works like Unmarshal except that it takes a pointer to the start XML element to decode into v. It is useful when a client reads some raw XML tokens itself but also wants to defer to Unmarshal for some elements. ### func (\*Decoder) InputOffset 1.4 ``` func (d *Decoder) InputOffset() int64 ``` InputOffset returns the input stream byte offset of the current decoder position. The offset gives the location of the end of the most recently returned token and the beginning of the next token. ### func (\*Decoder) InputPos 1.19 ``` func (d *Decoder) InputPos() (line, column int) ``` InputPos returns the line of the current decoder position and the 1 based input position of the line. The position gives the location of the end of the most recently returned token. ### func (\*Decoder) RawToken ``` func (d *Decoder) RawToken() (Token, error) ``` RawToken is like Token but does not verify that start and end elements match and does not translate name space prefixes to their corresponding URLs. ### func (\*Decoder) Skip ``` func (d *Decoder) Skip() error ``` Skip reads tokens until it has consumed the end element matching the most recent start element already consumed, skipping nested structures. It returns nil if it finds an end element matching the start element; otherwise it returns an error describing the problem. ### func (\*Decoder) Token ``` func (d *Decoder) Token() (Token, error) ``` Token returns the next XML token in the input stream. At the end of the input stream, Token returns nil, io.EOF. Slices of bytes in the returned token data refer to the parser's internal buffer and remain valid only until the next call to Token. To acquire a copy of the bytes, call CopyToken or the token's Copy method. Token expands self-closing elements such as <br> into separate start and end elements returned by successive calls. Token guarantees that the StartElement and EndElement tokens it returns are properly nested and matched: if Token encounters an unexpected end element or EOF before all expected end elements, it will return an error. Token implements XML name spaces as described by <https://www.w3.org/TR/REC-xml-names/>. Each of the Name structures contained in the Token has the Space set to the URL identifying its name space when known. If Token encounters an unrecognized name space prefix, it uses the prefix as the Space rather than report an error. type Directive -------------- A Directive represents an XML directive of the form <!text>. The bytes do not include the <! and > markers. ``` type Directive []byte ``` ### func (Directive) Copy ``` func (d Directive) Copy() Directive ``` Copy creates a new copy of Directive. type Encoder ------------ An Encoder writes XML data to an output stream. ``` type Encoder struct { // contains filtered or unexported fields } ``` #### Example Code: ``` type Address struct { City, State string } type Person struct { XMLName xml.Name `xml:"person"` Id int `xml:"id,attr"` FirstName string `xml:"name>first"` LastName string `xml:"name>last"` Age int `xml:"age"` Height float32 `xml:"height,omitempty"` Married bool Address Comment string `xml:",comment"` } v := &Person{Id: 13, FirstName: "John", LastName: "Doe", Age: 42} v.Comment = " Need more details. 
" v.Address = Address{"Hanga Roa", "Easter Island"} enc := xml.NewEncoder(os.Stdout) enc.Indent(" ", " ") if err := enc.Encode(v); err != nil { fmt.Printf("error: %v\n", err) } ``` Output: ``` <person id="13"> <name> <first>John</first> <last>Doe</last> </name> <age>42</age> <Married>false</Married> <City>Hanga Roa</City> <State>Easter Island</State> <!-- Need more details. --> </person> ``` ### func NewEncoder ``` func NewEncoder(w io.Writer) *Encoder ``` NewEncoder returns a new encoder that writes to w. ### func (\*Encoder) Close 1.20 ``` func (enc *Encoder) Close() error ``` Close the Encoder, indicating that no more data will be written. It flushes any buffered XML to the underlying writer and returns an error if the written XML is invalid (e.g. by containing unclosed elements). ### func (\*Encoder) Encode ``` func (enc *Encoder) Encode(v any) error ``` Encode writes the XML encoding of v to the stream. See the documentation for Marshal for details about the conversion of Go values to XML. Encode calls Flush before returning. ### func (\*Encoder) EncodeElement 1.2 ``` func (enc *Encoder) EncodeElement(v any, start StartElement) error ``` EncodeElement writes the XML encoding of v to the stream, using start as the outermost tag in the encoding. See the documentation for Marshal for details about the conversion of Go values to XML. EncodeElement calls Flush before returning. ### func (\*Encoder) EncodeToken 1.2 ``` func (enc *Encoder) EncodeToken(t Token) error ``` EncodeToken writes the given XML token to the stream. It returns an error if StartElement and EndElement tokens are not properly matched. EncodeToken does not call Flush, because usually it is part of a larger operation such as Encode or EncodeElement (or a custom Marshaler's MarshalXML invoked during those), and those will call Flush when finished. Callers that create an Encoder and then invoke EncodeToken directly, without using Encode or EncodeElement, need to call Flush when finished to ensure that the XML is written to the underlying writer. EncodeToken allows writing a ProcInst with Target set to "xml" only as the first token in the stream. ### func (\*Encoder) Flush 1.2 ``` func (enc *Encoder) Flush() error ``` Flush flushes any buffered XML to the underlying writer. See the EncodeToken documentation for details about when it is necessary. ### func (\*Encoder) Indent 1.1 ``` func (enc *Encoder) Indent(prefix, indent string) ``` Indent sets the encoder to generate XML in which each element begins on a new indented line that starts with prefix and is followed by one or more copies of indent according to the nesting depth. type EndElement --------------- An EndElement represents an XML end element. ``` type EndElement struct { Name Name } ``` type Marshaler 1.2 ------------------ Marshaler is the interface implemented by objects that can marshal themselves into valid XML elements. MarshalXML encodes the receiver as zero or more XML elements. By convention, arrays or slices are typically encoded as a sequence of elements, one per entry. Using start as the element tag is not required, but doing so will enable Unmarshal to match the XML elements to the correct struct field. One common implementation strategy is to construct a separate value with a layout corresponding to the desired XML and then to encode it using e.EncodeElement. Another common strategy is to use repeated calls to e.EncodeToken to generate the XML output one token at a time. The sequence of encoded tokens must make up zero or more valid XML elements. 
``` type Marshaler interface { MarshalXML(e *Encoder, start StartElement) error } ``` type MarshalerAttr 1.2 ---------------------- MarshalerAttr is the interface implemented by objects that can marshal themselves into valid XML attributes. MarshalXMLAttr returns an XML attribute with the encoded value of the receiver. Using name as the attribute name is not required, but doing so will enable Unmarshal to match the attribute to the correct struct field. If MarshalXMLAttr returns the zero attribute Attr{}, no attribute will be generated in the output. MarshalXMLAttr is used only for struct fields with the "attr" option in the field tag. ``` type MarshalerAttr interface { MarshalXMLAttr(name Name) (Attr, error) } ``` type Name --------- A Name represents an XML name (Local) annotated with a name space identifier (Space). In tokens returned by Decoder.Token, the Space identifier is given as a canonical URL, not the short prefix used in the document being parsed. ``` type Name struct { Space, Local string } ``` type ProcInst ------------- A ProcInst represents an XML processing instruction of the form <?target inst?> ``` type ProcInst struct { Target string Inst []byte } ``` ### func (ProcInst) Copy ``` func (p ProcInst) Copy() ProcInst ``` Copy creates a new copy of ProcInst. type StartElement ----------------- A StartElement represents an XML start element. ``` type StartElement struct { Name Name Attr []Attr } ``` ### func (StartElement) Copy ``` func (e StartElement) Copy() StartElement ``` Copy creates a new copy of StartElement. ### func (StartElement) End 1.2 ``` func (e StartElement) End() EndElement ``` End returns the corresponding XML end element. type SyntaxError ---------------- A SyntaxError represents a syntax error in the XML input stream. ``` type SyntaxError struct { Msg string Line int } ``` ### func (\*SyntaxError) Error ``` func (e *SyntaxError) Error() string ``` type TagPathError ----------------- A TagPathError represents an error in the unmarshaling process caused by the use of field tags with conflicting paths. ``` type TagPathError struct { Struct reflect.Type Field1, Tag1 string Field2, Tag2 string } ``` ### func (\*TagPathError) Error ``` func (e *TagPathError) Error() string ``` type Token ---------- A Token is an interface holding one of the token types: StartElement, EndElement, CharData, Comment, ProcInst, or Directive. ``` type Token any ``` ### func CopyToken ``` func CopyToken(t Token) Token ``` CopyToken returns a copy of a Token. type TokenReader 1.10 --------------------- A TokenReader is anything that can decode a stream of XML tokens, including a Decoder. When Token encounters an error or end-of-file condition after successfully reading a token, it returns the token. It may return the (non-nil) error from the same call or return the error (and a nil token) from a subsequent call. An instance of this general case is that a TokenReader returning a non-nil token at the end of the token stream may return either io.EOF or a nil error. The next Read should return nil, io.EOF. Implementations of Token are discouraged from returning a nil token with a nil error. Callers should treat a return of nil, nil as indicating that nothing happened; in particular it does not indicate EOF. ``` type TokenReader interface { Token() (Token, error) } ``` type UnmarshalError ------------------- An UnmarshalError represents an error in the unmarshaling process. 
``` type UnmarshalError string ``` ### func (UnmarshalError) Error ``` func (e UnmarshalError) Error() string ``` type Unmarshaler 1.2 -------------------- Unmarshaler is the interface implemented by objects that can unmarshal an XML element description of themselves. UnmarshalXML decodes a single XML element beginning with the given start element. If it returns an error, the outer call to Unmarshal stops and returns that error. UnmarshalXML must consume exactly one XML element. One common implementation strategy is to unmarshal into a separate value with a layout matching the expected XML using d.DecodeElement, and then to copy the data from that value into the receiver. Another common strategy is to use d.Token to process the XML object one token at a time. UnmarshalXML may not use d.RawToken. ``` type Unmarshaler interface { UnmarshalXML(d *Decoder, start StartElement) error } ``` type UnmarshalerAttr 1.2 ------------------------ UnmarshalerAttr is the interface implemented by objects that can unmarshal an XML attribute description of themselves. UnmarshalXMLAttr decodes a single XML attribute. If it returns an error, the outer call to Unmarshal stops and returns that error. UnmarshalXMLAttr is used only for struct fields with the "attr" option in the field tag. ``` type UnmarshalerAttr interface { UnmarshalXMLAttr(attr Attr) error } ``` type UnsupportedTypeError ------------------------- UnsupportedTypeError is returned when Marshal encounters a type that cannot be converted into XML. ``` type UnsupportedTypeError struct { Type reflect.Type } ``` ### func (\*UnsupportedTypeError) Error ``` func (e *UnsupportedTypeError) Error() string ``` Bugs ---- * ☞ Mapping between XML elements and data structures is inherently flawed: an XML element is an order-dependent collection of anonymous values, while a data structure is an order-independent collection of named values. See package json for a textual representation more suitable to data structures.
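As a concrete illustration of the Marshaler and Unmarshaler interfaces documented above, here is a small, self-contained sketch; the Status and Server types are hypothetical, chosen only to show the EncodeElement/DecodeElement call pattern:

```
package main

import (
	"encoding/xml"
	"fmt"
	"log"
	"os"
)

// Status is a hypothetical enum-like type used only to illustrate
// the Marshaler and Unmarshaler interfaces.
type Status int

const (
	Inactive Status = iota
	Active
)

// MarshalXML encodes the receiver as exactly one element, reusing the
// suggested start tag so struct field names are preserved.
func (s Status) MarshalXML(e *xml.Encoder, start xml.StartElement) error {
	text := "inactive"
	if s == Active {
		text = "active"
	}
	return e.EncodeElement(text, start)
}

// UnmarshalXML consumes exactly one element, as the interface requires.
func (s *Status) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error {
	var text string
	if err := d.DecodeElement(&text, &start); err != nil {
		return err
	}
	if text == "active" {
		*s = Active
	} else {
		*s = Inactive
	}
	return nil
}

type Server struct {
	Name   string `xml:"name"`
	Status Status `xml:"status"`
}

func main() {
	in := `<Server><name>db01</name><status>active</status></Server>`
	var srv Server
	if err := xml.Unmarshal([]byte(in), &srv); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%+v\n", srv) // {Name:db01 Status:1}

	out, err := xml.Marshal(srv)
	if err != nil {
		log.Fatal(err)
	}
	os.Stdout.Write(out) // <Server><name>db01</name><status>active</status></Server>
}
```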
go Package json Package json ============= * `import "encoding/json"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package json implements encoding and decoding of JSON as defined in RFC 7159. The mapping between JSON and Go values is described in the documentation for the Marshal and Unmarshal functions. See "JSON and Go" for an introduction to this package: <https://golang.org/doc/articles/json_and_go.html> #### Example (CustomMarshalJSON) Code: ``` package json_test import ( "encoding/json" "fmt" "log" "strings" ) type Animal int const ( Unknown Animal = iota Gopher Zebra ) func (a *Animal) UnmarshalJSON(b []byte) error { var s string if err := json.Unmarshal(b, &s); err != nil { return err } switch strings.ToLower(s) { default: *a = Unknown case "gopher": *a = Gopher case "zebra": *a = Zebra } return nil } func (a Animal) MarshalJSON() ([]byte, error) { var s string switch a { default: s = "unknown" case Gopher: s = "gopher" case Zebra: s = "zebra" } return json.Marshal(s) } func Example_customMarshalJSON() { blob := `["gopher","armadillo","zebra","unknown","gopher","bee","gopher","zebra"]` var zoo []Animal if err := json.Unmarshal([]byte(blob), &zoo); err != nil { log.Fatal(err) } census := make(map[Animal]int) for _, animal := range zoo { census[animal] += 1 } fmt.Printf("Zoo Census:\n* Gophers: %d\n* Zebras: %d\n* Unknown: %d\n", census[Gopher], census[Zebra], census[Unknown]) // Output: // Zoo Census: // * Gophers: 3 // * Zebras: 2 // * Unknown: 3 } ``` #### Example (TextMarshalJSON) Code: ``` package json_test import ( "encoding/json" "fmt" "log" "strings" ) type Size int const ( Unrecognized Size = iota Small Large ) func (s *Size) UnmarshalText(text []byte) error { switch strings.ToLower(string(text)) { default: *s = Unrecognized case "small": *s = Small case "large": *s = Large } return nil } func (s Size) MarshalText() ([]byte, error) { var name string switch s { default: name = "unrecognized" case Small: name = "small" case Large: name = "large" } return []byte(name), nil } func Example_textMarshalJSON() { blob := `["small","regular","large","unrecognized","small","normal","small","large"]` var inventory []Size if err := json.Unmarshal([]byte(blob), &inventory); err != nil { log.Fatal(err) } counts := make(map[Size]int) for _, size := range inventory { counts[size] += 1 } fmt.Printf("Inventory Counts:\n* Small: %d\n* Large: %d\n* Unrecognized: %d\n", counts[Small], counts[Large], counts[Unrecognized]) // Output: // Inventory Counts: // * Small: 3 // * Large: 2 // * Unrecognized: 3 } ``` Index ----- * [func Compact(dst \*bytes.Buffer, src []byte) error](#Compact) * [func HTMLEscape(dst \*bytes.Buffer, src []byte)](#HTMLEscape) * [func Indent(dst \*bytes.Buffer, src []byte, prefix, indent string) error](#Indent) * [func Marshal(v any) ([]byte, error)](#Marshal) * [func MarshalIndent(v any, prefix, indent string) ([]byte, error)](#MarshalIndent) * [func Unmarshal(data []byte, v any) error](#Unmarshal) * [func Valid(data []byte) bool](#Valid) * [type Decoder](#Decoder) * [func NewDecoder(r io.Reader) \*Decoder](#NewDecoder) * [func (dec \*Decoder) Buffered() io.Reader](#Decoder.Buffered) * [func (dec \*Decoder) Decode(v any) error](#Decoder.Decode) * [func (dec \*Decoder) DisallowUnknownFields()](#Decoder.DisallowUnknownFields) * [func (dec \*Decoder) InputOffset() int64](#Decoder.InputOffset) * [func (dec \*Decoder) More() bool](#Decoder.More) * [func (dec \*Decoder) Token() (Token, error)](#Decoder.Token) * [func (dec 
\*Decoder) UseNumber()](#Decoder.UseNumber) * [type Delim](#Delim) * [func (d Delim) String() string](#Delim.String) * [type Encoder](#Encoder) * [func NewEncoder(w io.Writer) \*Encoder](#NewEncoder) * [func (enc \*Encoder) Encode(v any) error](#Encoder.Encode) * [func (enc \*Encoder) SetEscapeHTML(on bool)](#Encoder.SetEscapeHTML) * [func (enc \*Encoder) SetIndent(prefix, indent string)](#Encoder.SetIndent) * [type InvalidUTF8Error](#InvalidUTF8Error) * [func (e \*InvalidUTF8Error) Error() string](#InvalidUTF8Error.Error) * [type InvalidUnmarshalError](#InvalidUnmarshalError) * [func (e \*InvalidUnmarshalError) Error() string](#InvalidUnmarshalError.Error) * [type Marshaler](#Marshaler) * [type MarshalerError](#MarshalerError) * [func (e \*MarshalerError) Error() string](#MarshalerError.Error) * [func (e \*MarshalerError) Unwrap() error](#MarshalerError.Unwrap) * [type Number](#Number) * [func (n Number) Float64() (float64, error)](#Number.Float64) * [func (n Number) Int64() (int64, error)](#Number.Int64) * [func (n Number) String() string](#Number.String) * [type RawMessage](#RawMessage) * [func (m RawMessage) MarshalJSON() ([]byte, error)](#RawMessage.MarshalJSON) * [func (m \*RawMessage) UnmarshalJSON(data []byte) error](#RawMessage.UnmarshalJSON) * [type SyntaxError](#SyntaxError) * [func (e \*SyntaxError) Error() string](#SyntaxError.Error) * [type Token](#Token) * [type UnmarshalFieldError](#UnmarshalFieldError) * [func (e \*UnmarshalFieldError) Error() string](#UnmarshalFieldError.Error) * [type UnmarshalTypeError](#UnmarshalTypeError) * [func (e \*UnmarshalTypeError) Error() string](#UnmarshalTypeError.Error) * [type Unmarshaler](#Unmarshaler) * [type UnsupportedTypeError](#UnsupportedTypeError) * [func (e \*UnsupportedTypeError) Error() string](#UnsupportedTypeError.Error) * [type UnsupportedValueError](#UnsupportedValueError) * [func (e \*UnsupportedValueError) Error() string](#UnsupportedValueError.Error) ### Examples [Decoder](#example_Decoder) [Decoder.Decode (Stream)](#example_Decoder_Decode_stream) [Decoder.Token](#example_Decoder_Token) [HTMLEscape](#example_HTMLEscape) [Indent](#example_Indent) [Marshal](#example_Marshal) [MarshalIndent](#example_MarshalIndent) [RawMessage (Marshal)](#example_RawMessage_marshal) [RawMessage (Unmarshal)](#example_RawMessage_unmarshal) [Unmarshal](#example_Unmarshal) [Valid](#example_Valid) [Package (CustomMarshalJSON)](#example__customMarshalJSON) [Package (TextMarshalJSON)](#example__textMarshalJSON) ### Package files decode.go encode.go fold.go indent.go scanner.go stream.go tables.go tags.go func Compact ------------ ``` func Compact(dst *bytes.Buffer, src []byte) error ``` Compact appends to dst the JSON-encoded src with insignificant space characters elided. func HTMLEscape --------------- ``` func HTMLEscape(dst *bytes.Buffer, src []byte) ``` HTMLEscape appends to dst the JSON-encoded src with <, >, &, U+2028 and U+2029 characters inside string literals changed to \u003c, \u003e, \u0026, \u2028, \u2029 so that the JSON will be safe to embed inside HTML <script> tags. For historical reasons, web browsers don't honor standard HTML escaping within <script> tags, so an alternative JSON encoding must be used. 
#### Example Code: ``` var out bytes.Buffer json.HTMLEscape(&out, []byte(`{"Name":"<b>HTML content</b>"}`)) out.WriteTo(os.Stdout) ``` Output: ``` {"Name":"\u003cb\u003eHTML content\u003c/b\u003e"} ``` func Indent ----------- ``` func Indent(dst *bytes.Buffer, src []byte, prefix, indent string) error ``` Indent appends to dst an indented form of the JSON-encoded src. Each element in a JSON object or array begins on a new, indented line beginning with prefix followed by one or more copies of indent according to the indentation nesting. The data appended to dst does not begin with the prefix nor any indentation, to make it easier to embed inside other formatted JSON data. Although leading space characters (space, tab, carriage return, newline) at the beginning of src are dropped, trailing space characters at the end of src are preserved and copied to dst. For example, if src has no trailing spaces, neither will dst; if src ends in a trailing newline, so will dst. #### Example Code: ``` type Road struct { Name string Number int } roads := []Road{ {"Diamond Fork", 29}, {"Sheep Creek", 51}, } b, err := json.Marshal(roads) if err != nil { log.Fatal(err) } var out bytes.Buffer json.Indent(&out, b, "=", "\t") out.WriteTo(os.Stdout) ``` Output: ``` [ = { = "Name": "Diamond Fork", = "Number": 29 = }, = { = "Name": "Sheep Creek", = "Number": 51 = } =] ``` func Marshal ------------ ``` func Marshal(v any) ([]byte, error) ``` Marshal returns the JSON encoding of v. Marshal traverses the value v recursively. If an encountered value implements the Marshaler interface and is not a nil pointer, Marshal calls its MarshalJSON method to produce JSON. If no MarshalJSON method is present but the value implements encoding.TextMarshaler instead, Marshal calls its MarshalText method and encodes the result as a JSON string. The nil pointer exception is not strictly necessary but mimics a similar, necessary exception in the behavior of UnmarshalJSON. Otherwise, Marshal uses the following type-dependent default encodings: Boolean values encode as JSON booleans. Floating point, integer, and Number values encode as JSON numbers. String values encode as JSON strings coerced to valid UTF-8, replacing invalid bytes with the Unicode replacement rune. So that the JSON will be safe to embed inside HTML <script> tags, the string is encoded using HTMLEscape, which replaces "<", ">", "&", U+2028, and U+2029 with "\u003c", "\u003e", "\u0026", "\u2028", and "\u2029". This replacement can be disabled when using an Encoder, by calling SetEscapeHTML(false). Array and slice values encode as JSON arrays, except that []byte encodes as a base64-encoded string, and a nil slice encodes as the null JSON value. Struct values encode as JSON objects. Each exported struct field becomes a member of the object, using the field name as the object key, unless the field is omitted for one of the reasons given below. The encoding of each struct field can be customized by the format string stored under the "json" key in the struct field's tag. The format string gives the name of the field, possibly followed by a comma-separated list of options. The name may be empty in order to specify options without overriding the default field name. The "omitempty" option specifies that the field should be omitted from the encoding if the field has an empty value, defined as false, 0, a nil pointer, a nil interface value, and any empty array, slice, map, or string. As a special case, if the field tag is "-", the field is always omitted.
Note that a field with name "-" can still be generated using the tag "-,". Examples of struct field tags and their meanings: ``` // Field appears in JSON as key "myName". Field int `json:"myName"` // Field appears in JSON as key "myName" and // the field is omitted from the object if its value is empty, // as defined above. Field int `json:"myName,omitempty"` // Field appears in JSON as key "Field" (the default), but // the field is skipped if empty. // Note the leading comma. Field int `json:",omitempty"` // Field is ignored by this package. Field int `json:"-"` // Field appears in JSON as key "-". Field int `json:"-,"` ``` The "string" option signals that a field is stored as JSON inside a JSON-encoded string. It applies only to fields of string, floating point, integer, or boolean types. This extra level of encoding is sometimes used when communicating with JavaScript programs: ``` Int64String int64 `json:",string"` ``` The key name will be used if it's a non-empty string consisting of only Unicode letters, digits, and ASCII punctuation except quotation marks, backslash, and comma. Anonymous struct fields are usually marshaled as if their inner exported fields were fields in the outer struct, subject to the usual Go visibility rules amended as described in the next paragraph. An anonymous struct field with a name given in its JSON tag is treated as having that name, rather than being anonymous. An anonymous struct field of interface type is treated the same as having that type as its name, rather than being anonymous. The Go visibility rules for struct fields are amended for JSON when deciding which field to marshal or unmarshal. If there are multiple fields at the same level, and that level is the least nested (and would therefore be the nesting level selected by the usual Go rules), the following extra rules apply: 1) Of those fields, if any are JSON-tagged, only tagged fields are considered, even if there are multiple untagged fields that would otherwise conflict. 2) If there is exactly one field (tagged or not according to the first rule), that is selected. 3) Otherwise there are multiple fields, and all are ignored; no error occurs. Handling of anonymous struct fields is new in Go 1.1. Prior to Go 1.1, anonymous struct fields were ignored. To force ignoring of an anonymous struct field in both current and earlier versions, give the field a JSON tag of "-". Map values encode as JSON objects. The map's key type must either be a string, an integer type, or implement encoding.TextMarshaler. The map keys are sorted and used as JSON object keys by applying the following rules, subject to the UTF-8 coercion described for string values above: * keys of any string type are used directly * encoding.TextMarshalers are marshaled * integer keys are converted to strings Pointer values encode as the value pointed to. A nil pointer encodes as the null JSON value. Interface values encode as the value contained in the interface. A nil interface value encodes as the null JSON value. Channel, complex, and function values cannot be encoded in JSON. Attempting to encode such a value causes Marshal to return an UnsupportedTypeError. JSON cannot represent cyclic data structures and Marshal does not handle them. Passing cyclic structures to Marshal will result in an error. 
#### Example Code: ``` type ColorGroup struct { ID int Name string Colors []string } group := ColorGroup{ ID: 1, Name: "Reds", Colors: []string{"Crimson", "Red", "Ruby", "Maroon"}, } b, err := json.Marshal(group) if err != nil { fmt.Println("error:", err) } os.Stdout.Write(b) ``` Output: ``` {"ID":1,"Name":"Reds","Colors":["Crimson","Red","Ruby","Maroon"]} ``` func MarshalIndent ------------------ ``` func MarshalIndent(v any, prefix, indent string) ([]byte, error) ``` MarshalIndent is like Marshal but applies Indent to format the output. Each JSON element in the output will begin on a new line beginning with prefix followed by one or more copies of indent according to the indentation nesting. #### Example Code: ``` data := map[string]int{ "a": 1, "b": 2, } b, err := json.MarshalIndent(data, "<prefix>", "<indent>") if err != nil { log.Fatal(err) } fmt.Println(string(b)) ``` Output: ``` { <prefix><indent>"a": 1, <prefix><indent>"b": 2 <prefix>} ``` func Unmarshal -------------- ``` func Unmarshal(data []byte, v any) error ``` Unmarshal parses the JSON-encoded data and stores the result in the value pointed to by v. If v is nil or not a pointer, Unmarshal returns an InvalidUnmarshalError. Unmarshal uses the inverse of the encodings that Marshal uses, allocating maps, slices, and pointers as necessary, with the following additional rules: To unmarshal JSON into a pointer, Unmarshal first handles the case of the JSON being the JSON literal null. In that case, Unmarshal sets the pointer to nil. Otherwise, Unmarshal unmarshals the JSON into the value pointed at by the pointer. If the pointer is nil, Unmarshal allocates a new value for it to point to. To unmarshal JSON into a value implementing the Unmarshaler interface, Unmarshal calls that value's UnmarshalJSON method, including when the input is a JSON null. Otherwise, if the value implements encoding.TextUnmarshaler and the input is a JSON quoted string, Unmarshal calls that value's UnmarshalText method with the unquoted form of the string. To unmarshal JSON into a struct, Unmarshal matches incoming object keys to the keys used by Marshal (either the struct field name or its tag), preferring an exact match but also accepting a case-insensitive match. By default, object keys which don't have a corresponding struct field are ignored (see Decoder.DisallowUnknownFields for an alternative). To unmarshal JSON into an interface value, Unmarshal stores one of these in the interface value: ``` bool, for JSON booleans float64, for JSON numbers string, for JSON strings []interface{}, for JSON arrays map[string]interface{}, for JSON objects nil for JSON null ``` To unmarshal a JSON array into a slice, Unmarshal resets the slice length to zero and then appends each element to the slice. As a special case, to unmarshal an empty JSON array into a slice, Unmarshal replaces the slice with a new empty slice. To unmarshal a JSON array into a Go array, Unmarshal decodes JSON array elements into corresponding Go array elements. If the Go array is smaller than the JSON array, the additional JSON array elements are discarded. If the JSON array is smaller than the Go array, the additional Go array elements are set to zero values. To unmarshal a JSON object into a map, Unmarshal first establishes a map to use. If the map is nil, Unmarshal allocates a new map. Otherwise Unmarshal reuses the existing map, keeping existing entries. Unmarshal then stores key-value pairs from the JSON object into the map. 
The map's key type must either be any string type, an integer, implement json.Unmarshaler, or implement encoding.TextUnmarshaler. If the JSON-encoded data contain a syntax error, Unmarshal returns a SyntaxError. If a JSON value is not appropriate for a given target type, or if a JSON number overflows the target type, Unmarshal skips that field and completes the unmarshaling as best it can. If no more serious errors are encountered, Unmarshal returns an UnmarshalTypeError describing the earliest such error. In any case, it's not guaranteed that all the remaining fields following the problematic one will be unmarshaled into the target object. The JSON null value unmarshals into an interface, map, pointer, or slice by setting that Go value to nil. Because null is often used in JSON to mean “not present,” unmarshaling a JSON null into any other Go type has no effect on the value and produces no error. When unmarshaling quoted strings, invalid UTF-8 or invalid UTF-16 surrogate pairs are not treated as an error. Instead, they are replaced by the Unicode replacement character U+FFFD. #### Example Code: ``` var jsonBlob = []byte(`[ {"Name": "Platypus", "Order": "Monotremata"}, {"Name": "Quoll", "Order": "Dasyuromorphia"} ]`) type Animal struct { Name string Order string } var animals []Animal err := json.Unmarshal(jsonBlob, &animals) if err != nil { fmt.Println("error:", err) } fmt.Printf("%+v", animals) ``` Output: ``` [{Name:Platypus Order:Monotremata} {Name:Quoll Order:Dasyuromorphia}] ``` func Valid 1.9 -------------- ``` func Valid(data []byte) bool ``` Valid reports whether data is a valid JSON encoding. #### Example Code: ``` goodJSON := `{"example": 1}` badJSON := `{"example":2:]}}` fmt.Println(json.Valid([]byte(goodJSON)), json.Valid([]byte(badJSON))) ``` Output: ``` true false ``` type Decoder ------------ A Decoder reads and decodes JSON values from an input stream. ``` type Decoder struct { // contains filtered or unexported fields } ``` #### Example This example uses a Decoder to decode a stream of distinct JSON values. Code: ``` const jsonStream = ` {"Name": "Ed", "Text": "Knock knock."} {"Name": "Sam", "Text": "Who's there?"} {"Name": "Ed", "Text": "Go fmt."} {"Name": "Sam", "Text": "Go fmt who?"} {"Name": "Ed", "Text": "Go fmt yourself!"} ` type Message struct { Name, Text string } dec := json.NewDecoder(strings.NewReader(jsonStream)) for { var m Message if err := dec.Decode(&m); err == io.EOF { break } else if err != nil { log.Fatal(err) } fmt.Printf("%s: %s\n", m.Name, m.Text) } ``` Output: ``` Ed: Knock knock. Sam: Who's there? Ed: Go fmt. Sam: Go fmt who? Ed: Go fmt yourself! ``` ### func NewDecoder ``` func NewDecoder(r io.Reader) *Decoder ``` NewDecoder returns a new decoder that reads from r. The decoder introduces its own buffering and may read data from r beyond the JSON values requested. ### func (\*Decoder) Buffered 1.1 ``` func (dec *Decoder) Buffered() io.Reader ``` Buffered returns a reader of the data remaining in the Decoder's buffer. The reader is valid until the next call to Decode. ### func (\*Decoder) Decode ``` func (dec *Decoder) Decode(v any) error ``` Decode reads the next JSON-encoded value from its input and stores it in the value pointed to by v. See the documentation for Unmarshal for details about the conversion of JSON into a Go value. #### Example (Stream) This example uses a Decoder to decode a streaming array of JSON objects. 
Code: ``` const jsonStream = ` [ {"Name": "Ed", "Text": "Knock knock."}, {"Name": "Sam", "Text": "Who's there?"}, {"Name": "Ed", "Text": "Go fmt."}, {"Name": "Sam", "Text": "Go fmt who?"}, {"Name": "Ed", "Text": "Go fmt yourself!"} ] ` type Message struct { Name, Text string } dec := json.NewDecoder(strings.NewReader(jsonStream)) // read open bracket t, err := dec.Token() if err != nil { log.Fatal(err) } fmt.Printf("%T: %v\n", t, t) // while the array contains values for dec.More() { var m Message // decode an array value (Message) err := dec.Decode(&m) if err != nil { log.Fatal(err) } fmt.Printf("%v: %v\n", m.Name, m.Text) } // read closing bracket t, err = dec.Token() if err != nil { log.Fatal(err) } fmt.Printf("%T: %v\n", t, t) ``` Output: ``` json.Delim: [ Ed: Knock knock. Sam: Who's there? Ed: Go fmt. Sam: Go fmt who? Ed: Go fmt yourself! json.Delim: ] ``` ### func (\*Decoder) DisallowUnknownFields 1.10 ``` func (dec *Decoder) DisallowUnknownFields() ``` DisallowUnknownFields causes the Decoder to return an error when the destination is a struct and the input contains object keys which do not match any non-ignored, exported fields in the destination. ### func (\*Decoder) InputOffset 1.14 ``` func (dec *Decoder) InputOffset() int64 ``` InputOffset returns the input stream byte offset of the current decoder position. The offset gives the location of the end of the most recently returned token and the beginning of the next token. ### func (\*Decoder) More 1.5 ``` func (dec *Decoder) More() bool ``` More reports whether there is another element in the current array or object being parsed. ### func (\*Decoder) Token 1.5 ``` func (dec *Decoder) Token() (Token, error) ``` Token returns the next JSON token in the input stream. At the end of the input stream, Token returns nil, io.EOF. Token guarantees that the delimiters [ ] { } it returns are properly nested and matched: if Token encounters an unexpected delimiter in the input, it will return an error. The input stream consists of basic JSON values—bool, string, number, and null—along with delimiters [ ] { } of type Delim to mark the start and end of arrays and objects. Commas and colons are elided. #### Example This example uses a Decoder to decode a stream of distinct JSON values. Code: ``` const jsonStream = ` {"Message": "Hello", "Array": [1, 2, 3], "Null": null, "Number": 1.234} ` dec := json.NewDecoder(strings.NewReader(jsonStream)) for { t, err := dec.Token() if err == io.EOF { break } if err != nil { log.Fatal(err) } fmt.Printf("%T: %v", t, t) if dec.More() { fmt.Printf(" (more)") } fmt.Printf("\n") } ``` Output: ``` json.Delim: { (more) string: Message (more) string: Hello (more) string: Array (more) json.Delim: [ (more) float64: 1 (more) float64: 2 (more) float64: 3 json.Delim: ] (more) string: Null (more) <nil>: <nil> (more) string: Number (more) float64: 1.234 json.Delim: } ``` ### func (\*Decoder) UseNumber 1.1 ``` func (dec *Decoder) UseNumber() ``` UseNumber causes the Decoder to unmarshal a number into an interface{} as a Number instead of as a float64. type Delim 1.5 -------------- A Delim is a JSON array or object delimiter, one of [ ] { or }. ``` type Delim rune ``` ### func (Delim) String 1.5 ``` func (d Delim) String() string ``` type Encoder ------------ An Encoder writes JSON values to an output stream. ``` type Encoder struct { // contains filtered or unexported fields } ``` ### func NewEncoder ``` func NewEncoder(w io.Writer) *Encoder ``` NewEncoder returns a new encoder that writes to w. 
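A small sketch of the streaming pattern NewEncoder enables together with Encode (described below): each encoded value is written to the underlying writer followed by a newline, which suits line-delimited JSON streams. The Event type is hypothetical, used only for illustration:

```
package main

import (
	"encoding/json"
	"log"
	"os"
)

// Event is a hypothetical record type used only for illustration.
type Event struct {
	Name string `json:"name"`
	Seq  int    `json:"seq"`
}

func main() {
	// The Encoder writes each value directly to os.Stdout as it is encoded.
	enc := json.NewEncoder(os.Stdout)
	for i, name := range []string{"start", "tick", "stop"} {
		if err := enc.Encode(Event{Name: name, Seq: i}); err != nil {
			log.Fatal(err)
		}
	}
	// Output:
	// {"name":"start","seq":0}
	// {"name":"tick","seq":1}
	// {"name":"stop","seq":2}
}
```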
### func (\*Encoder) Encode ``` func (enc *Encoder) Encode(v any) error ``` Encode writes the JSON encoding of v to the stream, followed by a newline character. See the documentation for Marshal for details about the conversion of Go values to JSON. ### func (\*Encoder) SetEscapeHTML 1.7 ``` func (enc *Encoder) SetEscapeHTML(on bool) ``` SetEscapeHTML specifies whether problematic HTML characters should be escaped inside JSON quoted strings. The default behavior is to escape &, <, and > to \u0026, \u003c, and \u003e to avoid certain safety problems that can arise when embedding JSON in HTML. In non-HTML settings where the escaping interferes with the readability of the output, SetEscapeHTML(false) disables this behavior. ### func (\*Encoder) SetIndent 1.7 ``` func (enc *Encoder) SetIndent(prefix, indent string) ``` SetIndent instructs the encoder to format each subsequent encoded value as if indented by the package-level function Indent(dst, src, prefix, indent). Calling SetIndent("", "") disables indentation. type InvalidUTF8Error --------------------- Before Go 1.2, an InvalidUTF8Error was returned by Marshal when attempting to encode a string value with invalid UTF-8 sequences. As of Go 1.2, Marshal instead coerces the string to valid UTF-8 by replacing invalid bytes with the Unicode replacement rune U+FFFD. Deprecated: No longer used; kept for compatibility. ``` type InvalidUTF8Error struct { S string // the whole string value that caused the error } ``` ### func (\*InvalidUTF8Error) Error ``` func (e *InvalidUTF8Error) Error() string ``` type InvalidUnmarshalError -------------------------- An InvalidUnmarshalError describes an invalid argument passed to Unmarshal. (The argument to Unmarshal must be a non-nil pointer.) ``` type InvalidUnmarshalError struct { Type reflect.Type } ``` ### func (\*InvalidUnmarshalError) Error ``` func (e *InvalidUnmarshalError) Error() string ``` type Marshaler -------------- Marshaler is the interface implemented by types that can marshal themselves into valid JSON. ``` type Marshaler interface { MarshalJSON() ([]byte, error) } ``` type MarshalerError ------------------- A MarshalerError represents an error from calling a MarshalJSON or MarshalText method. ``` type MarshalerError struct { Type reflect.Type Err error // contains filtered or unexported fields } ``` ### func (\*MarshalerError) Error ``` func (e *MarshalerError) Error() string ``` ### func (\*MarshalerError) Unwrap 1.13 ``` func (e *MarshalerError) Unwrap() error ``` Unwrap returns the underlying error. type Number 1.1 --------------- A Number represents a JSON number literal. ``` type Number string ``` ### func (Number) Float64 1.1 ``` func (n Number) Float64() (float64, error) ``` Float64 returns the number as a float64. ### func (Number) Int64 1.1 ``` func (n Number) Int64() (int64, error) ``` Int64 returns the number as an int64. ### func (Number) String 1.1 ``` func (n Number) String() string ``` String returns the literal text of the number. type RawMessage --------------- RawMessage is a raw encoded JSON value. It implements Marshaler and Unmarshaler and can be used to delay JSON decoding or precompute a JSON encoding. ``` type RawMessage []byte ``` #### Example (Marshal) This example uses RawMessage to use a precomputed JSON during marshal. 
Code: ``` h := json.RawMessage(`{"precomputed": true}`) c := struct { Header *json.RawMessage `json:"header"` Body string `json:"body"` }{Header: &h, Body: "Hello Gophers!"} b, err := json.MarshalIndent(&c, "", "\t") if err != nil { fmt.Println("error:", err) } os.Stdout.Write(b) ``` Output: ``` { "header": { "precomputed": true }, "body": "Hello Gophers!" } ``` #### Example (Unmarshal) This example uses RawMessage to delay parsing part of a JSON message. Code: ``` type Color struct { Space string Point json.RawMessage // delay parsing until we know the color space } type RGB struct { R uint8 G uint8 B uint8 } type YCbCr struct { Y uint8 Cb int8 Cr int8 } var j = []byte(`[ {"Space": "YCbCr", "Point": {"Y": 255, "Cb": 0, "Cr": -10}}, {"Space": "RGB", "Point": {"R": 98, "G": 218, "B": 255}} ]`) var colors []Color err := json.Unmarshal(j, &colors) if err != nil { log.Fatalln("error:", err) } for _, c := range colors { var dst any switch c.Space { case "RGB": dst = new(RGB) case "YCbCr": dst = new(YCbCr) } err := json.Unmarshal(c.Point, dst) if err != nil { log.Fatalln("error:", err) } fmt.Println(c.Space, dst) } ``` Output: ``` YCbCr &{255 0 -10} RGB &{98 218 255} ``` ### func (RawMessage) MarshalJSON 1.8 ``` func (m RawMessage) MarshalJSON() ([]byte, error) ``` MarshalJSON returns m as the JSON encoding of m. ### func (\*RawMessage) UnmarshalJSON ``` func (m *RawMessage) UnmarshalJSON(data []byte) error ``` UnmarshalJSON sets \*m to a copy of data. type SyntaxError ---------------- A SyntaxError is a description of a JSON syntax error. Unmarshal will return a SyntaxError if the JSON can't be parsed. ``` type SyntaxError struct { Offset int64 // error occurred after reading Offset bytes // contains filtered or unexported fields } ``` ### func (\*SyntaxError) Error ``` func (e *SyntaxError) Error() string ``` type Token 1.5 -------------- A Token holds a value of one of these types: ``` Delim, for the four JSON delimiters [ ] { } bool, for JSON booleans float64, for JSON numbers Number, for JSON numbers string, for JSON string literals nil, for JSON null ``` ``` type Token any ``` type UnmarshalFieldError ------------------------ An UnmarshalFieldError describes a JSON object key that led to an unexported (and therefore unwritable) struct field. Deprecated: No longer used; kept for compatibility. ``` type UnmarshalFieldError struct { Key string Type reflect.Type Field reflect.StructField } ``` ### func (\*UnmarshalFieldError) Error ``` func (e *UnmarshalFieldError) Error() string ``` type UnmarshalTypeError ----------------------- An UnmarshalTypeError describes a JSON value that was not appropriate for a value of a specific Go type. ``` type UnmarshalTypeError struct { Value string // description of JSON value - "bool", "array", "number -5" Type reflect.Type // type of Go value it could not be assigned to Offset int64 // error occurred after reading Offset bytes; added in Go 1.5 Struct string // name of the struct type containing the field; added in Go 1.8 Field string // the full path from root node to the field; added in Go 1.8 } ``` ### func (\*UnmarshalTypeError) Error ``` func (e *UnmarshalTypeError) Error() string ``` type Unmarshaler ---------------- Unmarshaler is the interface implemented by types that can unmarshal a JSON description of themselves. The input can be assumed to be a valid encoding of a JSON value. UnmarshalJSON must copy the JSON data if it wishes to retain the data after returning. 
By convention, to approximate the behavior of Unmarshal itself, Unmarshalers implement UnmarshalJSON([]byte("null")) as a no-op. ``` type Unmarshaler interface { UnmarshalJSON([]byte) error } ``` type UnsupportedTypeError ------------------------- An UnsupportedTypeError is returned by Marshal when attempting to encode an unsupported value type. ``` type UnsupportedTypeError struct { Type reflect.Type } ``` ### func (\*UnsupportedTypeError) Error ``` func (e *UnsupportedTypeError) Error() string ``` type UnsupportedValueError -------------------------- An UnsupportedValueError is returned by Marshal when attempting to encode an unsupported value. ``` type UnsupportedValueError struct { Value reflect.Value Str string } ``` ### func (\*UnsupportedValueError) Error ``` func (e *UnsupportedValueError) Error() string ```
go Package csv Package csv ============ * `import "encoding/csv"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package csv reads and writes comma-separated values (CSV) files. There are many kinds of CSV files; this package supports the format described in RFC 4180. A csv file contains zero or more records of one or more fields per record. Each record is separated by the newline character. The final record may optionally be followed by a newline character. ``` field1,field2,field3 ``` White space is considered part of a field. Carriage returns before newline characters are silently removed. Blank lines are ignored. A line with only whitespace characters (excluding the ending newline character) is not considered a blank line. Fields which start and stop with the quote character " are called quoted-fields. The beginning and ending quote are not part of the field. The source: ``` normal string,"quoted-field" ``` results in the fields ``` {`normal string`, `quoted-field`} ``` Within a quoted-field a quote character followed by a second quote character is considered a single quote. ``` "the ""word"" is true","a ""quoted-field""" ``` results in ``` {`the "word" is true`, `a "quoted-field"`} ``` Newlines and commas may be included in a quoted-field ``` "Multi-line field","comma is ," ``` results in ``` {`Multi-line field`, `comma is ,`} ``` Index ----- * [Variables](#pkg-variables) * [type ParseError](#ParseError) * [func (e \*ParseError) Error() string](#ParseError.Error) * [func (e \*ParseError) Unwrap() error](#ParseError.Unwrap) * [type Reader](#Reader) * [func NewReader(r io.Reader) \*Reader](#NewReader) * [func (r \*Reader) FieldPos(field int) (line, column int)](#Reader.FieldPos) * [func (r \*Reader) InputOffset() int64](#Reader.InputOffset) * [func (r \*Reader) Read() (record []string, err error)](#Reader.Read) * [func (r \*Reader) ReadAll() (records [][]string, err error)](#Reader.ReadAll) * [type Writer](#Writer) * [func NewWriter(w io.Writer) \*Writer](#NewWriter) * [func (w \*Writer) Error() error](#Writer.Error) * [func (w \*Writer) Flush()](#Writer.Flush) * [func (w \*Writer) Write(record []string) error](#Writer.Write) * [func (w \*Writer) WriteAll(records [][]string) error](#Writer.WriteAll) ### Examples [Reader](#example_Reader) [Reader.ReadAll](#example_Reader_ReadAll) [Reader (Options)](#example_Reader_options) [Writer](#example_Writer) [Writer.WriteAll](#example_Writer_WriteAll) ### Package files reader.go writer.go Variables --------- These are the errors that can be returned in ParseError.Err. ``` var ( ErrBareQuote = errors.New("bare \" in non-quoted-field") ErrQuote = errors.New("extraneous or missing \" in quoted-field") ErrFieldCount = errors.New("wrong number of fields") // Deprecated: ErrTrailingComma is no longer used. ErrTrailingComma = errors.New("extra delimiter at end of line") ) ``` type ParseError --------------- A ParseError is returned for parsing errors. Line and column numbers are 1-indexed. ``` type ParseError struct { StartLine int // Line where the record starts; added in Go 1.10 Line int // Line where the error occurred Column int // Column (1-based byte index) where the error occurred Err error // The actual error } ``` ### func (\*ParseError) Error ``` func (e *ParseError) Error() string ``` ### func (\*ParseError) Unwrap 1.13 ``` func (e *ParseError) Unwrap() error ``` type Reader ----------- A Reader reads records from a CSV-encoded file.
As returned by NewReader, a Reader expects input conforming to RFC 4180. The exported fields can be changed to customize the details before the first call to Read or ReadAll. The Reader converts all \r\n sequences in its input to plain \n, including in multiline field values, so that the returned data does not depend on which line-ending convention an input file uses. ``` type Reader struct { // Comma is the field delimiter. // It is set to comma (',') by NewReader. // Comma must be a valid rune and must not be \r, \n, // or the Unicode replacement character (0xFFFD). Comma rune // Comment, if not 0, is the comment character. Lines beginning with the // Comment character without preceding whitespace are ignored. // With leading whitespace the Comment character becomes part of the // field, even if TrimLeadingSpace is true. // Comment must be a valid rune and must not be \r, \n, // or the Unicode replacement character (0xFFFD). // It must also not be equal to Comma. Comment rune // FieldsPerRecord is the number of expected fields per record. // If FieldsPerRecord is positive, Read requires each record to // have the given number of fields. If FieldsPerRecord is 0, Read sets it to // the number of fields in the first record, so that future records must // have the same field count. If FieldsPerRecord is negative, no check is // made and records may have a variable number of fields. FieldsPerRecord int // If LazyQuotes is true, a quote may appear in an unquoted field and a // non-doubled quote may appear in a quoted field. LazyQuotes bool // If TrimLeadingSpace is true, leading white space in a field is ignored. // This is done even if the field delimiter, Comma, is white space. TrimLeadingSpace bool // ReuseRecord controls whether calls to Read may return a slice sharing // the backing array of the previous call's returned slice for performance. // By default, each call to Read returns newly allocated memory owned by the caller. ReuseRecord bool // Go 1.9 // Deprecated: TrailingComma is no longer used. TrailingComma bool // contains filtered or unexported fields } ``` #### Example Code: ``` in := `first_name,last_name,username "Rob","Pike",rob Ken,Thompson,ken "Robert","Griesemer","gri" ` r := csv.NewReader(strings.NewReader(in)) for { record, err := r.Read() if err == io.EOF { break } if err != nil { log.Fatal(err) } fmt.Println(record) } ``` Output: ``` [first_name last_name username] [Rob Pike rob] [Ken Thompson ken] [Robert Griesemer gri] ``` #### Example (Options) This example shows how csv.Reader can be configured to handle other types of CSV files. Code: ``` in := `first_name;last_name;username "Rob";"Pike";rob # lines beginning with a # character are ignored Ken;Thompson;ken "Robert";"Griesemer";"gri" ` r := csv.NewReader(strings.NewReader(in)) r.Comma = ';' r.Comment = '#' records, err := r.ReadAll() if err != nil { log.Fatal(err) } fmt.Print(records) ``` Output: ``` [[first_name last_name username] [Rob Pike rob] [Ken Thompson ken] [Robert Griesemer gri]] ``` ### func NewReader ``` func NewReader(r io.Reader) *Reader ``` NewReader returns a new Reader that reads from r. ### func (\*Reader) FieldPos 1.17 ``` func (r *Reader) FieldPos(field int) (line, column int) ``` FieldPos returns the line and column corresponding to the start of the field with the given index in the slice most recently returned by Read. Numbering of lines and columns starts at 1; columns are counted in bytes, not runes. If this is called with an out-of-bounds index, it panics. 
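A brief sketch of using FieldPos to report where a field begins in the input; the CSV literal and the field index are arbitrary, chosen only to show the 1-based line and byte-column numbering described above:

```
package main

import (
	"encoding/csv"
	"fmt"
	"io"
	"log"
	"strings"
)

func main() {
	in := "name,qty\napples,7\npears,12\n"
	r := csv.NewReader(strings.NewReader(in))
	for {
		record, err := r.Read()
		if err == io.EOF {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		// Report where the second field of the most recently read
		// record starts in the original input.
		line, col := r.FieldPos(1)
		fmt.Printf("%q starts at line %d, column %d\n", record[1], line, col)
	}
	// Output:
	// "qty" starts at line 1, column 6
	// "7" starts at line 2, column 8
	// "12" starts at line 3, column 7
}
```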
### func (\*Reader) InputOffset 1.19 ``` func (r *Reader) InputOffset() int64 ``` InputOffset returns the input stream byte offset of the current reader position. The offset gives the location of the end of the most recently read row and the beginning of the next row. ### func (\*Reader) Read ``` func (r *Reader) Read() (record []string, err error) ``` Read reads one record (a slice of fields) from r. If the record has an unexpected number of fields, Read returns the record along with the error ErrFieldCount. Except for that case, Read always returns either a non-nil record or a non-nil error, but not both. If there is no data left to be read, Read returns nil, io.EOF. If ReuseRecord is true, the returned slice may be shared between multiple calls to Read. ### func (\*Reader) ReadAll ``` func (r *Reader) ReadAll() (records [][]string, err error) ``` ReadAll reads all the remaining records from r. Each record is a slice of fields. A successful call returns err == nil, not err == io.EOF. Because ReadAll is defined to read until EOF, it does not treat end of file as an error to be reported. #### Example Code: ``` in := `first_name,last_name,username "Rob","Pike",rob Ken,Thompson,ken "Robert","Griesemer","gri" ` r := csv.NewReader(strings.NewReader(in)) records, err := r.ReadAll() if err != nil { log.Fatal(err) } fmt.Print(records) ``` Output: ``` [[first_name last_name username] [Rob Pike rob] [Ken Thompson ken] [Robert Griesemer gri]] ``` type Writer ----------- A Writer writes records using CSV encoding. As returned by NewWriter, a Writer writes records terminated by a newline and uses ',' as the field delimiter. The exported fields can be changed to customize the details before the first call to Write or WriteAll. Comma is the field delimiter. If UseCRLF is true, the Writer ends each output line with \r\n instead of \n. The writes of individual records are buffered. After all data has been written, the client should call the Flush method to guarantee all data has been forwarded to the underlying io.Writer. Any errors that occurred should be checked by calling the Error method. ``` type Writer struct { Comma rune // Field delimiter (set to ',' by NewWriter) UseCRLF bool // True to use \r\n as the line terminator // contains filtered or unexported fields } ``` #### Example Code: ``` records := [][]string{ {"first_name", "last_name", "username"}, {"Rob", "Pike", "rob"}, {"Ken", "Thompson", "ken"}, {"Robert", "Griesemer", "gri"}, } w := csv.NewWriter(os.Stdout) for _, record := range records { if err := w.Write(record); err != nil { log.Fatalln("error writing record to csv:", err) } } // Write any buffered data to the underlying writer (standard output). w.Flush() if err := w.Error(); err != nil { log.Fatal(err) } ``` Output: ``` first_name,last_name,username Rob,Pike,rob Ken,Thompson,ken Robert,Griesemer,gri ``` ### func NewWriter ``` func NewWriter(w io.Writer) *Writer ``` NewWriter returns a new Writer that writes to w. ### func (\*Writer) Error 1.1 ``` func (w *Writer) Error() error ``` Error reports any error that has occurred during a previous Write or Flush. ### func (\*Writer) Flush ``` func (w *Writer) Flush() ``` Flush writes any buffered data to the underlying io.Writer. To check if an error occurred during the Flush, call Error. ### func (\*Writer) Write ``` func (w *Writer) Write(record []string) error ``` Write writes a single CSV record to w along with any necessary quoting. A record is a slice of strings with each string being one field. 
Writes are buffered, so Flush must eventually be called to ensure that the record is written to the underlying io.Writer. ### func (\*Writer) WriteAll ``` func (w *Writer) WriteAll(records [][]string) error ``` WriteAll writes multiple CSV records to w using Write and then calls Flush, returning any error from the Flush. #### Example Code: ``` records := [][]string{ {"first_name", "last_name", "username"}, {"Rob", "Pike", "rob"}, {"Ken", "Thompson", "ken"}, {"Robert", "Griesemer", "gri"}, } w := csv.NewWriter(os.Stdout) w.WriteAll(records) // calls Flush internally if err := w.Error(); err != nil { log.Fatalln("error writing csv:", err) } ``` Output: ``` first_name,last_name,username Rob,Pike,rob Ken,Thompson,ken Robert,Griesemer,gri ``` go Package binary Package binary =============== * `import "encoding/binary"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package binary implements simple translation between numbers and byte sequences and encoding and decoding of varints. Numbers are translated by reading and writing fixed-size values. A fixed-size value is either a fixed-size arithmetic type (bool, int8, uint8, int16, float32, complex64, ...) or an array or struct containing only fixed-size values. The varint functions encode and decode single integer values using a variable-length encoding; smaller values require fewer bytes. For a specification, see <https://developers.google.com/protocol-buffers/docs/encoding>. This package favors simplicity over efficiency. Clients that require high-performance serialization, especially for large data structures, should look at more advanced solutions such as the encoding/gob package or protocol buffers. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func AppendUvarint(buf []byte, x uint64) []byte](#AppendUvarint) * [func AppendVarint(buf []byte, x int64) []byte](#AppendVarint) * [func PutUvarint(buf []byte, x uint64) int](#PutUvarint) * [func PutVarint(buf []byte, x int64) int](#PutVarint) * [func Read(r io.Reader, order ByteOrder, data any) error](#Read) * [func ReadUvarint(r io.ByteReader) (uint64, error)](#ReadUvarint) * [func ReadVarint(r io.ByteReader) (int64, error)](#ReadVarint) * [func Size(v any) int](#Size) * [func Uvarint(buf []byte) (uint64, int)](#Uvarint) * [func Varint(buf []byte) (int64, int)](#Varint) * [func Write(w io.Writer, order ByteOrder, data any) error](#Write) * [type AppendByteOrder](#AppendByteOrder) * [type ByteOrder](#ByteOrder) ### Examples [ByteOrder (Get)](#example_ByteOrder_get) [ByteOrder (Put)](#example_ByteOrder_put) [PutUvarint](#example_PutUvarint) [PutVarint](#example_PutVarint) [Read](#example_Read) [Read (Multi)](#example_Read_multi) [Uvarint](#example_Uvarint) [Varint](#example_Varint) [Write](#example_Write) [Write (Multi)](#example_Write_multi) ### Package files binary.go varint.go Constants --------- MaxVarintLenN is the maximum length of a varint-encoded N-bit integer. ``` const ( MaxVarintLen16 = 3 MaxVarintLen32 = 5 MaxVarintLen64 = 10 ) ``` Variables --------- BigEndian is the big-endian implementation of ByteOrder and AppendByteOrder. ``` var BigEndian bigEndian ``` LittleEndian is the little-endian implementation of ByteOrder and AppendByteOrder. ``` var LittleEndian littleEndian ``` func AppendUvarint 1.19 ----------------------- ``` func AppendUvarint(buf []byte, x uint64) []byte ``` AppendUvarint appends the varint-encoded form of x, as generated by PutUvarint, to buf and returns the extended buffer. 
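A short sketch pairing AppendUvarint with Uvarint (documented below) to build and then walk a buffer of varint-encoded values; the numbers are arbitrary:

```
package main

import (
	"encoding/binary"
	"fmt"
)

func main() {
	// Append several varint-encoded values to one buffer...
	var buf []byte
	for _, x := range []uint64{1, 300, 70000} {
		buf = binary.AppendUvarint(buf, x)
	}
	fmt.Printf("% x\n", buf)

	// ...and decode them back one at a time with Uvarint.
	for len(buf) > 0 {
		x, n := binary.Uvarint(buf)
		fmt.Println(x)
		buf = buf[n:]
	}
	// Output:
	// 01 ac 02 f0 a2 04
	// 1
	// 300
	// 70000
}
```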
func AppendVarint 1.19 ---------------------- ``` func AppendVarint(buf []byte, x int64) []byte ``` AppendVarint appends the varint-encoded form of x, as generated by PutVarint, to buf and returns the extended buffer. func PutUvarint --------------- ``` func PutUvarint(buf []byte, x uint64) int ``` PutUvarint encodes a uint64 into buf and returns the number of bytes written. If the buffer is too small, PutUvarint will panic. #### Example Code: ``` buf := make([]byte, binary.MaxVarintLen64) for _, x := range []uint64{1, 2, 127, 128, 255, 256} { n := binary.PutUvarint(buf, x) fmt.Printf("%x\n", buf[:n]) } ``` Output: ``` 01 02 7f 8001 ff01 8002 ``` func PutVarint -------------- ``` func PutVarint(buf []byte, x int64) int ``` PutVarint encodes an int64 into buf and returns the number of bytes written. If the buffer is too small, PutVarint will panic. #### Example Code: ``` buf := make([]byte, binary.MaxVarintLen64) for _, x := range []int64{-65, -64, -2, -1, 0, 1, 2, 63, 64} { n := binary.PutVarint(buf, x) fmt.Printf("%x\n", buf[:n]) } ``` Output: ``` 8101 7f 03 01 00 02 04 7e 8001 ``` func Read --------- ``` func Read(r io.Reader, order ByteOrder, data any) error ``` Read reads structured binary data from r into data. Data must be a pointer to a fixed-size value or a slice of fixed-size values. Bytes read from r are decoded using the specified byte order and written to successive fields of the data. When decoding boolean values, a zero byte is decoded as false, and any other non-zero byte is decoded as true. When reading into structs, the field data for fields with blank (\_) field names is skipped; i.e., blank field names may be used for padding. When reading into a struct, all non-blank fields must be exported or Read may panic. The error is EOF only if no bytes were read. If an EOF happens after reading some but not all the bytes, Read returns ErrUnexpectedEOF. #### Example Code: ``` var pi float64 b := []byte{0x18, 0x2d, 0x44, 0x54, 0xfb, 0x21, 0x09, 0x40} buf := bytes.NewReader(b) err := binary.Read(buf, binary.LittleEndian, &pi) if err != nil { fmt.Println("binary.Read failed:", err) } fmt.Print(pi) ``` Output: ``` 3.141592653589793 ``` #### Example (Multi) Code: ``` b := []byte{0x18, 0x2d, 0x44, 0x54, 0xfb, 0x21, 0x09, 0x40, 0xff, 0x01, 0x02, 0x03, 0xbe, 0xef} r := bytes.NewReader(b) var data struct { PI float64 Uate uint8 Mine [3]byte Too uint16 } if err := binary.Read(r, binary.LittleEndian, &data); err != nil { fmt.Println("binary.Read failed:", err) } fmt.Println(data.PI) fmt.Println(data.Uate) fmt.Printf("% x\n", data.Mine) fmt.Println(data.Too) ``` Output: ``` 3.141592653589793 255 01 02 03 61374 ``` func ReadUvarint ---------------- ``` func ReadUvarint(r io.ByteReader) (uint64, error) ``` ReadUvarint reads an encoded unsigned integer from r and returns it as a uint64. The error is EOF only if no bytes were read. If an EOF happens after reading some but not all the bytes, ReadUvarint returns io.ErrUnexpectedEOF. func ReadVarint --------------- ``` func ReadVarint(r io.ByteReader) (int64, error) ``` ReadVarint reads an encoded signed integer from r and returns it as an int64. The error is EOF only if no bytes were read. If an EOF happens after reading some but not all the bytes, ReadVarint returns io.ErrUnexpectedEOF. func Size --------- ``` func Size(v any) int ``` Size returns how many bytes Write would generate to encode the value v, which must be a fixed-size value or a slice of fixed-size values, or a pointer to such data. If v is neither of these, Size returns -1. 
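A quick sketch of what Size reports for a few representative values; the Header struct is hypothetical and exists only to show that a struct's size is the sum of its fixed-size fields (no padding is added):

```
package main

import (
	"encoding/binary"
	"fmt"
)

// Header is a hypothetical fixed-size struct used only for illustration.
type Header struct {
	Magic   uint32
	Version uint16
	Flags   uint16
}

func main() {
	fmt.Println(binary.Size(uint64(0)))         // 8
	fmt.Println(binary.Size([]uint16{1, 2, 3})) // 6: three 2-byte elements
	fmt.Println(binary.Size(Header{}))          // 8: 4 + 2 + 2
	fmt.Println(binary.Size("hello"))           // -1: strings are not fixed-size
}
```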
func Uvarint ------------ ``` func Uvarint(buf []byte) (uint64, int) ``` Uvarint decodes a uint64 from buf and returns that value and the number of bytes read (> 0). If an error occurred, the value is 0 and the number of bytes n is <= 0 meaning: ``` n == 0: buf too small n < 0: value larger than 64 bits (overflow) and -n is the number of bytes read ``` #### Example Code: ``` inputs := [][]byte{ {0x01}, {0x02}, {0x7f}, {0x80, 0x01}, {0xff, 0x01}, {0x80, 0x02}, } for _, b := range inputs { x, n := binary.Uvarint(b) if n != len(b) { fmt.Println("Uvarint did not consume all of in") } fmt.Println(x) } ``` Output: ``` 1 2 127 128 255 256 ``` func Varint ----------- ``` func Varint(buf []byte) (int64, int) ``` Varint decodes an int64 from buf and returns that value and the number of bytes read (> 0). If an error occurred, the value is 0 and the number of bytes n is <= 0 with the following meaning: ``` n == 0: buf too small n < 0: value larger than 64 bits (overflow) and -n is the number of bytes read ``` #### Example Code: ``` inputs := [][]byte{ {0x81, 0x01}, {0x7f}, {0x03}, {0x01}, {0x00}, {0x02}, {0x04}, {0x7e}, {0x80, 0x01}, } for _, b := range inputs { x, n := binary.Varint(b) if n != len(b) { fmt.Println("Varint did not consume all of in") } fmt.Println(x) } ``` Output: ``` -65 -64 -2 -1 0 1 2 63 64 ``` func Write ---------- ``` func Write(w io.Writer, order ByteOrder, data any) error ``` Write writes the binary representation of data into w. Data must be a fixed-size value or a slice of fixed-size values, or a pointer to such data. Boolean values encode as one byte: 1 for true, and 0 for false. Bytes written to w are encoded using the specified byte order and read from successive fields of the data. When writing structs, zero values are written for fields with blank (\_) field names. #### Example Code: ``` buf := new(bytes.Buffer) var pi float64 = math.Pi err := binary.Write(buf, binary.LittleEndian, pi) if err != nil { fmt.Println("binary.Write failed:", err) } fmt.Printf("% x", buf.Bytes()) ``` Output: ``` 18 2d 44 54 fb 21 09 40 ``` #### Example (Multi) Code: ``` buf := new(bytes.Buffer) var data = []any{ uint16(61374), int8(-54), uint8(254), } for _, v := range data { err := binary.Write(buf, binary.LittleEndian, v) if err != nil { fmt.Println("binary.Write failed:", err) } } fmt.Printf("%x", buf.Bytes()) ``` Output: ``` beefcafe ``` type AppendByteOrder 1.19 ------------------------- AppendByteOrder specifies how to append 16-, 32-, or 64-bit unsigned integers into a byte slice. ``` type AppendByteOrder interface { AppendUint16([]byte, uint16) []byte AppendUint32([]byte, uint32) []byte AppendUint64([]byte, uint64) []byte String() string } ``` type ByteOrder -------------- A ByteOrder specifies how to convert byte slices into 16-, 32-, or 64-bit unsigned integers. ``` type ByteOrder interface { Uint16([]byte) uint16 Uint32([]byte) uint32 Uint64([]byte) uint64 PutUint16([]byte, uint16) PutUint32([]byte, uint32) PutUint64([]byte, uint64) String() string } ``` #### Example (Get) Code: ``` b := []byte{0xe8, 0x03, 0xd0, 0x07} x1 := binary.LittleEndian.Uint16(b[0:]) x2 := binary.LittleEndian.Uint16(b[2:]) fmt.Printf("%#04x %#04x\n", x1, x2) ``` Output: ``` 0x03e8 0x07d0 ``` #### Example (Put) Code: ``` b := make([]byte, 4) binary.LittleEndian.PutUint16(b[0:], 0x03e8) binary.LittleEndian.PutUint16(b[2:], 0x07d0) fmt.Printf("% x\n", b) ``` Output: ``` e8 03 d0 07 ```
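AppendByteOrder (Go 1.19) has no example above; a small sketch using the Append methods on LittleEndian and BigEndian (the encoded values are illustrative):

```go
package main

import (
	"encoding/binary"
	"fmt"
)

func main() {
	// Append-style encoding grows a slice rather than writing into a
	// pre-sized buffer as PutUint16/PutUint32 do.
	b := make([]byte, 0, 8)
	b = binary.LittleEndian.AppendUint16(b, 0x03e8)
	b = binary.LittleEndian.AppendUint16(b, 0x07d0)
	b = binary.BigEndian.AppendUint32(b, 0x01020304)
	fmt.Printf("% x\n", b) // e8 03 d0 07 01 02 03 04
}
```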
go Package base32 Package base32 =============== * `import "encoding/base32"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package base32 implements base32 encoding as specified by RFC 4648. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func NewDecoder(enc \*Encoding, r io.Reader) io.Reader](#NewDecoder) * [func NewEncoder(enc \*Encoding, w io.Writer) io.WriteCloser](#NewEncoder) * [type CorruptInputError](#CorruptInputError) * [func (e CorruptInputError) Error() string](#CorruptInputError.Error) * [type Encoding](#Encoding) * [func NewEncoding(encoder string) \*Encoding](#NewEncoding) * [func (enc \*Encoding) Decode(dst, src []byte) (n int, err error)](#Encoding.Decode) * [func (enc \*Encoding) DecodeString(s string) ([]byte, error)](#Encoding.DecodeString) * [func (enc \*Encoding) DecodedLen(n int) int](#Encoding.DecodedLen) * [func (enc \*Encoding) Encode(dst, src []byte)](#Encoding.Encode) * [func (enc \*Encoding) EncodeToString(src []byte) string](#Encoding.EncodeToString) * [func (enc \*Encoding) EncodedLen(n int) int](#Encoding.EncodedLen) * [func (enc Encoding) WithPadding(padding rune) \*Encoding](#Encoding.WithPadding) ### Examples [Encoding.Decode](#example_Encoding_Decode) [Encoding.DecodeString](#example_Encoding_DecodeString) [Encoding.Encode](#example_Encoding_Encode) [Encoding.EncodeToString](#example_Encoding_EncodeToString) [NewEncoder](#example_NewEncoder) ### Package files base32.go Constants --------- ``` const ( StdPadding rune = '=' // Standard padding character NoPadding rune = -1 // No padding ) ``` Variables --------- HexEncoding is the “Extended Hex Alphabet” defined in RFC 4648. It is typically used in DNS. ``` var HexEncoding = NewEncoding(encodeHex) ``` StdEncoding is the standard base32 encoding, as defined in RFC 4648. ``` var StdEncoding = NewEncoding(encodeStd) ``` func NewDecoder --------------- ``` func NewDecoder(enc *Encoding, r io.Reader) io.Reader ``` NewDecoder constructs a new base32 stream decoder. func NewEncoder --------------- ``` func NewEncoder(enc *Encoding, w io.Writer) io.WriteCloser ``` NewEncoder returns a new base32 stream encoder. Data written to the returned writer will be encoded using enc and then written to w. Base32 encodings operate in 5-byte blocks; when finished writing, the caller must Close the returned encoder to flush any partially written blocks. #### Example Code: ``` input := []byte("foo\x00bar") encoder := base32.NewEncoder(base32.StdEncoding, os.Stdout) encoder.Write(input) // Must close the encoder when finished to flush any partial blocks. // If you comment out the following line, the last partial block "r" // won't be encoded. encoder.Close() ``` Output: ``` MZXW6ADCMFZA==== ``` type CorruptInputError ---------------------- ``` type CorruptInputError int64 ``` ### func (CorruptInputError) Error ``` func (e CorruptInputError) Error() string ``` type Encoding ------------- An Encoding is a radix 32 encoding/decoding scheme, defined by a 32-character alphabet. The most common is the "base32" encoding introduced for SASL GSSAPI and standardized in RFC 4648. The alternate "base32hex" encoding is used in DNSSEC. ``` type Encoding struct { // contains filtered or unexported fields } ``` ### func NewEncoding ``` func NewEncoding(encoder string) *Encoding ``` NewEncoding returns a new Encoding defined by the given alphabet, which must be a 32-byte string. 
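NewEncoding has no example above; the sketch below builds an encoding from a custom 32-character alphabet (Crockford's Base32, chosen purely for illustration) and disables padding with WithPadding(NoPadding).

```go
package main

import (
	"encoding/base32"
	"fmt"
)

func main() {
	// Crockford's Base32 alphabet: 32 distinct bytes, no '\r' or '\n'.
	const crockford = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"
	enc := base32.NewEncoding(crockford).WithPadding(base32.NoPadding)

	s := enc.EncodeToString([]byte("Hello"))
	fmt.Println(s) // 91JPRV3F

	data, err := enc.DecodeString(s)
	if err != nil {
		fmt.Println("decode error:", err)
		return
	}
	fmt.Printf("%s\n", data) // Hello
}
```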
### func (\*Encoding) Decode ``` func (enc *Encoding) Decode(dst, src []byte) (n int, err error) ``` Decode decodes src using the encoding enc. It writes at most DecodedLen(len(src)) bytes to dst and returns the number of bytes written. If src contains invalid base32 data, it will return the number of bytes successfully written and CorruptInputError. New line characters (\r and \n) are ignored. #### Example Code: ``` str := "JBSWY3DPFQQHO33SNRSCC===" dst := make([]byte, base32.StdEncoding.DecodedLen(len(str))) n, err := base32.StdEncoding.Decode(dst, []byte(str)) if err != nil { fmt.Println("decode error:", err) return } dst = dst[:n] fmt.Printf("%q\n", dst) ``` Output: ``` "Hello, world!" ``` ### func (\*Encoding) DecodeString ``` func (enc *Encoding) DecodeString(s string) ([]byte, error) ``` DecodeString returns the bytes represented by the base32 string s. #### Example Code: ``` str := "ONXW2ZJAMRQXIYJAO5UXI2BAAAQGC3TEEDX3XPY=" data, err := base32.StdEncoding.DecodeString(str) if err != nil { fmt.Println("error:", err) return } fmt.Printf("%q\n", data) ``` Output: ``` "some data with \x00 and \ufeff" ``` ### func (\*Encoding) DecodedLen ``` func (enc *Encoding) DecodedLen(n int) int ``` DecodedLen returns the maximum length in bytes of the decoded data corresponding to n bytes of base32-encoded data. ### func (\*Encoding) Encode ``` func (enc *Encoding) Encode(dst, src []byte) ``` Encode encodes src using the encoding enc, writing EncodedLen(len(src)) bytes to dst. The encoding pads the output to a multiple of 8 bytes, so Encode is not appropriate for use on individual blocks of a large data stream. Use NewEncoder() instead. #### Example Code: ``` data := []byte("Hello, world!") dst := make([]byte, base32.StdEncoding.EncodedLen(len(data))) base32.StdEncoding.Encode(dst, data) fmt.Println(string(dst)) ``` Output: ``` JBSWY3DPFQQHO33SNRSCC=== ``` ### func (\*Encoding) EncodeToString ``` func (enc *Encoding) EncodeToString(src []byte) string ``` EncodeToString returns the base32 encoding of src. #### Example Code: ``` data := []byte("any + old & data") str := base32.StdEncoding.EncodeToString(data) fmt.Println(str) ``` Output: ``` MFXHSIBLEBXWYZBAEYQGIYLUME====== ``` ### func (\*Encoding) EncodedLen ``` func (enc *Encoding) EncodedLen(n int) int ``` EncodedLen returns the length in bytes of the base32 encoding of an input buffer of length n. ### func (Encoding) WithPadding 1.9 ``` func (enc Encoding) WithPadding(padding rune) *Encoding ``` WithPadding creates a new encoding identical to enc except with a specified padding character, or NoPadding to disable padding. The padding character must not be '\r' or '\n', must not be contained in the encoding's alphabet and must be a rune equal or below '\xff'. go Package gob Package gob ============ * `import "encoding/gob"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package gob manages streams of gobs - binary values exchanged between an Encoder (transmitter) and a Decoder (receiver). A typical use is transporting arguments and results of remote procedure calls (RPCs) such as those provided by package "net/rpc". The implementation compiles a custom codec for each data type in the stream and is most efficient when a single Encoder is used to transmit a stream of values, amortizing the cost of compilation. ### Basics A stream of gobs is self-describing. Each data item in the stream is preceded by a specification of its type, expressed in terms of a small set of predefined types. 
Pointers are not transmitted, but the things they point to are transmitted; that is, the values are flattened. Nil pointers are not permitted, as they have no value. Recursive types work fine, but recursive values (data with cycles) are problematic. This may change. To use gobs, create an Encoder and present it with a series of data items as values or addresses that can be dereferenced to values. The Encoder makes sure all type information is sent before it is needed. At the receive side, a Decoder retrieves values from the encoded stream and unpacks them into local variables. ### Types and Values The source and destination values/types need not correspond exactly. For structs, fields (identified by name) that are in the source but absent from the receiving variable will be ignored. Fields that are in the receiving variable but missing from the transmitted type or value will be ignored in the destination. If a field with the same name is present in both, their types must be compatible. Both the receiver and transmitter will do all necessary indirection and dereferencing to convert between gobs and actual Go values. For instance, a gob type that is schematically, ``` struct { A, B int } ``` can be sent from or received into any of these Go types: ``` struct { A, B int } // the same *struct { A, B int } // extra indirection of the struct struct { *A, **B int } // extra indirection of the fields struct { A, B int64 } // different concrete value type; see below ``` It may also be received into any of these: ``` struct { A, B int } // the same struct { B, A int } // ordering doesn't matter; matching is by name struct { A, B, C int } // extra field (C) ignored struct { B int } // missing field (A) ignored; data will be dropped struct { B, C int } // missing field (A) ignored; extra field (C) ignored. ``` Attempting to receive into these types will draw a decode error: ``` struct { A int; B uint } // change of signedness for B struct { A int; B float } // change of type for B struct { } // no field names in common struct { C, D int } // no field names in common ``` Integers are transmitted two ways: arbitrary precision signed integers or arbitrary precision unsigned integers. There is no int8, int16 etc. discrimination in the gob format; there are only signed and unsigned integers. As described below, the transmitter sends the value in a variable-length encoding; the receiver accepts the value and stores it in the destination variable. Floating-point numbers are always sent using IEEE-754 64-bit precision (see below). Signed integers may be received into any signed integer variable: int, int16, etc.; unsigned integers may be received into any unsigned integer variable; and floating point values may be received into any floating point variable. However, the destination variable must be able to represent the value or the decode operation will fail. Structs, arrays and slices are also supported. Structs encode and decode only exported fields. Strings and arrays of bytes are supported with a special, efficient representation (see below). When a slice is decoded, if the existing slice has capacity the slice will be extended in place; if not, a new array is allocated. Regardless, the length of the resulting slice reports the number of elements decoded. In general, if allocation is required, the decoder will allocate memory. If not, it will update the destination variables with values read from the stream. 
It does not initialize them first, so if the destination is a compound value such as a map, struct, or slice, the decoded values will be merged elementwise into the existing variables. Functions and channels will not be sent in a gob. Attempting to encode such a value at the top level will fail. A struct field of chan or func type is treated exactly like an unexported field and is ignored. Gob can encode a value of any type implementing the GobEncoder or encoding.BinaryMarshaler interfaces by calling the corresponding method, in that order of preference. Gob can decode a value of any type implementing the GobDecoder or encoding.BinaryUnmarshaler interfaces by calling the corresponding method, again in that order of preference. ### Encoding Details This section documents the encoding, details that are not important for most users. Details are presented bottom-up. An unsigned integer is sent one of two ways. If it is less than 128, it is sent as a byte with that value. Otherwise it is sent as a minimal-length big-endian (high byte first) byte stream holding the value, preceded by one byte holding the byte count, negated. Thus 0 is transmitted as (00), 7 is transmitted as (07) and 256 is transmitted as (FE 01 00). A boolean is encoded within an unsigned integer: 0 for false, 1 for true. A signed integer, i, is encoded within an unsigned integer, u. Within u, bits 1 upward contain the value; bit 0 says whether they should be complemented upon receipt. The encode algorithm looks like this: ``` var u uint if i < 0 { u = (^uint(i) << 1) | 1 // complement i, bit 0 is 1 } else { u = (uint(i) << 1) // do not complement i, bit 0 is 0 } encodeUnsigned(u) ``` The low bit is therefore analogous to a sign bit, but making it the complement bit instead guarantees that the largest negative integer is not a special case. For example, -129=^128=(^256>>1) encodes as (FE 01 01). Floating-point numbers are always sent as a representation of a float64 value. That value is converted to a uint64 using math.Float64bits. The uint64 is then byte-reversed and sent as a regular unsigned integer. The byte-reversal means the exponent and high-precision part of the mantissa go first. Since the low bits are often zero, this can save encoding bytes. For instance, 17.0 is encoded in only three bytes (FE 31 40). Strings and slices of bytes are sent as an unsigned count followed by that many uninterpreted bytes of the value. All other slices and arrays are sent as an unsigned count followed by that many elements using the standard gob encoding for their type, recursively. Maps are sent as an unsigned count followed by that many key, element pairs. Empty but non-nil maps are sent, so if the receiver has not allocated one already, one will always be allocated on receipt unless the transmitted map is nil and not at the top level. In slices and arrays, as well as maps, all elements, even zero-valued elements, are transmitted, even if all the elements are zero. Structs are sent as a sequence of (field number, field value) pairs. The field value is sent using the standard gob encoding for its type, recursively. If a field has the zero value for its type (except for arrays; see above), it is omitted from the transmission. The field number is defined by the type of the encoded struct: the first field of the encoded type is field 0, the second is field 1, etc. When encoding a value, the field numbers are delta encoded for efficiency and the fields are always sent in order of increasing field number; the deltas are therefore unsigned. 
The initialization for the delta encoding sets the field number to -1, so an unsigned integer field 0 with value 7 is transmitted as unsigned delta = 1, unsigned value = 7 or (01 07). Finally, after all the fields have been sent a terminating mark denotes the end of the struct. That mark is a delta=0 value, which has representation (00). Interface types are not checked for compatibility; all interface types are treated, for transmission, as members of a single "interface" type, analogous to int or []byte - in effect they're all treated as interface{}. Interface values are transmitted as a string identifying the concrete type being sent (a name that must be pre-defined by calling Register), followed by a byte count of the length of the following data (so the value can be skipped if it cannot be stored), followed by the usual encoding of concrete (dynamic) value stored in the interface value. (A nil interface value is identified by the empty string and transmits no value.) Upon receipt, the decoder verifies that the unpacked concrete item satisfies the interface of the receiving variable. If a value is passed to Encode and the type is not a struct (or pointer to struct, etc.), for simplicity of processing it is represented as a struct of one field. The only visible effect of this is to encode a zero byte after the value, just as after the last field of an encoded struct, so that the decode algorithm knows when the top-level value is complete. The representation of types is described below. When a type is defined on a given connection between an Encoder and Decoder, it is assigned a signed integer type id. When Encoder.Encode(v) is called, it makes sure there is an id assigned for the type of v and all its elements and then it sends the pair (typeid, encoded-v) where typeid is the type id of the encoded type of v and encoded-v is the gob encoding of the value v. To define a type, the encoder chooses an unused, positive type id and sends the pair (-type id, encoded-type) where encoded-type is the gob encoding of a wireType description, constructed from these types: ``` type wireType struct { ArrayT *ArrayType SliceT *SliceType StructT *StructType MapT *MapType GobEncoderT *gobEncoderType BinaryMarshalerT *gobEncoderType TextMarshalerT *gobEncoderType } type arrayType struct { CommonType Elem typeId Len int } type CommonType struct { Name string // the name of the struct type Id int // the id of the type, repeated so it's inside the type } type sliceType struct { CommonType Elem typeId } type structType struct { CommonType Field []*fieldType // the fields of the struct. } type fieldType struct { Name string // the name of the field. Id int // the type id of the field, which must be already defined } type mapType struct { CommonType Key typeId Elem typeId } type gobEncoderType struct { CommonType } ``` If there are nested type ids, the types for all inner type ids must be defined before the top-level type id is used to describe an encoded-v. For simplicity in setup, the connection is defined to understand these types a priori, as well as the basic gob types int, uint, etc. Their ids are: ``` bool 1 int 2 uint 3 float 4 []byte 5 string 6 complex 7 interface 8 // gap for reserved ids. WireType 16 ArrayType 17 CommonType 18 SliceType 19 StructType 20 FieldType 21 // 22 is slice of fieldType. MapType 23 ``` Finally, each message created by a call to Encode is preceded by an encoded unsigned integer count of the number of bytes remaining in the message. 
After the initial type name, interface values are wrapped the same way; in effect, the interface value acts like a recursive invocation of Encode. In summary, a gob stream looks like ``` (byteCount (-type id, encoding of a wireType)* (type id, encoding of a value))* ``` where \* signifies zero or more repetitions and the type id of a value must be predefined or be defined before the value in the stream. Compatibility: Any future changes to the package will endeavor to maintain compatibility with streams encoded using previous versions. That is, any released version of this package should be able to decode data written with any previously released version, subject to issues such as security fixes. See the Go compatibility document for background: <https://golang.org/doc/go1compat> See "Gobs of data" for a design discussion of the gob wire format: <https://blog.golang.org/gobs-of-data> ### Security This package is not designed to be hardened against adversarial inputs, and is outside the scope of <https://go.dev/security/policy>. In particular, the Decoder does only basic sanity checking on decoded input sizes, and its limits are not configurable. Care should be taken when decoding gob data from untrusted sources, which may consume significant resources. #### Example (Basic) This example shows the basic usage of the package: Create an encoder, transmit some values, receive them with a decoder. Code: ``` package gob_test import ( "bytes" "encoding/gob" "fmt" "log" ) type P struct { X, Y, Z int Name string } type Q struct { X, Y *int32 Name string } // This example shows the basic usage of the package: Create an encoder, // transmit some values, receive them with a decoder. func Example_basic() { // Initialize the encoder and decoder. Normally enc and dec would be // bound to network connections and the encoder and decoder would // run in different processes. var network bytes.Buffer // Stand-in for a network connection enc := gob.NewEncoder(&network) // Will write to network. dec := gob.NewDecoder(&network) // Will read from network. // Encode (send) some values. err := enc.Encode(P{3, 4, 5, "Pythagoras"}) if err != nil { log.Fatal("encode error:", err) } err = enc.Encode(P{1782, 1841, 1922, "Treehouse"}) if err != nil { log.Fatal("encode error:", err) } // Decode (receive) and print the values. var q Q err = dec.Decode(&q) if err != nil { log.Fatal("decode error 1:", err) } fmt.Printf("%q: {%d, %d}\n", q.Name, *q.X, *q.Y) err = dec.Decode(&q) if err != nil { log.Fatal("decode error 2:", err) } fmt.Printf("%q: {%d, %d}\n", q.Name, *q.X, *q.Y) // Output: // "Pythagoras": {3, 4} // "Treehouse": {1782, 1841} } ``` #### Example (EncodeDecode) This example transmits a value that implements the custom encoding and decoding methods. Code: ``` package gob_test import ( "bytes" "encoding/gob" "fmt" "log" ) // The Vector type has unexported fields, which the package cannot access. // We therefore write a BinaryMarshal/BinaryUnmarshal method pair to allow us // to send and receive the type with the gob package. These interfaces are // defined in the "encoding" package. // We could equivalently use the locally defined GobEncode/GobDecoder // interfaces. type Vector struct { x, y, z int } func (v Vector) MarshalBinary() ([]byte, error) { // A simple encoding: plain text. var b bytes.Buffer fmt.Fprintln(&b, v.x, v.y, v.z) return b.Bytes(), nil } // UnmarshalBinary modifies the receiver so it must take a pointer receiver. 
func (v *Vector) UnmarshalBinary(data []byte) error { // A simple encoding: plain text. b := bytes.NewBuffer(data) _, err := fmt.Fscanln(b, &v.x, &v.y, &v.z) return err } // This example transmits a value that implements the custom encoding and decoding methods. func Example_encodeDecode() { var network bytes.Buffer // Stand-in for the network. // Create an encoder and send a value. enc := gob.NewEncoder(&network) err := enc.Encode(Vector{3, 4, 5}) if err != nil { log.Fatal("encode:", err) } // Create a decoder and receive a value. dec := gob.NewDecoder(&network) var v Vector err = dec.Decode(&v) if err != nil { log.Fatal("decode:", err) } fmt.Println(v) // Output: // {3 4 5} } ``` #### Example (Interface) This example shows how to encode an interface value. The key distinction from regular types is to register the concrete type that implements the interface. Code: ``` package gob_test import ( "bytes" "encoding/gob" "fmt" "log" "math" ) type Point struct { X, Y int } func (p Point) Hypotenuse() float64 { return math.Hypot(float64(p.X), float64(p.Y)) } type Pythagoras interface { Hypotenuse() float64 } // This example shows how to encode an interface value. The key // distinction from regular types is to register the concrete type that // implements the interface. func Example_interface() { var network bytes.Buffer // Stand-in for the network. // We must register the concrete type for the encoder and decoder (which would // normally be on a separate machine from the encoder). On each end, this tells the // engine which concrete type is being sent that implements the interface. gob.Register(Point{}) // Create an encoder and send some values. enc := gob.NewEncoder(&network) for i := 1; i <= 3; i++ { interfaceEncode(enc, Point{3 * i, 4 * i}) } // Create a decoder and receive some values. dec := gob.NewDecoder(&network) for i := 1; i <= 3; i++ { result := interfaceDecode(dec) fmt.Println(result.Hypotenuse()) } // Output: // 5 // 10 // 15 } // interfaceEncode encodes the interface value into the encoder. func interfaceEncode(enc *gob.Encoder, p Pythagoras) { // The encode will fail unless the concrete type has been // registered. We registered it in the calling function. // Pass pointer to interface so Encode sees (and hence sends) a value of // interface type. If we passed p directly it would see the concrete type instead. // See the blog post, "The Laws of Reflection" for background. err := enc.Encode(&p) if err != nil { log.Fatal("encode:", err) } } // interfaceDecode decodes the next interface value from the stream and returns it. func interfaceDecode(dec *gob.Decoder) Pythagoras { // The decode will fail unless the concrete type on the wire has been // registered. We registered it in the calling function. 
var p Pythagoras err := dec.Decode(&p) if err != nil { log.Fatal("decode:", err) } return p } ``` Index ----- * [func Register(value any)](#Register) * [func RegisterName(name string, value any)](#RegisterName) * [type CommonType](#CommonType) * [type Decoder](#Decoder) * [func NewDecoder(r io.Reader) \*Decoder](#NewDecoder) * [func (dec \*Decoder) Decode(e any) error](#Decoder.Decode) * [func (dec \*Decoder) DecodeValue(v reflect.Value) error](#Decoder.DecodeValue) * [type Encoder](#Encoder) * [func NewEncoder(w io.Writer) \*Encoder](#NewEncoder) * [func (enc \*Encoder) Encode(e any) error](#Encoder.Encode) * [func (enc \*Encoder) EncodeValue(value reflect.Value) error](#Encoder.EncodeValue) * [type GobDecoder](#GobDecoder) * [type GobEncoder](#GobEncoder) ### Examples [Package (Basic)](#example__basic) [Package (EncodeDecode)](#example__encodeDecode) [Package (Interface)](#example__interface) ### Package files dec\_helpers.go decode.go decoder.go doc.go enc\_helpers.go encode.go encoder.go error.go type.go func Register ------------- ``` func Register(value any) ``` Register records a type, identified by a value for that type, under its internal type name. That name will identify the concrete type of a value sent or received as an interface variable. Only types that will be transferred as implementations of interface values need to be registered. Expecting to be used only during initialization, it panics if the mapping between types and names is not a bijection. func RegisterName ----------------- ``` func RegisterName(name string, value any) ``` RegisterName is like Register but uses the provided name rather than the type's default. type CommonType --------------- CommonType holds elements of all types. It is a historical artifact, kept for binary compatibility and exported only for the benefit of the package's encoding of type descriptors. It is not intended for direct use by clients. ``` type CommonType struct { Name string Id typeId } ``` type Decoder ------------ A Decoder manages the receipt of type and data information read from the remote side of a connection. It is safe for concurrent use by multiple goroutines. The Decoder does only basic sanity checking on decoded input sizes, and its limits are not configurable. Take caution when decoding gob data from untrusted sources. ``` type Decoder struct { // contains filtered or unexported fields } ``` ### func NewDecoder ``` func NewDecoder(r io.Reader) *Decoder ``` NewDecoder returns a new decoder that reads from the io.Reader. If r does not also implement io.ByteReader, it will be wrapped in a bufio.Reader. ### func (\*Decoder) Decode ``` func (dec *Decoder) Decode(e any) error ``` Decode reads the next value from the input stream and stores it in the data represented by the empty interface value. If e is nil, the value will be discarded. Otherwise, the value underlying e must be a pointer to the correct type for the next data item received. If the input is at EOF, Decode returns io.EOF and does not modify e. ### func (\*Decoder) DecodeValue ``` func (dec *Decoder) DecodeValue(v reflect.Value) error ``` DecodeValue reads the next value from the input stream. If v is the zero reflect.Value (v.Kind() == Invalid), DecodeValue discards the value. Otherwise, it stores the value into v. In that case, v must represent a non-nil pointer to data or be an assignable reflect.Value (v.CanSet()) If the input is at EOF, DecodeValue returns io.EOF and does not modify v. 
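DecodeValue has no example above; a minimal sketch passing a reflect.Value that wraps a non-nil pointer, as the description requires:

```go
package main

import (
	"bytes"
	"encoding/gob"
	"fmt"
	"log"
	"reflect"
)

func main() {
	var network bytes.Buffer // Stand-in for a network connection.
	if err := gob.NewEncoder(&network).Encode(42); err != nil {
		log.Fatal("encode:", err)
	}

	// The reflect.Value wraps *int, a non-nil pointer, so DecodeValue
	// can store the decoded value through it.
	var n int
	if err := gob.NewDecoder(&network).DecodeValue(reflect.ValueOf(&n)); err != nil {
		log.Fatal("decode:", err)
	}
	fmt.Println(n) // 42
}
```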
type Encoder ------------ An Encoder manages the transmission of type and data information to the other side of a connection. It is safe for concurrent use by multiple goroutines. ``` type Encoder struct { // contains filtered or unexported fields } ``` ### func NewEncoder ``` func NewEncoder(w io.Writer) *Encoder ``` NewEncoder returns a new encoder that will transmit on the io.Writer. ### func (\*Encoder) Encode ``` func (enc *Encoder) Encode(e any) error ``` Encode transmits the data item represented by the empty interface value, guaranteeing that all necessary type information has been transmitted first. Passing a nil pointer to Encoder will panic, as they cannot be transmitted by gob. ### func (\*Encoder) EncodeValue ``` func (enc *Encoder) EncodeValue(value reflect.Value) error ``` EncodeValue transmits the data item represented by the reflection value, guaranteeing that all necessary type information has been transmitted first. Passing a nil pointer to EncodeValue will panic, as they cannot be transmitted by gob. type GobDecoder --------------- GobDecoder is the interface describing data that provides its own routine for decoding transmitted values sent by a GobEncoder. ``` type GobDecoder interface { // GobDecode overwrites the receiver, which must be a pointer, // with the value represented by the byte slice, which was written // by GobEncode, usually for the same concrete type. GobDecode([]byte) error } ``` type GobEncoder --------------- GobEncoder is the interface describing data that provides its own representation for encoding values for transmission to a GobDecoder. A type that implements GobEncoder and GobDecoder has complete control over the representation of its data and may therefore contain things such as private fields, channels, and functions, which are not usually transmissible in gob streams. Note: Since gobs can be stored permanently, it is good design to guarantee the encoding used by a GobEncoder is stable as the software evolves. For instance, it might make sense for GobEncode to include a version number in the encoding. ``` type GobEncoder interface { // GobEncode returns a byte slice representing the encoding of the // receiver for transmission to a GobDecoder, usually of the same // concrete type. GobEncode() ([]byte, error) } ```
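The GobEncoder/GobDecoder pair has no example above; here is a sketch of a type with an unexported field that supplies its own wire format, including the version prefix suggested in the note (the Celsius type and the "v1:" format are purely illustrative, not part of the package).

```go
package main

import (
	"bytes"
	"encoding/gob"
	"fmt"
	"log"
)

// Celsius keeps its field unexported; GobEncode/GobDecode give it a
// stable, versioned wire representation anyway.
type Celsius struct {
	degrees float64
}

func (c Celsius) GobEncode() ([]byte, error) {
	// A simple encoding: a version tag followed by plain text.
	return []byte(fmt.Sprintf("v1:%g", c.degrees)), nil
}

// GobDecode overwrites the receiver, so it must take a pointer receiver.
func (c *Celsius) GobDecode(b []byte) error {
	_, err := fmt.Sscanf(string(b), "v1:%g", &c.degrees)
	return err
}

func main() {
	var network bytes.Buffer // Stand-in for a network connection.
	if err := gob.NewEncoder(&network).Encode(Celsius{21.5}); err != nil {
		log.Fatal("encode:", err)
	}
	var c Celsius
	if err := gob.NewDecoder(&network).Decode(&c); err != nil {
		log.Fatal("decode:", err)
	}
	fmt.Println(c.degrees) // 21.5
}
```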
go Package html Package html ============= * `import "html"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package html provides functions for escaping and unescaping HTML text. Index ----- * [func EscapeString(s string) string](#EscapeString) * [func UnescapeString(s string) string](#UnescapeString) ### Examples [EscapeString](#example_EscapeString) [UnescapeString](#example_UnescapeString) ### Package files entity.go escape.go func EscapeString ----------------- ``` func EscapeString(s string) string ``` EscapeString escapes special characters like "<" to become "&lt;". It escapes only five such characters: <, >, &, ' and ". UnescapeString(EscapeString(s)) == s always holds, but the converse isn't always true. #### Example Code: ``` const s = `"Fran & Freddie's Diner" <tasty@example.com>` fmt.Println(html.EscapeString(s)) ``` Output: ``` &#34;Fran &amp; Freddie&#39;s Diner&#34; &lt;tasty@example.com&gt; ``` func UnescapeString ------------------- ``` func UnescapeString(s string) string ``` UnescapeString unescapes entities like "&lt;" to become "<". It unescapes a larger range of entities than EscapeString escapes. For example, "&aacute;" unescapes to "á", as does "&#225;" and "&#xE1;". UnescapeString(EscapeString(s)) == s always holds, but the converse isn't always true. #### Example Code: ``` const s = `&quot;Fran &amp; Freddie&#39;s Diner&quot; &lt;tasty@example.com&gt;` fmt.Println(html.UnescapeString(s)) ``` Output: ``` "Fran & Freddie's Diner" <tasty@example.com> ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [template](template/index) | Package template (html/template) implements data-driven templates for generating HTML output safe against code injection. | go Package template Package template ================= * `import "html/template"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package template (html/template) implements data-driven templates for generating HTML output safe against code injection. It provides the same interface as package text/template and should be used instead of text/template whenever the output is HTML. The documentation here focuses on the security features of the package. For information about how to program the templates themselves, see the documentation for text/template. ### Introduction This package wraps package text/template so you can share its template API to parse and execute HTML templates safely. ``` tmpl, err := template.New("name").Parse(...) // Error checking elided err = tmpl.Execute(out, data) ``` If successful, tmpl will now be injection-safe. Otherwise, err is an error defined in the docs for ErrorCode. HTML templates treat data values as plain text which should be encoded so they can be safely embedded in an HTML document. The escaping is contextual, so actions can appear within JavaScript, CSS, and URI contexts. The security model used by this package assumes that template authors are trusted, while Execute's data parameter is not. More details are provided below. Example ``` import "text/template" ... t, err := template.New("foo").Parse(`{{define "T"}}Hello, {{.}}!{{end}}`) err = t.ExecuteTemplate(out, "T", "<script>alert('you have been pwned')</script>") ``` produces ``` Hello, <script>alert('you have been pwned')</script>! ``` but the contextual autoescaping in html/template ``` import "html/template" ... 
t, err := template.New("foo").Parse(`{{define "T"}}Hello, {{.}}!{{end}}`) err = t.ExecuteTemplate(out, "T", "<script>alert('you have been pwned')</script>") ``` produces safe, escaped HTML output ``` Hello, &lt;script&gt;alert(&#39;you have been pwned&#39;)&lt;/script&gt;! ``` ### Contexts This package understands HTML, CSS, JavaScript, and URIs. It adds sanitizing functions to each simple action pipeline, so given the excerpt ``` <a href="/search?q={{.}}">{{.}}</a> ``` At parse time each {{.}} is overwritten to add escaping functions as necessary. In this case it becomes ``` <a href="/search?q={{. | urlescaper | attrescaper}}">{{. | htmlescaper}}</a> ``` where urlescaper, attrescaper, and htmlescaper are aliases for internal escaping functions. For these internal escaping functions, if an action pipeline evaluates to a nil interface value, it is treated as though it were an empty string. ### Namespaced and data- attributes Attributes with a namespace are treated as if they had no namespace. Given the excerpt ``` <a my:href="{{.}}"></a> ``` At parse time the attribute will be treated as if it were just "href". So at parse time the template becomes: ``` <a my:href="{{. | urlescaper | attrescaper}}"></a> ``` Similarly to attributes with namespaces, attributes with a "data-" prefix are treated as if they had no "data-" prefix. So given ``` <a data-href="{{.}}"></a> ``` At parse time this becomes ``` <a data-href="{{. | urlescaper | attrescaper}}"></a> ``` If an attribute has both a namespace and a "data-" prefix, only the namespace will be removed when determining the context. For example ``` <a my:data-href="{{.}}"></a> ``` This is handled as if "my:data-href" was just "data-href" and not "href" as it would be if the "data-" prefix were to be ignored too. Thus at parse time this becomes just ``` <a my:data-href="{{. | attrescaper}}"></a> ``` As a special case, attributes with the namespace "xmlns" are always treated as containing URLs. Given the excerpts ``` <a xmlns:title="{{.}}"></a> <a xmlns:href="{{.}}"></a> <a xmlns:onclick="{{.}}"></a> ``` At parse time they become: ``` <a xmlns:title="{{. | urlescaper | attrescaper}}"></a> <a xmlns:href="{{. | urlescaper | attrescaper}}"></a> <a xmlns:onclick="{{. | urlescaper | attrescaper}}"></a> ``` ### Errors See the documentation of ErrorCode for details. ### A fuller picture The rest of this package comment may be skipped on first reading; it includes details necessary to understand escaping contexts and error messages. Most users will not need to understand these details. ### Contexts Assuming {{.}} is `O'Reilly: How are <i>you</i>?`, the table below shows how {{.}} appears when used in the context to the left. ``` Context {{.}} After {{.}} O'Reilly: How are &lt;i&gt;you&lt;/i&gt;? <a title='{{.}}'> O&#39;Reilly: How are you? <a href="/{{.}}"> O&#39;Reilly: How are %3ci%3eyou%3c/i%3e? <a href="?q={{.}}"> O&#39;Reilly%3a%20How%20are%3ci%3e...%3f <a onx='f("{{.}}")'> O\x27Reilly: How are \x3ci\x3eyou...? <a onx='f({{.}})'> "O\x27Reilly: How are \x3ci\x3eyou...?" <a onx='pattern = /{{.}}/;'> O\x27Reilly: How are \x3ci\x3eyou...\x3f ``` If used in an unsafe context, then the value might be filtered out: ``` Context {{.}} After <a href="{{.}}"> #ZgotmplZ ``` since "O'Reilly:" is not an allowed protocol like "http:". 
If {{.}} is the innocuous word, `left`, then it can appear more widely, ``` Context {{.}} After {{.}} left <a title='{{.}}'> left <a href='{{.}}'> left <a href='/{{.}}'> left <a href='?dir={{.}}'> left <a style="border-{{.}}: 4px"> left <a style="align: {{.}}"> left <a style="background: '{{.}}'> left <a style="background: url('{{.}}')> left <style>p.{{.}} {color:red}</style> left ``` Non-string values can be used in JavaScript contexts. If {{.}} is ``` struct{A,B string}{ "foo", "bar" } ``` in the escaped template ``` <script>var pair = {{.}};</script> ``` then the template output is ``` <script>var pair = {"A": "foo", "B": "bar"};</script> ``` See package json to understand how non-string content is marshaled for embedding in JavaScript contexts. ### Typed Strings By default, this package assumes that all pipelines produce a plain text string. It adds escaping pipeline stages necessary to correctly and safely embed that plain text string in the appropriate context. When a data value is not plain text, you can make sure it is not over-escaped by marking it with its type. Types HTML, JS, URL, and others from content.go can carry safe content that is exempted from escaping. The template ``` Hello, {{.}}! ``` can be invoked with ``` tmpl.Execute(out, template.HTML(`<b>World</b>`)) ``` to produce ``` Hello, <b>World</b>! ``` instead of the ``` Hello, &lt;b&gt;World&lt;/b&gt;! ``` that would have been produced if {{.}} was a regular string. ### Security Model <https://rawgit.com/mikesamuel/sanitized-jquery-templates/trunk/safetemplate.html#problem_definition> defines "safe" as used by this package. This package assumes that template authors are trusted, that Execute's data parameter is not, and seeks to preserve the properties below in the face of untrusted data: Structure Preservation Property: "... when a template author writes an HTML tag in a safe templating language, the browser will interpret the corresponding portion of the output as a tag regardless of the values of untrusted data, and similarly for other structures such as attribute boundaries and JS and CSS string boundaries." Code Effect Property: "... only code specified by the template author should run as a result of injecting the template output into a page and all code specified by the template author should run as a result of the same." Least Surprise Property: "A developer (or code reviewer) familiar with HTML, CSS, and JavaScript, who knows that contextual autoescaping happens should be able to look at a {{.}} and correctly infer what sanitization happens." #### Example Code: ``` const tpl = ` <!DOCTYPE html> <html> <head> <meta charset="UTF-8"> <title>{{.Title}}</title> </head> <body> {{range .Items}}<div>{{ . 
}}</div>{{else}}<div><strong>no rows</strong></div>{{end}} </body> </html>` check := func(err error) { if err != nil { log.Fatal(err) } } t, err := template.New("webpage").Parse(tpl) check(err) data := struct { Title string Items []string }{ Title: "My page", Items: []string{ "My photos", "My blog", }, } err = t.Execute(os.Stdout, data) check(err) noItems := struct { Title string Items []string }{ Title: "My another page", Items: []string{}, } err = t.Execute(os.Stdout, noItems) check(err) ``` Output: ``` <!DOCTYPE html> <html> <head> <meta charset="UTF-8"> <title>My page</title> </head> <body> <div>My photos</div><div>My blog</div> </body> </html> <!DOCTYPE html> <html> <head> <meta charset="UTF-8"> <title>My another page</title> </head> <body> <div><strong>no rows</strong></div> </body> </html> ``` #### Example (Autoescaping) Code: ``` check := func(err error) { if err != nil { log.Fatal(err) } } t, err := template.New("foo").Parse(`{{define "T"}}Hello, {{.}}!{{end}}`) check(err) err = t.ExecuteTemplate(os.Stdout, "T", "<script>alert('you have been pwned')</script>") check(err) ``` Output: ``` Hello, &lt;script&gt;alert(&#39;you have been pwned&#39;)&lt;/script&gt;! ``` #### Example (Escape) Code: ``` const s = `"Fran & Freddie's Diner" <tasty@example.com>` v := []any{`"Fran & Freddie's Diner"`, ' ', `<tasty@example.com>`} fmt.Println(template.HTMLEscapeString(s)) template.HTMLEscape(os.Stdout, []byte(s)) fmt.Fprintln(os.Stdout, "") fmt.Println(template.HTMLEscaper(v...)) fmt.Println(template.JSEscapeString(s)) template.JSEscape(os.Stdout, []byte(s)) fmt.Fprintln(os.Stdout, "") fmt.Println(template.JSEscaper(v...)) fmt.Println(template.URLQueryEscaper(v...)) ``` Output: ``` &#34;Fran &amp; Freddie&#39;s Diner&#34; &lt;tasty@example.com&gt; &#34;Fran &amp; Freddie&#39;s Diner&#34; &lt;tasty@example.com&gt; &#34;Fran &amp; Freddie&#39;s Diner&#34;32&lt;tasty@example.com&gt; \"Fran \u0026 Freddie\'s Diner\" \u003Ctasty@example.com\u003E \"Fran \u0026 Freddie\'s Diner\" \u003Ctasty@example.com\u003E \"Fran \u0026 Freddie\'s Diner\"32\u003Ctasty@example.com\u003E %22Fran+%26+Freddie%27s+Diner%2232%3Ctasty%40example.com%3E ``` Index ----- * [func HTMLEscape(w io.Writer, b []byte)](#HTMLEscape) * [func HTMLEscapeString(s string) string](#HTMLEscapeString) * [func HTMLEscaper(args ...any) string](#HTMLEscaper) * [func IsTrue(val any) (truth, ok bool)](#IsTrue) * [func JSEscape(w io.Writer, b []byte)](#JSEscape) * [func JSEscapeString(s string) string](#JSEscapeString) * [func JSEscaper(args ...any) string](#JSEscaper) * [func URLQueryEscaper(args ...any) string](#URLQueryEscaper) * [type CSS](#CSS) * [type Error](#Error) * [func (e \*Error) Error() string](#Error.Error) * [type ErrorCode](#ErrorCode) * [type FuncMap](#FuncMap) * [type HTML](#HTML) * [type HTMLAttr](#HTMLAttr) * [type JS](#JS) * [type JSStr](#JSStr) * [type Srcset](#Srcset) * [type Template](#Template) * [func Must(t \*Template, err error) \*Template](#Must) * [func New(name string) \*Template](#New) * [func ParseFS(fs fs.FS, patterns ...string) (\*Template, error)](#ParseFS) * [func ParseFiles(filenames ...string) (\*Template, error)](#ParseFiles) * [func ParseGlob(pattern string) (\*Template, error)](#ParseGlob) * [func (t \*Template) AddParseTree(name string, tree \*parse.Tree) (\*Template, error)](#Template.AddParseTree) * [func (t \*Template) Clone() (\*Template, error)](#Template.Clone) * [func (t \*Template) DefinedTemplates() string](#Template.DefinedTemplates) * [func (t \*Template) Delims(left, right string) 
\*Template](#Template.Delims) * [func (t \*Template) Execute(wr io.Writer, data any) error](#Template.Execute) * [func (t \*Template) ExecuteTemplate(wr io.Writer, name string, data any) error](#Template.ExecuteTemplate) * [func (t \*Template) Funcs(funcMap FuncMap) \*Template](#Template.Funcs) * [func (t \*Template) Lookup(name string) \*Template](#Template.Lookup) * [func (t \*Template) Name() string](#Template.Name) * [func (t \*Template) New(name string) \*Template](#Template.New) * [func (t \*Template) Option(opt ...string) \*Template](#Template.Option) * [func (t \*Template) Parse(text string) (\*Template, error)](#Template.Parse) * [func (t \*Template) ParseFS(fs fs.FS, patterns ...string) (\*Template, error)](#Template.ParseFS) * [func (t \*Template) ParseFiles(filenames ...string) (\*Template, error)](#Template.ParseFiles) * [func (t \*Template) ParseGlob(pattern string) (\*Template, error)](#Template.ParseGlob) * [func (t \*Template) Templates() []\*Template](#Template.Templates) * [type URL](#URL) ### Examples [Package](#example_) [Template.Delims](#example_Template_Delims) [Template (Block)](#example_Template_block) [Template (Glob)](#example_Template_glob) [Template (Helpers)](#example_Template_helpers) [Template (Parsefiles)](#example_Template_parsefiles) [Template (Share)](#example_Template_share) [Package (Autoescaping)](#example__autoescaping) [Package (Escape)](#example__escape) ### Package files attr.go attr\_string.go content.go context.go css.go delim\_string.go doc.go element\_string.go error.go escape.go html.go js.go jsctx\_string.go state\_string.go template.go transition.go url.go urlpart\_string.go func HTMLEscape --------------- ``` func HTMLEscape(w io.Writer, b []byte) ``` HTMLEscape writes to w the escaped HTML equivalent of the plain text data b. func HTMLEscapeString --------------------- ``` func HTMLEscapeString(s string) string ``` HTMLEscapeString returns the escaped HTML equivalent of the plain text data s. func HTMLEscaper ---------------- ``` func HTMLEscaper(args ...any) string ``` HTMLEscaper returns the escaped HTML equivalent of the textual representation of its arguments. func IsTrue 1.6 --------------- ``` func IsTrue(val any) (truth, ok bool) ``` IsTrue reports whether the value is 'true', in the sense of not the zero of its type, and whether the value has a meaningful truth value. This is the definition of truth used by if and other such actions. func JSEscape ------------- ``` func JSEscape(w io.Writer, b []byte) ``` JSEscape writes to w the escaped JavaScript equivalent of the plain text data b. func JSEscapeString ------------------- ``` func JSEscapeString(s string) string ``` JSEscapeString returns the escaped JavaScript equivalent of the plain text data s. func JSEscaper -------------- ``` func JSEscaper(args ...any) string ``` JSEscaper returns the escaped JavaScript equivalent of the textual representation of its arguments. func URLQueryEscaper -------------------- ``` func URLQueryEscaper(args ...any) string ``` URLQueryEscaper returns the escaped value of the textual representation of its arguments in a form suitable for embedding in a URL query. type CSS -------- CSS encapsulates known safe content that matches any of: 1. The CSS3 stylesheet production, such as `p { color: purple }`. 2. The CSS3 rule production, such as `a[href=~"https:"].foo#bar`. 3. CSS3 declaration productions, such as `color: red; margin: 2px`. 4. The CSS3 value production, such as `rgba(0, 0, 255, 127)`. 
See <https://www.w3.org/TR/css3-syntax/#parsing> and <https://web.archive.org/web/20090211114933/http://w3.org/TR/css3-syntax#style> Use of this type presents a security risk: the encapsulated content should come from a trusted source, as it will be included verbatim in the template output. ``` type CSS string ``` type Error ---------- Error describes a problem encountered during template Escaping. ``` type Error struct { // ErrorCode describes the kind of error. ErrorCode ErrorCode // Node is the node that caused the problem, if known. // If not nil, it overrides Name and Line. Node parse.Node // Go 1.4 // Name is the name of the template in which the error was encountered. Name string // Line is the line number of the error in the template source or 0. Line int // Description is a human-readable description of the problem. Description string } ``` ### func (\*Error) Error ``` func (e *Error) Error() string ``` type ErrorCode -------------- ErrorCode is a code for a kind of error. ``` type ErrorCode int ``` We define codes for each error that manifests while escaping templates, but escaped templates may also fail at runtime. Output: "ZgotmplZ" Example: ``` <img src="{{.X}}"> where {{.X}} evaluates to `javascript:...` ``` Discussion: ``` "ZgotmplZ" is a special value that indicates that unsafe content reached a CSS or URL context at runtime. The output of the example will be <img src="#ZgotmplZ"> If the data comes from a trusted source, use content types to exempt it from filtering: URL(`javascript:...`). ``` ``` const ( // OK indicates the lack of an error. OK ErrorCode = iota // ErrAmbigContext: "... appears in an ambiguous context within a URL" // Example: // <a href=" // {{if .C}} // /path/ // {{else}} // /search?q= // {{end}} // {{.X}} // "> // Discussion: // {{.X}} is in an ambiguous URL context since, depending on {{.C}}, // it may be either a URL suffix or a query parameter. // Moving {{.X}} into the condition removes the ambiguity: // <a href="{{if .C}}/path/{{.X}}{{else}}/search?q={{.X}}"> ErrAmbigContext // ErrBadHTML: "expected space, attr name, or end of tag, but got ...", // "... in unquoted attr", "... in attribute name" // Example: // <a href = /search?q=foo> // <href=foo> // <form na<e=...> // <option selected< // Discussion: // This is often due to a typo in an HTML element, but some runes // are banned in tag names, attribute names, and unquoted attribute // values because they can tickle parser ambiguities. // Quoting all attributes is the best policy. ErrBadHTML // ErrBranchEnd: "{{if}} branches end in different contexts" // Example: // {{if .C}}<a href="{{end}}{{.X}} // Discussion: // Package html/template statically examines each path through an // {{if}}, {{range}}, or {{with}} to escape any following pipelines. // The example is ambiguous since {{.X}} might be an HTML text node, // or a URL prefix in an HTML attribute. The context of {{.X}} is // used to figure out how to escape it, but that context depends on // the run-time value of {{.C}} which is not statically known. // // The problem is usually something like missing quotes or angle // brackets, or can be avoided by refactoring to put the two contexts // into different branches of an if, range or with. If the problem // is in a {{range}} over a collection that should never be empty, // adding a dummy {{else}} can help. ErrBranchEnd // ErrEndContext: "... ends in a non-text context: ..." 
// Examples: // <div // <div title="no close quote> // <script>f() // Discussion: // Executed templates should produce a DocumentFragment of HTML. // Templates that end without closing tags will trigger this error. // Templates that should not be used in an HTML context or that // produce incomplete Fragments should not be executed directly. // // {{define "main"}} <script>{{template "helper"}}</script> {{end}} // {{define "helper"}} document.write(' <div title=" ') {{end}} // // "helper" does not produce a valid document fragment, so should // not be Executed directly. ErrEndContext // ErrNoSuchTemplate: "no such template ..." // Examples: // {{define "main"}}<div {{template "attrs"}}>{{end}} // {{define "attrs"}}href="{{.URL}}"{{end}} // Discussion: // Package html/template looks through template calls to compute the // context. // Here the {{.URL}} in "attrs" must be treated as a URL when called // from "main", but you will get this error if "attrs" is not defined // when "main" is parsed. ErrNoSuchTemplate // ErrOutputContext: "cannot compute output context for template ..." // Examples: // {{define "t"}}{{if .T}}{{template "t" .T}}{{end}}{{.H}}",{{end}} // Discussion: // A recursive template does not end in the same context in which it // starts, and a reliable output context cannot be computed. // Look for typos in the named template. // If the template should not be called in the named start context, // look for calls to that template in unexpected contexts. // Maybe refactor recursive templates to not be recursive. ErrOutputContext // ErrPartialCharset: "unfinished JS regexp charset in ..." // Example: // <script>var pattern = /foo[{{.Chars}}]/</script> // Discussion: // Package html/template does not support interpolation into regular // expression literal character sets. ErrPartialCharset // ErrPartialEscape: "unfinished escape sequence in ..." // Example: // <script>alert("\{{.X}}")</script> // Discussion: // Package html/template does not support actions following a // backslash. // This is usually an error and there are better solutions; for // example // <script>alert("{{.X}}")</script> // should work, and if {{.X}} is a partial escape sequence such as // "xA0", mark the whole sequence as safe content: JSStr(`\xA0`) ErrPartialEscape // ErrRangeLoopReentry: "on range loop re-entry: ..." // Example: // <script>var x = [{{range .}}'{{.}},{{end}}]</script> // Discussion: // If an iteration through a range would cause it to end in a // different context than an earlier pass, there is no single context. // In the example, there is missing a quote, so it is not clear // whether {{.}} is meant to be inside a JS string or in a JS value // context. The second iteration would produce something like // // <script>var x = ['firstValue,'secondValue]</script> ErrRangeLoopReentry // ErrSlashAmbig: '/' could start a division or regexp. // Example: // <script> // {{if .C}}var x = 1{{end}} // /-{{.N}}/i.test(x) ? doThis : doThat(); // </script> // Discussion: // The example above could produce `var x = 1/-2/i.test(s)...` // in which the first '/' is a mathematical division operator or it // could produce `/-2/i.test(s)` in which the first '/' starts a // regexp literal. // Look for missing semicolons inside branches, and maybe add // parentheses to make it clear which interpretation you intend. ErrSlashAmbig // ErrPredefinedEscaper: "predefined escaper ... disallowed in template" // Example: // <div class={{. 
| html}}>Hello<div> // Discussion: // Package html/template already contextually escapes all pipelines to // produce HTML output safe against code injection. Manually escaping // pipeline output using the predefined escapers "html" or "urlquery" is // unnecessary, and may affect the correctness or safety of the escaped // pipeline output in Go 1.8 and earlier. // // In most cases, such as the given example, this error can be resolved by // simply removing the predefined escaper from the pipeline and letting the // contextual autoescaper handle the escaping of the pipeline. In other // instances, where the predefined escaper occurs in the middle of a // pipeline where subsequent commands expect escaped input, e.g. // {{.X | html | makeALink}} // where makeALink does // return `<a href="`+input+`">link</a>` // consider refactoring the surrounding template to make use of the // contextual autoescaper, i.e. // <a href="{{.X}}">link</a> // // To ease migration to Go 1.9 and beyond, "html" and "urlquery" will // continue to be allowed as the last command in a pipeline. However, if the // pipeline occurs in an unquoted attribute value context, "html" is // disallowed. Avoid using "html" and "urlquery" entirely in new templates. ErrPredefinedEscaper ) ``` type FuncMap ------------ ``` type FuncMap = template.FuncMap ``` type HTML --------- HTML encapsulates a known safe HTML document fragment. It should not be used for HTML from a third-party, or HTML with unclosed tags or comments. The outputs of a sound HTML sanitizer and a template escaped by this package are fine for use with HTML. Use of this type presents a security risk: the encapsulated content should come from a trusted source, as it will be included verbatim in the template output. ``` type HTML string ``` type HTMLAttr ------------- HTMLAttr encapsulates an HTML attribute from a trusted source, for example, ` dir="ltr"`. Use of this type presents a security risk: the encapsulated content should come from a trusted source, as it will be included verbatim in the template output. ``` type HTMLAttr string ``` type JS ------- JS encapsulates a known safe EcmaScript5 Expression, for example, `(x + y \* z())`. Template authors are responsible for ensuring that typed expressions do not break the intended precedence and that there is no statement/expression ambiguity as when passing an expression like "{ foo: bar() }\n['foo']()", which is both a valid Expression and a valid Program with a very different meaning. Use of this type presents a security risk: the encapsulated content should come from a trusted source, as it will be included verbatim in the template output. Using JS to include valid but untrusted JSON is not safe. A safe alternative is to parse the JSON with json.Unmarshal and then pass the resultant object into the template, where it will be converted to sanitized JSON when presented in a JavaScript context. ``` type JS string ``` type JSStr ---------- JSStr encapsulates a sequence of characters meant to be embedded between quotes in a JavaScript expression. The string must match a series of StringCharacters: ``` StringCharacter :: SourceCharacter but not `\` or LineTerminator | EscapeSequence ``` Note that LineContinuations are not allowed. JSStr("foo\\nbar") is fine, but JSStr("foo\\\nbar") is not. Use of this type presents a security risk: the encapsulated content should come from a trusted source, as it will be included verbatim in the template output. 
``` type JSStr string ``` type Srcset 1.10 ---------------- Srcset encapsulates a known safe srcset attribute (see <https://w3c.github.io/html/semantics-embedded-content.html#element-attrdef-img-srcset>). Use of this type presents a security risk: the encapsulated content should come from a trusted source, as it will be included verbatim in the template output. ``` type Srcset string ``` type Template ------------- Template is a specialized Template from "text/template" that produces a safe HTML document fragment. ``` type Template struct { // The underlying template's parse tree, updated to be HTML-safe. Tree *parse.Tree // Go 1.2 // contains filtered or unexported fields } ``` #### Example (Block) Code: ``` const ( master = `Names:{{block "list" .}}{{"\n"}}{{range .}}{{println "-" .}}{{end}}{{end}}` overlay = `{{define "list"}} {{join . ", "}}{{end}} ` ) var ( funcs = template.FuncMap{"join": strings.Join} guardians = []string{"Gamora", "Groot", "Nebula", "Rocket", "Star-Lord"} ) masterTmpl, err := template.New("master").Funcs(funcs).Parse(master) if err != nil { log.Fatal(err) } overlayTmpl, err := template.Must(masterTmpl.Clone()).Parse(overlay) if err != nil { log.Fatal(err) } if err := masterTmpl.Execute(os.Stdout, guardians); err != nil { log.Fatal(err) } if err := overlayTmpl.Execute(os.Stdout, guardians); err != nil { log.Fatal(err) } ``` Output: ``` Names: - Gamora - Groot - Nebula - Rocket - Star-Lord Names: Gamora, Groot, Nebula, Rocket, Star-Lord ``` #### Example (Glob) Here we demonstrate loading a set of templates from a directory. Code: ``` // Here we create a temporary directory and populate it with our sample // template definition files; usually the template files would already // exist in some location known to the program. dir := createTestDir([]templateFile{ // T0.tmpl is a plain template file that just invokes T1. {"T0.tmpl", `T0 invokes T1: ({{template "T1"}})`}, // T1.tmpl defines a template, T1 that invokes T2. {"T1.tmpl", `{{define "T1"}}T1 invokes T2: ({{template "T2"}}){{end}}`}, // T2.tmpl defines a template T2. {"T2.tmpl", `{{define "T2"}}This is T2{{end}}`}, }) // Clean up after the test; another quirk of running as an example. defer os.RemoveAll(dir) // pattern is the glob pattern used to find all the template files. pattern := filepath.Join(dir, "*.tmpl") // Here starts the example proper. // T0.tmpl is the first name matched, so it becomes the starting template, // the value returned by ParseGlob. tmpl := template.Must(template.ParseGlob(pattern)) err := tmpl.Execute(os.Stdout, nil) if err != nil { log.Fatalf("template execution: %s", err) } ``` Output: ``` T0 invokes T1: (T1 invokes T2: (This is T2)) ``` #### Example (Helpers) This example demonstrates one way to share some templates and use them in different contexts. In this variant we add multiple driver templates by hand to an existing bundle of templates. Code: ``` // Here we create a temporary directory and populate it with our sample // template definition files; usually the template files would already // exist in some location known to the program. dir := createTestDir([]templateFile{ // T1.tmpl defines a template, T1 that invokes T2. {"T1.tmpl", `{{define "T1"}}T1 invokes T2: ({{template "T2"}}){{end}}`}, // T2.tmpl defines a template T2. {"T2.tmpl", `{{define "T2"}}This is T2{{end}}`}, }) // Clean up after the test; another quirk of running as an example. defer os.RemoveAll(dir) // pattern is the glob pattern used to find all the template files. 
pattern := filepath.Join(dir, "*.tmpl") // Here starts the example proper. // Load the helpers. templates := template.Must(template.ParseGlob(pattern)) // Add one driver template to the bunch; we do this with an explicit template definition. _, err := templates.Parse("{{define `driver1`}}Driver 1 calls T1: ({{template `T1`}})\n{{end}}") if err != nil { log.Fatal("parsing driver1: ", err) } // Add another driver template. _, err = templates.Parse("{{define `driver2`}}Driver 2 calls T2: ({{template `T2`}})\n{{end}}") if err != nil { log.Fatal("parsing driver2: ", err) } // We load all the templates before execution. This package does not require // that behavior but html/template's escaping does, so it's a good habit. err = templates.ExecuteTemplate(os.Stdout, "driver1", nil) if err != nil { log.Fatalf("driver1 execution: %s", err) } err = templates.ExecuteTemplate(os.Stdout, "driver2", nil) if err != nil { log.Fatalf("driver2 execution: %s", err) } ``` Output: ``` Driver 1 calls T1: (T1 invokes T2: (This is T2)) Driver 2 calls T2: (This is T2) ``` #### Example (Parsefiles) Here we demonstrate loading a set of templates from files in different directories Code: ``` // Here we create different temporary directories and populate them with our sample // template definition files; usually the template files would already // exist in some location known to the program. dir1 := createTestDir([]templateFile{ // T1.tmpl is a plain template file that just invokes T2. {"T1.tmpl", `T1 invokes T2: ({{template "T2"}})`}, }) dir2 := createTestDir([]templateFile{ // T2.tmpl defines a template T2. {"T2.tmpl", `{{define "T2"}}This is T2{{end}}`}, }) // Clean up after the test; another quirk of running as an example. defer func(dirs ...string) { for _, dir := range dirs { os.RemoveAll(dir) } }(dir1, dir2) // Here starts the example proper. // Let's just parse only dir1/T0 and dir2/T2 paths := []string{ filepath.Join(dir1, "T1.tmpl"), filepath.Join(dir2, "T2.tmpl"), } tmpl := template.Must(template.ParseFiles(paths...)) err := tmpl.Execute(os.Stdout, nil) if err != nil { log.Fatalf("template execution: %s", err) } ``` Output: ``` T1 invokes T2: (This is T2) ``` #### Example (Share) This example demonstrates how to use one group of driver templates with distinct sets of helper templates. Code: ``` // Here we create a temporary directory and populate it with our sample // template definition files; usually the template files would already // exist in some location known to the program. dir := createTestDir([]templateFile{ // T0.tmpl is a plain template file that just invokes T1. {"T0.tmpl", "T0 ({{.}} version) invokes T1: ({{template `T1`}})\n"}, // T1.tmpl defines a template, T1 that invokes T2. Note T2 is not defined {"T1.tmpl", `{{define "T1"}}T1 invokes T2: ({{template "T2"}}){{end}}`}, }) // Clean up after the test; another quirk of running as an example. defer os.RemoveAll(dir) // pattern is the glob pattern used to find all the template files. pattern := filepath.Join(dir, "*.tmpl") // Here starts the example proper. // Load the drivers. drivers := template.Must(template.ParseGlob(pattern)) // We must define an implementation of the T2 template. First we clone // the drivers, then add a definition of T2 to the template name space. // 1. Clone the helper set to create a new name space from which to run them. first, err := drivers.Clone() if err != nil { log.Fatal("cloning helpers: ", err) } // 2. Define T2, version A, and parse it. 
_, err = first.Parse("{{define `T2`}}T2, version A{{end}}") if err != nil { log.Fatal("parsing T2: ", err) } // Now repeat the whole thing, using a different version of T2. // 1. Clone the drivers. second, err := drivers.Clone() if err != nil { log.Fatal("cloning drivers: ", err) } // 2. Define T2, version B, and parse it. _, err = second.Parse("{{define `T2`}}T2, version B{{end}}") if err != nil { log.Fatal("parsing T2: ", err) } // Execute the templates in the reverse order to verify the // first is unaffected by the second. err = second.ExecuteTemplate(os.Stdout, "T0.tmpl", "second") if err != nil { log.Fatalf("second execution: %s", err) } err = first.ExecuteTemplate(os.Stdout, "T0.tmpl", "first") if err != nil { log.Fatalf("first: execution: %s", err) } ``` Output: ``` T0 (second version) invokes T1: (T1 invokes T2: (T2, version B)) T0 (first version) invokes T1: (T1 invokes T2: (T2, version A)) ``` ### func Must ``` func Must(t *Template, err error) *Template ``` Must is a helper that wraps a call to a function returning (\*Template, error) and panics if the error is non-nil. It is intended for use in variable initializations such as ``` var t = template.Must(template.New("name").Parse("html")) ``` ### func New ``` func New(name string) *Template ``` New allocates a new HTML template with the given name. ### func ParseFS 1.16 ``` func ParseFS(fs fs.FS, patterns ...string) (*Template, error) ``` ParseFS is like ParseFiles or ParseGlob but reads from the file system fs instead of the host operating system's file system. It accepts a list of glob patterns. (Note that most file names serve as glob patterns matching only themselves.) ### func ParseFiles ``` func ParseFiles(filenames ...string) (*Template, error) ``` ParseFiles creates a new Template and parses the template definitions from the named files. The returned template's name will have the (base) name and (parsed) contents of the first file. There must be at least one file. If an error occurs, parsing stops and the returned \*Template is nil. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. For instance, ParseFiles("a/foo", "b/foo") stores "b/foo" as the template named "foo", while "a/foo" is unavailable. ### func ParseGlob ``` func ParseGlob(pattern string) (*Template, error) ``` ParseGlob creates a new Template and parses the template definitions from the files identified by the pattern. The files are matched according to the semantics of filepath.Match, and the pattern must match at least one file. The returned template will have the (base) name and (parsed) contents of the first file matched by the pattern. ParseGlob is equivalent to calling ParseFiles with the list of files matched by the pattern. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. ### func (\*Template) AddParseTree ``` func (t *Template) AddParseTree(name string, tree *parse.Tree) (*Template, error) ``` AddParseTree creates a new template with the name and parse tree and associates it with t. It returns an error if t or any associated template has already been executed. ### func (\*Template) Clone ``` func (t *Template) Clone() (*Template, error) ``` Clone returns a duplicate of the template, including all associated templates. The actual representation is not copied, but the name space of associated templates is, so further calls to Parse in the copy will add templates to the copy but not to the original. 
Clone can be used to prepare common templates and use them with variant definitions for other templates by adding the variants after the clone is made. It returns an error if t has already been executed. ### func (\*Template) DefinedTemplates 1.6 ``` func (t *Template) DefinedTemplates() string ``` DefinedTemplates returns a string listing the defined templates, prefixed by the string "; defined templates are: ". If there are none, it returns the empty string. Used to generate an error message. ### func (\*Template) Delims ``` func (t *Template) Delims(left, right string) *Template ``` Delims sets the action delimiters to the specified strings, to be used in subsequent calls to Parse, ParseFiles, or ParseGlob. Nested template definitions will inherit the settings. An empty delimiter stands for the corresponding default: {{ or }}. The return value is the template, so calls can be chained. #### Example Code: ``` const text = "<<.Greeting>> {{.Name}}" data := struct { Greeting string Name string }{ Greeting: "Hello", Name: "Joe", } t := template.Must(template.New("tpl").Delims("<<", ">>").Parse(text)) err := t.Execute(os.Stdout, data) if err != nil { log.Fatal(err) } ``` Output: ``` Hello {{.Name}} ``` ### func (\*Template) Execute ``` func (t *Template) Execute(wr io.Writer, data any) error ``` Execute applies a parsed template to the specified data object, writing the output to wr. If an error occurs executing the template or writing its output, execution stops, but partial results may already have been written to the output writer. A template may be executed safely in parallel, although if parallel executions share a Writer the output may be interleaved. ### func (\*Template) ExecuteTemplate ``` func (t *Template) ExecuteTemplate(wr io.Writer, name string, data any) error ``` ExecuteTemplate applies the template associated with t that has the given name to the specified data object and writes the output to wr. If an error occurs executing the template or writing its output, execution stops, but partial results may already have been written to the output writer. A template may be executed safely in parallel, although if parallel executions share a Writer the output may be interleaved. ### func (\*Template) Funcs ``` func (t *Template) Funcs(funcMap FuncMap) *Template ``` Funcs adds the elements of the argument map to the template's function map. It must be called before the template is parsed. It panics if a value in the map is not a function with appropriate return type. However, it is legal to overwrite elements of the map. The return value is the template, so calls can be chained. ### func (\*Template) Lookup ``` func (t *Template) Lookup(name string) *Template ``` Lookup returns the template with the given name that is associated with t, or nil if there is no such template. ### func (\*Template) Name ``` func (t *Template) Name() string ``` Name returns the name of the template. ### func (\*Template) New ``` func (t *Template) New(name string) *Template ``` New allocates a new HTML template associated with the given one and with the same delimiters. The association, which is transitive, allows one template to invoke another with a {{template}} action. If a template with the given name already exists, the new HTML template will replace it. The existing template will be reset and disassociated with t. ### func (\*Template) Option 1.5 ``` func (t *Template) Option(opt ...string) *Template ``` Option sets options for the template. 
Options are described by strings, either a simple string or "key=value". There can be at most one equals sign in an option string. If the option string is unrecognized or otherwise invalid, Option panics. Known options: missingkey: Control the behavior during execution if a map is indexed with a key that is not present in the map. ``` "missingkey=default" or "missingkey=invalid" The default behavior: Do nothing and continue execution. If printed, the result of the index operation is the string "<no value>". "missingkey=zero" The operation returns the zero value for the map type's element. "missingkey=error" Execution stops immediately with an error. ``` ### func (\*Template) Parse ``` func (t *Template) Parse(text string) (*Template, error) ``` Parse parses text as a template body for t. Named template definitions ({{define ...}} or {{block ...}} statements) in text define additional templates associated with t and are removed from the definition of t itself. Templates can be redefined in successive calls to Parse, before the first use of Execute on t or any associated template. A template definition with a body containing only white space and comments is considered empty and will not replace an existing template's body. This allows using Parse to add new named template definitions without overwriting the main template body. ### func (\*Template) ParseFS 1.16 ``` func (t *Template) ParseFS(fs fs.FS, patterns ...string) (*Template, error) ``` ParseFS is like ParseFiles or ParseGlob but reads from the file system fs instead of the host operating system's file system. It accepts a list of glob patterns. (Note that most file names serve as glob patterns matching only themselves.) ### func (\*Template) ParseFiles ``` func (t *Template) ParseFiles(filenames ...string) (*Template, error) ``` ParseFiles parses the named files and associates the resulting templates with t. If an error occurs, parsing stops and the returned template is nil; otherwise it is t. There must be at least one file. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. ParseFiles returns an error if t or any associated template has already been executed. ### func (\*Template) ParseGlob ``` func (t *Template) ParseGlob(pattern string) (*Template, error) ``` ParseGlob parses the template definitions in the files identified by the pattern and associates the resulting templates with t. The files are matched according to the semantics of filepath.Match, and the pattern must match at least one file. ParseGlob is equivalent to calling t.ParseFiles with the list of files matched by the pattern. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. ParseGlob returns an error if t or any associated template has already been executed. ### func (\*Template) Templates ``` func (t *Template) Templates() []*Template ``` Templates returns a slice of the templates associated with t, including t itself. type URL -------- URL encapsulates a known safe URL or URL substring (see RFC 3986). A URL like `javascript:checkThatFormNotEditedBeforeLeavingPage()` from a trusted source should go in the page, but by default dynamic `javascript:` URLs are filtered out since they are a frequently exploited injection vector. Use of this type presents a security risk: the encapsulated content should come from a trusted source, as it will be included verbatim in the template output. ``` type URL string ```
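These typed strings (URL, HTML, and the others above) interact directly with the contextual autoescaper. A minimal sketch, assuming the escaper's documented behavior of replacing filtered URLs in an href attribute with the "#ZgotmplZ" placeholder:

```
package main

import (
	"html/template"
	"log"
	"os"
)

func main() {
	t := template.Must(template.New("link").Parse(`<a href="{{.}}">link</a>` + "\n"))

	// An ordinary string is contextually escaped: a javascript: URL from an
	// untrusted source does not pass the URL filter and is expected to be
	// replaced with the "#ZgotmplZ" placeholder.
	if err := t.Execute(os.Stdout, "javascript:alert(1)"); err != nil {
		log.Fatal(err)
	}

	// A template.URL asserts that the value comes from a trusted source, so
	// the filter lets it through and it appears in the href attribute.
	if err := t.Execute(os.Stdout, template.URL("javascript:checkThatFormNotEditedBeforeLeavingPage()")); err != nil {
		log.Fatal(err)
	}
}
```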
go Package math Package math ============= * `import "math"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package math provides basic constants and mathematical functions. This package does not guarantee bit-identical results across architectures. Index ----- * [Constants](#pkg-constants) * [func Abs(x float64) float64](#Abs) * [func Acos(x float64) float64](#Acos) * [func Acosh(x float64) float64](#Acosh) * [func Asin(x float64) float64](#Asin) * [func Asinh(x float64) float64](#Asinh) * [func Atan(x float64) float64](#Atan) * [func Atan2(y, x float64) float64](#Atan2) * [func Atanh(x float64) float64](#Atanh) * [func Cbrt(x float64) float64](#Cbrt) * [func Ceil(x float64) float64](#Ceil) * [func Copysign(f, sign float64) float64](#Copysign) * [func Cos(x float64) float64](#Cos) * [func Cosh(x float64) float64](#Cosh) * [func Dim(x, y float64) float64](#Dim) * [func Erf(x float64) float64](#Erf) * [func Erfc(x float64) float64](#Erfc) * [func Erfcinv(x float64) float64](#Erfcinv) * [func Erfinv(x float64) float64](#Erfinv) * [func Exp(x float64) float64](#Exp) * [func Exp2(x float64) float64](#Exp2) * [func Expm1(x float64) float64](#Expm1) * [func FMA(x, y, z float64) float64](#FMA) * [func Float32bits(f float32) uint32](#Float32bits) * [func Float32frombits(b uint32) float32](#Float32frombits) * [func Float64bits(f float64) uint64](#Float64bits) * [func Float64frombits(b uint64) float64](#Float64frombits) * [func Floor(x float64) float64](#Floor) * [func Frexp(f float64) (frac float64, exp int)](#Frexp) * [func Gamma(x float64) float64](#Gamma) * [func Hypot(p, q float64) float64](#Hypot) * [func Ilogb(x float64) int](#Ilogb) * [func Inf(sign int) float64](#Inf) * [func IsInf(f float64, sign int) bool](#IsInf) * [func IsNaN(f float64) (is bool)](#IsNaN) * [func J0(x float64) float64](#J0) * [func J1(x float64) float64](#J1) * [func Jn(n int, x float64) float64](#Jn) * [func Ldexp(frac float64, exp int) float64](#Ldexp) * [func Lgamma(x float64) (lgamma float64, sign int)](#Lgamma) * [func Log(x float64) float64](#Log) * [func Log10(x float64) float64](#Log10) * [func Log1p(x float64) float64](#Log1p) * [func Log2(x float64) float64](#Log2) * [func Logb(x float64) float64](#Logb) * [func Max(x, y float64) float64](#Max) * [func Min(x, y float64) float64](#Min) * [func Mod(x, y float64) float64](#Mod) * [func Modf(f float64) (int float64, frac float64)](#Modf) * [func NaN() float64](#NaN) * [func Nextafter(x, y float64) (r float64)](#Nextafter) * [func Nextafter32(x, y float32) (r float32)](#Nextafter32) * [func Pow(x, y float64) float64](#Pow) * [func Pow10(n int) float64](#Pow10) * [func Remainder(x, y float64) float64](#Remainder) * [func Round(x float64) float64](#Round) * [func RoundToEven(x float64) float64](#RoundToEven) * [func Signbit(x float64) bool](#Signbit) * [func Sin(x float64) float64](#Sin) * [func Sincos(x float64) (sin, cos float64)](#Sincos) * [func Sinh(x float64) float64](#Sinh) * [func Sqrt(x float64) float64](#Sqrt) * [func Tan(x float64) float64](#Tan) * [func Tanh(x float64) float64](#Tanh) * [func Trunc(x float64) float64](#Trunc) * [func Y0(x float64) float64](#Y0) * [func Y1(x float64) float64](#Y1) * [func Yn(n int, x float64) float64](#Yn) ### Examples [Abs](#example_Abs) [Acos](#example_Acos) [Acosh](#example_Acosh) [Asin](#example_Asin) [Asinh](#example_Asinh) [Atan](#example_Atan) [Atan2](#example_Atan2) [Atanh](#example_Atanh) [Cbrt](#example_Cbrt) 
[Ceil](#example_Ceil) [Copysign](#example_Copysign) [Cos](#example_Cos) [Cosh](#example_Cosh) [Dim](#example_Dim) [Exp](#example_Exp) [Exp2](#example_Exp2) [Expm1](#example_Expm1) [Floor](#example_Floor) [Log](#example_Log) [Log10](#example_Log10) [Log2](#example_Log2) [Mod](#example_Mod) [Modf](#example_Modf) [Pow](#example_Pow) [Pow10](#example_Pow10) [Remainder](#example_Remainder) [Round](#example_Round) [RoundToEven](#example_RoundToEven) [Sin](#example_Sin) [Sincos](#example_Sincos) [Sinh](#example_Sinh) [Sqrt](#example_Sqrt) [Tan](#example_Tan) [Tanh](#example_Tanh) [Trunc](#example_Trunc) ### Package files abs.go acosh.go asin.go asinh.go atan.go atan2.go atanh.go bits.go cbrt.go const.go copysign.go dim.go dim\_asm.go erf.go erfinv.go exp.go exp2\_noasm.go exp\_amd64.go exp\_asm.go expm1.go floor.go floor\_asm.go fma.go frexp.go gamma.go hypot.go hypot\_asm.go j0.go j1.go jn.go ldexp.go lgamma.go log.go log10.go log1p.go log\_asm.go logb.go mod.go modf.go modf\_noasm.go nextafter.go pow.go pow10.go remainder.go signbit.go sin.go sincos.go sinh.go sqrt.go stubs.go tan.go tanh.go trig\_reduce.go unsafe.go Constants --------- Mathematical constants. ``` const ( E = 2.71828182845904523536028747135266249775724709369995957496696763 // https://oeis.org/A001113 Pi = 3.14159265358979323846264338327950288419716939937510582097494459 // https://oeis.org/A000796 Phi = 1.61803398874989484820458683436563811772030917980576286213544862 // https://oeis.org/A001622 Sqrt2 = 1.41421356237309504880168872420969807856967187537694807317667974 // https://oeis.org/A002193 SqrtE = 1.64872127070012814684865078781416357165377610071014801157507931 // https://oeis.org/A019774 SqrtPi = 1.77245385090551602729816748334114518279754945612238712821380779 // https://oeis.org/A002161 SqrtPhi = 1.27201964951406896425242246173749149171560804184009624861664038 // https://oeis.org/A139339 Ln2 = 0.693147180559945309417232121458176568075500134360255254120680009 // https://oeis.org/A002162 Log2E = 1 / Ln2 Ln10 = 2.30258509299404568401799145468436420760110148862877297603332790 // https://oeis.org/A002392 Log10E = 1 / Ln10 ) ``` Floating-point limit values. Max is the largest finite value representable by the type. SmallestNonzero is the smallest positive, non-zero value representable by the type. ``` const ( MaxFloat32 = 0x1p127 * (1 + (1 - 0x1p-23)) // 3.40282346638528859811704183484516925440e+38 SmallestNonzeroFloat32 = 0x1p-126 * 0x1p-23 // 1.401298464324817070923729583289916131280e-45 MaxFloat64 = 0x1p1023 * (1 + (1 - 0x1p-52)) // 1.79769313486231570814527423731704356798070e+308 SmallestNonzeroFloat64 = 0x1p-1022 * 0x1p-52 // 4.9406564584124654417656879286822137236505980e-324 ) ``` Integer limit values. ``` const ( MaxInt = 1<<(intSize-1) - 1 // MaxInt32 or MaxInt64 depending on intSize. MinInt = -1 << (intSize - 1) // MinInt32 or MinInt64 depending on intSize. MaxInt8 = 1<<7 - 1 // 127 MinInt8 = -1 << 7 // -128 MaxInt16 = 1<<15 - 1 // 32767 MinInt16 = -1 << 15 // -32768 MaxInt32 = 1<<31 - 1 // 2147483647 MinInt32 = -1 << 31 // -2147483648 MaxInt64 = 1<<63 - 1 // 9223372036854775807 MinInt64 = -1 << 63 // -9223372036854775808 MaxUint = 1<<intSize - 1 // MaxUint32 or MaxUint64 depending on intSize. MaxUint8 = 1<<8 - 1 // 255 MaxUint16 = 1<<16 - 1 // 65535 MaxUint32 = 1<<32 - 1 // 4294967295 MaxUint64 = 1<<64 - 1 // 18446744073709551615 ) ``` func Abs -------- ``` func Abs(x float64) float64 ``` Abs returns the absolute value of x. 
Special cases are: ``` Abs(±Inf) = +Inf Abs(NaN) = NaN ``` #### Example Code: ``` x := math.Abs(-2) fmt.Printf("%.1f\n", x) y := math.Abs(2) fmt.Printf("%.1f\n", y) ``` Output: ``` 2.0 2.0 ``` func Acos --------- ``` func Acos(x float64) float64 ``` Acos returns the arccosine, in radians, of x. Special case is: ``` Acos(x) = NaN if x < -1 or x > 1 ``` #### Example Code: ``` fmt.Printf("%.2f", math.Acos(1)) ``` Output: ``` 0.00 ``` func Acosh ---------- ``` func Acosh(x float64) float64 ``` Acosh returns the inverse hyperbolic cosine of x. Special cases are: ``` Acosh(+Inf) = +Inf Acosh(x) = NaN if x < 1 Acosh(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Acosh(1)) ``` Output: ``` 0.00 ``` func Asin --------- ``` func Asin(x float64) float64 ``` Asin returns the arcsine, in radians, of x. Special cases are: ``` Asin(±0) = ±0 Asin(x) = NaN if x < -1 or x > 1 ``` #### Example Code: ``` fmt.Printf("%.2f", math.Asin(0)) ``` Output: ``` 0.00 ``` func Asinh ---------- ``` func Asinh(x float64) float64 ``` Asinh returns the inverse hyperbolic sine of x. Special cases are: ``` Asinh(±0) = ±0 Asinh(±Inf) = ±Inf Asinh(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Asinh(0)) ``` Output: ``` 0.00 ``` func Atan --------- ``` func Atan(x float64) float64 ``` Atan returns the arctangent, in radians, of x. Special cases are: ``` Atan(±0) = ±0 Atan(±Inf) = ±Pi/2 ``` #### Example Code: ``` fmt.Printf("%.2f", math.Atan(0)) ``` Output: ``` 0.00 ``` func Atan2 ---------- ``` func Atan2(y, x float64) float64 ``` Atan2 returns the arc tangent of y/x, using the signs of the two to determine the quadrant of the return value. Special cases are (in order): ``` Atan2(y, NaN) = NaN Atan2(NaN, x) = NaN Atan2(+0, x>=0) = +0 Atan2(-0, x>=0) = -0 Atan2(+0, x<=-0) = +Pi Atan2(-0, x<=-0) = -Pi Atan2(y>0, 0) = +Pi/2 Atan2(y<0, 0) = -Pi/2 Atan2(+Inf, +Inf) = +Pi/4 Atan2(-Inf, +Inf) = -Pi/4 Atan2(+Inf, -Inf) = 3Pi/4 Atan2(-Inf, -Inf) = -3Pi/4 Atan2(y, +Inf) = 0 Atan2(y>0, -Inf) = +Pi Atan2(y<0, -Inf) = -Pi Atan2(+Inf, x) = +Pi/2 Atan2(-Inf, x) = -Pi/2 ``` #### Example Code: ``` fmt.Printf("%.2f", math.Atan2(0, 0)) ``` Output: ``` 0.00 ``` func Atanh ---------- ``` func Atanh(x float64) float64 ``` Atanh returns the inverse hyperbolic tangent of x. Special cases are: ``` Atanh(1) = +Inf Atanh(±0) = ±0 Atanh(-1) = -Inf Atanh(x) = NaN if x < -1 or x > 1 Atanh(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Atanh(0)) ``` Output: ``` 0.00 ``` func Cbrt --------- ``` func Cbrt(x float64) float64 ``` Cbrt returns the cube root of x. Special cases are: ``` Cbrt(±0) = ±0 Cbrt(±Inf) = ±Inf Cbrt(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f\n", math.Cbrt(8)) fmt.Printf("%.2f\n", math.Cbrt(27)) ``` Output: ``` 2.00 3.00 ``` func Ceil --------- ``` func Ceil(x float64) float64 ``` Ceil returns the least integer value greater than or equal to x. Special cases are: ``` Ceil(±0) = ±0 Ceil(±Inf) = ±Inf Ceil(NaN) = NaN ``` #### Example Code: ``` c := math.Ceil(1.49) fmt.Printf("%.1f", c) ``` Output: ``` 2.0 ``` func Copysign ------------- ``` func Copysign(f, sign float64) float64 ``` Copysign returns a value with the magnitude of f and the sign of sign. #### Example Code: ``` fmt.Printf("%.2f", math.Copysign(3.2, -1)) ``` Output: ``` -3.20 ``` func Cos -------- ``` func Cos(x float64) float64 ``` Cos returns the cosine of the radian argument x. 
Special cases are: ``` Cos(±Inf) = NaN Cos(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Cos(math.Pi/2)) ``` Output: ``` 0.00 ``` func Cosh --------- ``` func Cosh(x float64) float64 ``` Cosh returns the hyperbolic cosine of x. Special cases are: ``` Cosh(±0) = 1 Cosh(±Inf) = +Inf Cosh(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Cosh(0)) ``` Output: ``` 1.00 ``` func Dim -------- ``` func Dim(x, y float64) float64 ``` Dim returns the maximum of x-y or 0. Special cases are: ``` Dim(+Inf, +Inf) = NaN Dim(-Inf, -Inf) = NaN Dim(x, NaN) = Dim(NaN, x) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f\n", math.Dim(4, -2)) fmt.Printf("%.2f\n", math.Dim(-4, 2)) ``` Output: ``` 6.00 0.00 ``` func Erf -------- ``` func Erf(x float64) float64 ``` Erf returns the error function of x. Special cases are: ``` Erf(+Inf) = 1 Erf(-Inf) = -1 Erf(NaN) = NaN ``` func Erfc --------- ``` func Erfc(x float64) float64 ``` Erfc returns the complementary error function of x. Special cases are: ``` Erfc(+Inf) = 0 Erfc(-Inf) = 2 Erfc(NaN) = NaN ``` func Erfcinv 1.10 ----------------- ``` func Erfcinv(x float64) float64 ``` Erfcinv returns the inverse of Erfc(x). Special cases are: ``` Erfcinv(0) = +Inf Erfcinv(2) = -Inf Erfcinv(x) = NaN if x < 0 or x > 2 Erfcinv(NaN) = NaN ``` func Erfinv 1.10 ---------------- ``` func Erfinv(x float64) float64 ``` Erfinv returns the inverse error function of x. Special cases are: ``` Erfinv(1) = +Inf Erfinv(-1) = -Inf Erfinv(x) = NaN if x < -1 or x > 1 Erfinv(NaN) = NaN ``` func Exp -------- ``` func Exp(x float64) float64 ``` Exp returns e\*\*x, the base-e exponential of x. Special cases are: ``` Exp(+Inf) = +Inf Exp(NaN) = NaN ``` Very large values overflow to 0 or +Inf. Very small values underflow to 1. #### Example Code: ``` fmt.Printf("%.2f\n", math.Exp(1)) fmt.Printf("%.2f\n", math.Exp(2)) fmt.Printf("%.2f\n", math.Exp(-1)) ``` Output: ``` 2.72 7.39 0.37 ``` func Exp2 --------- ``` func Exp2(x float64) float64 ``` Exp2 returns 2\*\*x, the base-2 exponential of x. Special cases are the same as Exp. #### Example Code: ``` fmt.Printf("%.2f\n", math.Exp2(1)) fmt.Printf("%.2f\n", math.Exp2(-3)) ``` Output: ``` 2.00 0.12 ``` func Expm1 ---------- ``` func Expm1(x float64) float64 ``` Expm1 returns e\*\*x - 1, the base-e exponential of x minus 1. It is more accurate than Exp(x) - 1 when x is near zero. Special cases are: ``` Expm1(+Inf) = +Inf Expm1(-Inf) = -1 Expm1(NaN) = NaN ``` Very large values overflow to -1 or +Inf. #### Example Code: ``` fmt.Printf("%.6f\n", math.Expm1(0.01)) fmt.Printf("%.6f\n", math.Expm1(-1)) ``` Output: ``` 0.010050 -0.632121 ``` func FMA 1.14 ------------- ``` func FMA(x, y, z float64) float64 ``` FMA returns x \* y + z, computed with only one rounding. (That is, FMA returns the fused multiply-add of x, y, and z.) func Float32bits ---------------- ``` func Float32bits(f float32) uint32 ``` Float32bits returns the IEEE 754 binary representation of f, with the sign bit of f and the result in the same bit position. Float32bits(Float32frombits(x)) == x. func Float32frombits -------------------- ``` func Float32frombits(b uint32) float32 ``` Float32frombits returns the floating-point number corresponding to the IEEE 754 binary representation b, with the sign bit of b and the result in the same bit position. Float32frombits(Float32bits(x)) == x. 
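Float32bits and Float32frombits have no example above; a minimal sketch of the round-trip property they document, assuming IEEE 754 single precision (in which 1.0 encodes as 0x3f800000):

```
package main

import (
	"fmt"
	"math"
)

func main() {
	// 1.0 in IEEE 754 single precision: sign 0, exponent 127, zero mantissa.
	b := math.Float32bits(1.0)
	fmt.Printf("%#08x\n", b) // 0x3f800000

	// Float32frombits inverts Float32bits, so the round trip is exact.
	f := math.Float32frombits(b)
	fmt.Println(f == 1.0) // true
}
```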
func Float64bits ---------------- ``` func Float64bits(f float64) uint64 ``` Float64bits returns the IEEE 754 binary representation of f, with the sign bit of f and the result in the same bit position, and Float64bits(Float64frombits(x)) == x. func Float64frombits -------------------- ``` func Float64frombits(b uint64) float64 ``` Float64frombits returns the floating-point number corresponding to the IEEE 754 binary representation b, with the sign bit of b and the result in the same bit position. Float64frombits(Float64bits(x)) == x. func Floor ---------- ``` func Floor(x float64) float64 ``` Floor returns the greatest integer value less than or equal to x. Special cases are: ``` Floor(±0) = ±0 Floor(±Inf) = ±Inf Floor(NaN) = NaN ``` #### Example Code: ``` c := math.Floor(1.51) fmt.Printf("%.1f", c) ``` Output: ``` 1.0 ``` func Frexp ---------- ``` func Frexp(f float64) (frac float64, exp int) ``` Frexp breaks f into a normalized fraction and an integral power of two. It returns frac and exp satisfying f == frac × 2\*\*exp, with the absolute value of frac in the interval [½, 1). Special cases are: ``` Frexp(±0) = ±0, 0 Frexp(±Inf) = ±Inf, 0 Frexp(NaN) = NaN, 0 ``` func Gamma ---------- ``` func Gamma(x float64) float64 ``` Gamma returns the Gamma function of x. Special cases are: ``` Gamma(+Inf) = +Inf Gamma(+0) = +Inf Gamma(-0) = -Inf Gamma(x) = NaN for integer x < 0 Gamma(-Inf) = NaN Gamma(NaN) = NaN ``` func Hypot ---------- ``` func Hypot(p, q float64) float64 ``` Hypot returns Sqrt(p\*p + q\*q), taking care to avoid unnecessary overflow and underflow. Special cases are: ``` Hypot(±Inf, q) = +Inf Hypot(p, ±Inf) = +Inf Hypot(NaN, q) = NaN Hypot(p, NaN) = NaN ``` func Ilogb ---------- ``` func Ilogb(x float64) int ``` Ilogb returns the binary exponent of x as an integer. Special cases are: ``` Ilogb(±Inf) = MaxInt32 Ilogb(0) = MinInt32 Ilogb(NaN) = MaxInt32 ``` func Inf -------- ``` func Inf(sign int) float64 ``` Inf returns positive infinity if sign >= 0, negative infinity if sign < 0. func IsInf ---------- ``` func IsInf(f float64, sign int) bool ``` IsInf reports whether f is an infinity, according to sign. If sign > 0, IsInf reports whether f is positive infinity. If sign < 0, IsInf reports whether f is negative infinity. If sign == 0, IsInf reports whether f is either infinity. func IsNaN ---------- ``` func IsNaN(f float64) (is bool) ``` IsNaN reports whether f is an IEEE 754 “not-a-number” value. func J0 ------- ``` func J0(x float64) float64 ``` J0 returns the order-zero Bessel function of the first kind. Special cases are: ``` J0(±Inf) = 0 J0(0) = 1 J0(NaN) = NaN ``` func J1 ------- ``` func J1(x float64) float64 ``` J1 returns the order-one Bessel function of the first kind. Special cases are: ``` J1(±Inf) = 0 J1(NaN) = NaN ``` func Jn ------- ``` func Jn(n int, x float64) float64 ``` Jn returns the order-n Bessel function of the first kind. Special cases are: ``` Jn(n, ±Inf) = 0 Jn(n, NaN) = NaN ``` func Ldexp ---------- ``` func Ldexp(frac float64, exp int) float64 ``` Ldexp is the inverse of Frexp. It returns frac × 2\*\*exp. Special cases are: ``` Ldexp(±0, exp) = ±0 Ldexp(±Inf, exp) = ±Inf Ldexp(NaN, exp) = NaN ``` func Lgamma ----------- ``` func Lgamma(x float64) (lgamma float64, sign int) ``` Lgamma returns the natural logarithm and sign (-1 or +1) of Gamma(x). 
Special cases are: ``` Lgamma(+Inf) = +Inf Lgamma(0) = +Inf Lgamma(-integer) = +Inf Lgamma(-Inf) = -Inf Lgamma(NaN) = NaN ``` func Log -------- ``` func Log(x float64) float64 ``` Log returns the natural logarithm of x. Special cases are: ``` Log(+Inf) = +Inf Log(0) = -Inf Log(x < 0) = NaN Log(NaN) = NaN ``` #### Example Code: ``` x := math.Log(1) fmt.Printf("%.1f\n", x) y := math.Log(2.7183) fmt.Printf("%.1f\n", y) ``` Output: ``` 0.0 1.0 ``` func Log10 ---------- ``` func Log10(x float64) float64 ``` Log10 returns the decimal logarithm of x. The special cases are the same as for Log. #### Example Code: ``` fmt.Printf("%.1f", math.Log10(100)) ``` Output: ``` 2.0 ``` func Log1p ---------- ``` func Log1p(x float64) float64 ``` Log1p returns the natural logarithm of 1 plus its argument x. It is more accurate than Log(1 + x) when x is near zero. Special cases are: ``` Log1p(+Inf) = +Inf Log1p(±0) = ±0 Log1p(-1) = -Inf Log1p(x < -1) = NaN Log1p(NaN) = NaN ``` func Log2 --------- ``` func Log2(x float64) float64 ``` Log2 returns the binary logarithm of x. The special cases are the same as for Log. #### Example Code: ``` fmt.Printf("%.1f", math.Log2(256)) ``` Output: ``` 8.0 ``` func Logb --------- ``` func Logb(x float64) float64 ``` Logb returns the binary exponent of x. Special cases are: ``` Logb(±Inf) = +Inf Logb(0) = -Inf Logb(NaN) = NaN ``` func Max -------- ``` func Max(x, y float64) float64 ``` Max returns the larger of x or y. Special cases are: ``` Max(x, +Inf) = Max(+Inf, x) = +Inf Max(x, NaN) = Max(NaN, x) = NaN Max(+0, ±0) = Max(±0, +0) = +0 Max(-0, -0) = -0 ``` func Min -------- ``` func Min(x, y float64) float64 ``` Min returns the smaller of x or y. Special cases are: ``` Min(x, -Inf) = Min(-Inf, x) = -Inf Min(x, NaN) = Min(NaN, x) = NaN Min(-0, ±0) = Min(±0, -0) = -0 ``` func Mod -------- ``` func Mod(x, y float64) float64 ``` Mod returns the floating-point remainder of x/y. The magnitude of the result is less than y and its sign agrees with that of x. Special cases are: ``` Mod(±Inf, y) = NaN Mod(NaN, y) = NaN Mod(x, 0) = NaN Mod(x, ±Inf) = x Mod(x, NaN) = NaN ``` #### Example Code: ``` c := math.Mod(7, 4) fmt.Printf("%.1f", c) ``` Output: ``` 3.0 ``` func Modf --------- ``` func Modf(f float64) (int float64, frac float64) ``` Modf returns integer and fractional floating-point numbers that sum to f. Both values have the same sign as f. Special cases are: ``` Modf(±Inf) = ±Inf, NaN Modf(NaN) = NaN, NaN ``` #### Example Code: ``` int, frac := math.Modf(3.14) fmt.Printf("%.2f, %.2f\n", int, frac) int, frac = math.Modf(-2.71) fmt.Printf("%.2f, %.2f\n", int, frac) ``` Output: ``` 3.00, 0.14 -2.00, -0.71 ``` func NaN -------- ``` func NaN() float64 ``` NaN returns an IEEE 754 “not-a-number” value. func Nextafter -------------- ``` func Nextafter(x, y float64) (r float64) ``` Nextafter returns the next representable float64 value after x towards y. Special cases are: ``` Nextafter(x, x) = x Nextafter(NaN, y) = NaN Nextafter(x, NaN) = NaN ``` func Nextafter32 1.4 -------------------- ``` func Nextafter32(x, y float32) (r float32) ``` Nextafter32 returns the next representable float32 value after x towards y. Special cases are: ``` Nextafter32(x, x) = x Nextafter32(NaN, y) = NaN Nextafter32(x, NaN) = NaN ``` func Pow -------- ``` func Pow(x, y float64) float64 ``` Pow returns x\*\*y, the base-x exponential of y. 
Special cases are (in order): ``` Pow(x, ±0) = 1 for any x Pow(1, y) = 1 for any y Pow(x, 1) = x for any x Pow(NaN, y) = NaN Pow(x, NaN) = NaN Pow(±0, y) = ±Inf for y an odd integer < 0 Pow(±0, -Inf) = +Inf Pow(±0, +Inf) = +0 Pow(±0, y) = +Inf for finite y < 0 and not an odd integer Pow(±0, y) = ±0 for y an odd integer > 0 Pow(±0, y) = +0 for finite y > 0 and not an odd integer Pow(-1, ±Inf) = 1 Pow(x, +Inf) = +Inf for |x| > 1 Pow(x, -Inf) = +0 for |x| > 1 Pow(x, +Inf) = +0 for |x| < 1 Pow(x, -Inf) = +Inf for |x| < 1 Pow(+Inf, y) = +Inf for y > 0 Pow(+Inf, y) = +0 for y < 0 Pow(-Inf, y) = Pow(-0, -y) Pow(x, y) = NaN for finite x < 0 and finite non-integer y ``` #### Example Code: ``` c := math.Pow(2, 3) fmt.Printf("%.1f", c) ``` Output: ``` 8.0 ``` func Pow10 ---------- ``` func Pow10(n int) float64 ``` Pow10 returns 10\*\*n, the base-10 exponential of n. Special cases are: ``` Pow10(n) = 0 for n < -323 Pow10(n) = +Inf for n > 308 ``` #### Example Code: ``` c := math.Pow10(2) fmt.Printf("%.1f", c) ``` Output: ``` 100.0 ``` func Remainder -------------- ``` func Remainder(x, y float64) float64 ``` Remainder returns the IEEE 754 floating-point remainder of x/y. Special cases are: ``` Remainder(±Inf, y) = NaN Remainder(NaN, y) = NaN Remainder(x, 0) = NaN Remainder(x, ±Inf) = x Remainder(x, NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.1f", math.Remainder(100, 30)) ``` Output: ``` 10.0 ``` func Round 1.10 --------------- ``` func Round(x float64) float64 ``` Round returns the nearest integer, rounding half away from zero. Special cases are: ``` Round(±0) = ±0 Round(±Inf) = ±Inf Round(NaN) = NaN ``` #### Example Code: ``` p := math.Round(10.5) fmt.Printf("%.1f\n", p) n := math.Round(-10.5) fmt.Printf("%.1f\n", n) ``` Output: ``` 11.0 -11.0 ``` func RoundToEven 1.10 --------------------- ``` func RoundToEven(x float64) float64 ``` RoundToEven returns the nearest integer, rounding ties to even. Special cases are: ``` RoundToEven(±0) = ±0 RoundToEven(±Inf) = ±Inf RoundToEven(NaN) = NaN ``` #### Example Code: ``` u := math.RoundToEven(11.5) fmt.Printf("%.1f\n", u) d := math.RoundToEven(12.5) fmt.Printf("%.1f\n", d) ``` Output: ``` 12.0 12.0 ``` func Signbit ------------ ``` func Signbit(x float64) bool ``` Signbit reports whether x is negative or negative zero. func Sin -------- ``` func Sin(x float64) float64 ``` Sin returns the sine of the radian argument x. Special cases are: ``` Sin(±0) = ±0 Sin(±Inf) = NaN Sin(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Sin(math.Pi)) ``` Output: ``` 0.00 ``` func Sincos ----------- ``` func Sincos(x float64) (sin, cos float64) ``` Sincos returns Sin(x), Cos(x). Special cases are: ``` Sincos(±0) = ±0, 1 Sincos(±Inf) = NaN, NaN Sincos(NaN) = NaN, NaN ``` #### Example Code: ``` sin, cos := math.Sincos(0) fmt.Printf("%.2f, %.2f", sin, cos) ``` Output: ``` 0.00, 1.00 ``` func Sinh --------- ``` func Sinh(x float64) float64 ``` Sinh returns the hyperbolic sine of x. Special cases are: ``` Sinh(±0) = ±0 Sinh(±Inf) = ±Inf Sinh(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Sinh(0)) ``` Output: ``` 0.00 ``` func Sqrt --------- ``` func Sqrt(x float64) float64 ``` Sqrt returns the square root of x. Special cases are: ``` Sqrt(+Inf) = +Inf Sqrt(±0) = ±0 Sqrt(x < 0) = NaN Sqrt(NaN) = NaN ``` #### Example Code: ``` const ( a = 3 b = 4 ) c := math.Sqrt(a*a + b*b) fmt.Printf("%.1f", c) ``` Output: ``` 5.0 ``` func Tan -------- ``` func Tan(x float64) float64 ``` Tan returns the tangent of the radian argument x. 
Special cases are: ``` Tan(±0) = ±0 Tan(±Inf) = NaN Tan(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Tan(0)) ``` Output: ``` 0.00 ``` func Tanh --------- ``` func Tanh(x float64) float64 ``` Tanh returns the hyperbolic tangent of x. Special cases are: ``` Tanh(±0) = ±0 Tanh(±Inf) = ±1 Tanh(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f", math.Tanh(0)) ``` Output: ``` 0.00 ``` func Trunc ---------- ``` func Trunc(x float64) float64 ``` Trunc returns the integer value of x. Special cases are: ``` Trunc(±0) = ±0 Trunc(±Inf) = ±Inf Trunc(NaN) = NaN ``` #### Example Code: ``` fmt.Printf("%.2f\n", math.Trunc(math.Pi)) fmt.Printf("%.2f\n", math.Trunc(-1.2345)) ``` Output: ``` 3.00 -1.00 ``` func Y0 ------- ``` func Y0(x float64) float64 ``` Y0 returns the order-zero Bessel function of the second kind. Special cases are: ``` Y0(+Inf) = 0 Y0(0) = -Inf Y0(x < 0) = NaN Y0(NaN) = NaN ``` func Y1 ------- ``` func Y1(x float64) float64 ``` Y1 returns the order-one Bessel function of the second kind. Special cases are: ``` Y1(+Inf) = 0 Y1(0) = -Inf Y1(x < 0) = NaN Y1(NaN) = NaN ``` func Yn ------- ``` func Yn(n int, x float64) float64 ``` Yn returns the order-n Bessel function of the second kind. Special cases are: ``` Yn(n, +Inf) = 0 Yn(n ≥ 0, 0) = -Inf Yn(n < 0, 0) = +Inf if n is odd, -Inf if n is even Yn(n, x < 0) = NaN Yn(n, NaN) = NaN ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [big](big/index) | Package big implements arbitrary-precision arithmetic (big numbers). | | [bits](bits/index) | Package bits implements bit counting and manipulation functions for the predeclared unsigned integer types. | | [cmplx](cmplx/index) | Package cmplx provides basic constants and mathematical functions for complex numbers. | | [rand](rand/index) | Package rand implements pseudo-random number generators unsuitable for security-sensitive work. |
go Package cmplx Package cmplx ============== * `import "math/cmplx"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package cmplx provides basic constants and mathematical functions for complex numbers. Special case handling conforms to the C99 standard Annex G IEC 60559-compatible complex arithmetic. Index ----- * [func Abs(x complex128) float64](#Abs) * [func Acos(x complex128) complex128](#Acos) * [func Acosh(x complex128) complex128](#Acosh) * [func Asin(x complex128) complex128](#Asin) * [func Asinh(x complex128) complex128](#Asinh) * [func Atan(x complex128) complex128](#Atan) * [func Atanh(x complex128) complex128](#Atanh) * [func Conj(x complex128) complex128](#Conj) * [func Cos(x complex128) complex128](#Cos) * [func Cosh(x complex128) complex128](#Cosh) * [func Cot(x complex128) complex128](#Cot) * [func Exp(x complex128) complex128](#Exp) * [func Inf() complex128](#Inf) * [func IsInf(x complex128) bool](#IsInf) * [func IsNaN(x complex128) bool](#IsNaN) * [func Log(x complex128) complex128](#Log) * [func Log10(x complex128) complex128](#Log10) * [func NaN() complex128](#NaN) * [func Phase(x complex128) float64](#Phase) * [func Polar(x complex128) (r, θ float64)](#Polar) * [func Pow(x, y complex128) complex128](#Pow) * [func Rect(r, θ float64) complex128](#Rect) * [func Sin(x complex128) complex128](#Sin) * [func Sinh(x complex128) complex128](#Sinh) * [func Sqrt(x complex128) complex128](#Sqrt) * [func Tan(x complex128) complex128](#Tan) * [func Tanh(x complex128) complex128](#Tanh) ### Examples [Abs](#example_Abs) [Exp](#example_Exp) [Polar](#example_Polar) ### Package files abs.go asin.go conj.go exp.go isinf.go isnan.go log.go phase.go polar.go pow.go rect.go sin.go sqrt.go tan.go func Abs -------- ``` func Abs(x complex128) float64 ``` Abs returns the absolute value (also called the modulus) of x. #### Example Code: ``` fmt.Printf("%.1f", cmplx.Abs(3+4i)) ``` Output: ``` 5.0 ``` func Acos --------- ``` func Acos(x complex128) complex128 ``` Acos returns the inverse cosine of x. func Acosh ---------- ``` func Acosh(x complex128) complex128 ``` Acosh returns the inverse hyperbolic cosine of x. func Asin --------- ``` func Asin(x complex128) complex128 ``` Asin returns the inverse sine of x. func Asinh ---------- ``` func Asinh(x complex128) complex128 ``` Asinh returns the inverse hyperbolic sine of x. func Atan --------- ``` func Atan(x complex128) complex128 ``` Atan returns the inverse tangent of x. func Atanh ---------- ``` func Atanh(x complex128) complex128 ``` Atanh returns the inverse hyperbolic tangent of x. func Conj --------- ``` func Conj(x complex128) complex128 ``` Conj returns the complex conjugate of x. func Cos -------- ``` func Cos(x complex128) complex128 ``` Cos returns the cosine of x. func Cosh --------- ``` func Cosh(x complex128) complex128 ``` Cosh returns the hyperbolic cosine of x. func Cot -------- ``` func Cot(x complex128) complex128 ``` Cot returns the cotangent of x. func Exp -------- ``` func Exp(x complex128) complex128 ``` Exp returns e\*\*x, the base-e exponential of x. #### Example ExampleExp computes Euler's identity. Code: ``` fmt.Printf("%.1f", cmplx.Exp(1i*math.Pi)+1) ``` Output: ``` (0.0+0.0i) ``` func Inf -------- ``` func Inf() complex128 ``` Inf returns a complex infinity, complex(+Inf, +Inf). func IsInf ---------- ``` func IsInf(x complex128) bool ``` IsInf reports whether either real(x) or imag(x) is an infinity. 
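Inf and IsInf have no example above; a minimal sketch of their relationship, assuming the complex infinity returned by Inf:

```
package main

import (
	"fmt"
	"math/cmplx"
)

func main() {
	// Inf returns complex(+Inf, +Inf); IsInf reports true when either the
	// real or the imaginary part is infinite.
	z := cmplx.Inf()
	fmt.Println(cmplx.IsInf(z))    // true
	fmt.Println(cmplx.IsInf(3+4i)) // false
}
```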
func IsNaN ---------- ``` func IsNaN(x complex128) bool ``` IsNaN reports whether either real(x) or imag(x) is NaN and neither is an infinity. func Log -------- ``` func Log(x complex128) complex128 ``` Log returns the natural logarithm of x. func Log10 ---------- ``` func Log10(x complex128) complex128 ``` Log10 returns the decimal logarithm of x. func NaN -------- ``` func NaN() complex128 ``` NaN returns a complex “not-a-number” value. func Phase ---------- ``` func Phase(x complex128) float64 ``` Phase returns the phase (also called the argument) of x. The returned value is in the range [-Pi, Pi]. func Polar ---------- ``` func Polar(x complex128) (r, θ float64) ``` Polar returns the absolute value r and phase θ of x, such that x = r \* e\*\*θi. The phase is in the range [-Pi, Pi]. #### Example Code: ``` r, theta := cmplx.Polar(2i) fmt.Printf("r: %.1f, θ: %.1f*π", r, theta/math.Pi) ``` Output: ``` r: 2.0, θ: 0.5*π ``` func Pow -------- ``` func Pow(x, y complex128) complex128 ``` Pow returns x\*\*y, the base-x exponential of y. For generalized compatibility with math.Pow: ``` Pow(0, ±0) returns 1+0i Pow(0, c) for real(c)<0 returns Inf+0i if imag(c) is zero, otherwise Inf+Inf i. ``` func Rect --------- ``` func Rect(r, θ float64) complex128 ``` Rect returns the complex number x with polar coordinates r, θ. func Sin -------- ``` func Sin(x complex128) complex128 ``` Sin returns the sine of x. func Sinh --------- ``` func Sinh(x complex128) complex128 ``` Sinh returns the hyperbolic sine of x. func Sqrt --------- ``` func Sqrt(x complex128) complex128 ``` Sqrt returns the square root of x. The result r is chosen so that real(r) ≥ 0 and imag(r) has the same sign as imag(x). func Tan -------- ``` func Tan(x complex128) complex128 ``` Tan returns the tangent of x. func Tanh --------- ``` func Tanh(x complex128) complex128 ``` Tanh returns the hyperbolic tangent of x. go Package big Package big ============ * `import "math/big"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package big implements arbitrary-precision arithmetic (big numbers). The following numeric types are supported: ``` Int signed integers Rat rational numbers Float floating-point numbers ``` The zero value for an Int, Rat, or Float correspond to 0. Thus, new values can be declared in the usual ways and denote 0 without further initialization: ``` var x Int // &x is an *Int of value 0 var r = &Rat{} // r is a *Rat of value 0 y := new(Float) // y is a *Float of value 0 ``` Alternatively, new values can be allocated and initialized with factory functions of the form: ``` func NewT(v V) *T ``` For instance, NewInt(x) returns an \*Int set to the value of the int64 argument x, NewRat(a, b) returns a \*Rat set to the fraction a/b where a and b are int64 values, and NewFloat(f) returns a \*Float initialized to the float64 argument f. More flexibility is provided with explicit setters, for instance: ``` var z1 Int z1.SetUint64(123) // z1 := 123 z2 := new(Rat).SetFloat64(1.25) // z2 := 5/4 z3 := new(Float).SetInt(z1) // z3 := 123.0 ``` Setters, numeric operations and predicates are represented as methods of the form: ``` func (z *T) SetV(v V) *T // z = v func (z *T) Unary(x *T) *T // z = unary x func (z *T) Binary(x, y *T) *T // z = x binary y func (x *T) Pred() P // p = pred(x) ``` with T one of Int, Rat, or Float. 
For unary and binary operations, the result is the receiver (usually named z in that case; see below); if it is one of the operands x or y it may be safely overwritten (and its memory reused). Arithmetic expressions are typically written as a sequence of individual method calls, with each call corresponding to an operation. The receiver denotes the result and the method arguments are the operation's operands. For instance, given three \*Int values a, b and c, the invocation ``` c.Add(a, b) ``` computes the sum a + b and stores the result in c, overwriting whatever value was held in c before. Unless specified otherwise, operations permit aliasing of parameters, so it is perfectly ok to write ``` sum.Add(sum, x) ``` to accumulate values x in a sum. (By always passing in a result value via the receiver, memory use can be much better controlled. Instead of having to allocate new memory for each result, an operation can reuse the space allocated for the result value, and overwrite that value with the new result in the process.) Notational convention: Incoming method parameters (including the receiver) are named consistently in the API to clarify their use. Incoming operands are usually named x, y, a, b, and so on, but never z. A parameter specifying the result is named z (typically the receiver). For instance, the arguments for (\*Int).Add are named x and y, and because the receiver specifies the result destination, it is called z: ``` func (z *Int) Add(x, y *Int) *Int ``` Methods of this form typically return the incoming receiver as well, to enable simple call chaining. Methods which don't require a result value to be passed in (for instance, Int.Sign), simply return the result. In this case, the receiver is typically the first operand, named x: ``` func (x *Int) Sign() int ``` Various methods support conversions between strings and corresponding numeric values, and vice versa: \*Int, \*Rat, and \*Float values implement the Stringer interface for a (default) string representation of the value, but also provide SetString methods to initialize a value from a string in a variety of supported formats (see the respective SetString documentation). Finally, \*Int, \*Rat, and \*Float satisfy the fmt package's Scanner interface for scanning and (except for \*Rat) the Formatter interface for formatted printing. #### Example (EConvergents) This example demonstrates how to use big.Rat to compute the first 15 terms in the sequence of rational convergents for the constant e (base of natural logarithm). Code: ``` package big_test import ( "fmt" "math/big" ) // Use the classic continued fraction for e // // e = [1; 0, 1, 1, 2, 1, 1, ... 2n, 1, 1, ...] // // i.e., for the nth term, use // // 1 if n mod 3 != 1 // (n-1)/3 * 2 if n mod 3 == 1 func recur(n, lim int64) *big.Rat { term := new(big.Rat) if n%3 != 1 { term.SetInt64(1) } else { term.SetInt64((n - 1) / 3 * 2) } if n > lim { return term } // Directly initialize frac as the fractional // inverse of the result of recur. frac := new(big.Rat).Inv(recur(n+1, lim)) return term.Add(term, frac) } // This example demonstrates how to use big.Rat to compute the // first 15 terms in the sequence of rational convergents for // the constant e (base of natural logarithm). func Example_eConvergents() { for i := 1; i <= 15; i++ { r := recur(0, int64(i)) // Print r both as a fraction and as a floating-point number. // Since big.Rat implements fmt.Formatter, we can use %-13s to // get a left-aligned string representation of the fraction. 
fmt.Printf("%-13s = %s\n", r, r.FloatString(8)) } // Output: // 2/1 = 2.00000000 // 3/1 = 3.00000000 // 8/3 = 2.66666667 // 11/4 = 2.75000000 // 19/7 = 2.71428571 // 87/32 = 2.71875000 // 106/39 = 2.71794872 // 193/71 = 2.71830986 // 1264/465 = 2.71827957 // 1457/536 = 2.71828358 // 2721/1001 = 2.71828172 // 23225/8544 = 2.71828184 // 25946/9545 = 2.71828182 // 49171/18089 = 2.71828183 // 517656/190435 = 2.71828183 } ``` #### Example (Fibonacci) This example demonstrates how to use big.Int to compute the smallest Fibonacci number with 100 decimal digits and to test whether it is prime. Code: ``` // Initialize two big ints with the first two numbers in the sequence. a := big.NewInt(0) b := big.NewInt(1) // Initialize limit as 10^99, the smallest integer with 100 digits. var limit big.Int limit.Exp(big.NewInt(10), big.NewInt(99), nil) // Loop while a is smaller than 1e100. for a.Cmp(&limit) < 0 { // Compute the next Fibonacci number, storing it in a. a.Add(a, b) // Swap a and b so that b is the next number in the sequence. a, b = b, a } fmt.Println(a) // 100-digit Fibonacci number // Test a for primality. // (ProbablyPrimes' argument sets the number of Miller-Rabin // rounds to be performed. 20 is a good value.) fmt.Println(a.ProbablyPrime(20)) ``` Output: ``` 1344719667586153181419716641724567886890850696275767987106294472017884974410332069524504824747437757 false ``` #### Example (Sqrt2) This example shows how to use big.Float to compute the square root of 2 with a precision of 200 bits, and how to print the result as a decimal number. Code: ``` // We'll do computations with 200 bits of precision in the mantissa. const prec = 200 // Compute the square root of 2 using Newton's Method. We start with // an initial estimate for sqrt(2), and then iterate: // x_{n+1} = 1/2 * ( x_n + (2.0 / x_n) ) // Since Newton's Method doubles the number of correct digits at each // iteration, we need at least log_2(prec) steps. steps := int(math.Log2(prec)) // Initialize values we need for the computation. two := new(big.Float).SetPrec(prec).SetInt64(2) half := new(big.Float).SetPrec(prec).SetFloat64(0.5) // Use 1 as the initial estimate. x := new(big.Float).SetPrec(prec).SetInt64(1) // We use t as a temporary variable. There's no need to set its precision // since big.Float values with unset (== 0) precision automatically assume // the largest precision of the arguments when used as the result (receiver) // of a big.Float operation. t := new(big.Float) // Iterate. for i := 0; i <= steps; i++ { t.Quo(two, x) // t = 2.0 / x_n t.Add(x, t) // t = x_n + (2.0 / x_n) x.Mul(half, t) // x_{n+1} = 0.5 * t } // We can use the usual fmt.Printf verbs since big.Float implements fmt.Formatter fmt.Printf("sqrt(2) = %.50f\n", x) // Print the error between 2 and x*x. 
t.Mul(x, x) // t = x*x fmt.Printf("error = %e\n", t.Sub(two, t)) ``` Output: ``` sqrt(2) = 1.41421356237309504880168872420969807856967187537695 error = 0.000000e+00 ``` Index ----- * [Constants](#pkg-constants) * [func Jacobi(x, y \*Int) int](#Jacobi) * [type Accuracy](#Accuracy) * [func (i Accuracy) String() string](#Accuracy.String) * [type ErrNaN](#ErrNaN) * [func (err ErrNaN) Error() string](#ErrNaN.Error) * [type Float](#Float) * [func NewFloat(x float64) \*Float](#NewFloat) * [func ParseFloat(s string, base int, prec uint, mode RoundingMode) (f \*Float, b int, err error)](#ParseFloat) * [func (z \*Float) Abs(x \*Float) \*Float](#Float.Abs) * [func (x \*Float) Acc() Accuracy](#Float.Acc) * [func (z \*Float) Add(x, y \*Float) \*Float](#Float.Add) * [func (x \*Float) Append(buf []byte, fmt byte, prec int) []byte](#Float.Append) * [func (x \*Float) Cmp(y \*Float) int](#Float.Cmp) * [func (z \*Float) Copy(x \*Float) \*Float](#Float.Copy) * [func (x \*Float) Float32() (float32, Accuracy)](#Float.Float32) * [func (x \*Float) Float64() (float64, Accuracy)](#Float.Float64) * [func (x \*Float) Format(s fmt.State, format rune)](#Float.Format) * [func (z \*Float) GobDecode(buf []byte) error](#Float.GobDecode) * [func (x \*Float) GobEncode() ([]byte, error)](#Float.GobEncode) * [func (x \*Float) Int(z \*Int) (\*Int, Accuracy)](#Float.Int) * [func (x \*Float) Int64() (int64, Accuracy)](#Float.Int64) * [func (x \*Float) IsInf() bool](#Float.IsInf) * [func (x \*Float) IsInt() bool](#Float.IsInt) * [func (x \*Float) MantExp(mant \*Float) (exp int)](#Float.MantExp) * [func (x \*Float) MarshalText() (text []byte, err error)](#Float.MarshalText) * [func (x \*Float) MinPrec() uint](#Float.MinPrec) * [func (x \*Float) Mode() RoundingMode](#Float.Mode) * [func (z \*Float) Mul(x, y \*Float) \*Float](#Float.Mul) * [func (z \*Float) Neg(x \*Float) \*Float](#Float.Neg) * [func (z \*Float) Parse(s string, base int) (f \*Float, b int, err error)](#Float.Parse) * [func (x \*Float) Prec() uint](#Float.Prec) * [func (z \*Float) Quo(x, y \*Float) \*Float](#Float.Quo) * [func (x \*Float) Rat(z \*Rat) (\*Rat, Accuracy)](#Float.Rat) * [func (z \*Float) Scan(s fmt.ScanState, ch rune) error](#Float.Scan) * [func (z \*Float) Set(x \*Float) \*Float](#Float.Set) * [func (z \*Float) SetFloat64(x float64) \*Float](#Float.SetFloat64) * [func (z \*Float) SetInf(signbit bool) \*Float](#Float.SetInf) * [func (z \*Float) SetInt(x \*Int) \*Float](#Float.SetInt) * [func (z \*Float) SetInt64(x int64) \*Float](#Float.SetInt64) * [func (z \*Float) SetMantExp(mant \*Float, exp int) \*Float](#Float.SetMantExp) * [func (z \*Float) SetMode(mode RoundingMode) \*Float](#Float.SetMode) * [func (z \*Float) SetPrec(prec uint) \*Float](#Float.SetPrec) * [func (z \*Float) SetRat(x \*Rat) \*Float](#Float.SetRat) * [func (z \*Float) SetString(s string) (\*Float, bool)](#Float.SetString) * [func (z \*Float) SetUint64(x uint64) \*Float](#Float.SetUint64) * [func (x \*Float) Sign() int](#Float.Sign) * [func (x \*Float) Signbit() bool](#Float.Signbit) * [func (z \*Float) Sqrt(x \*Float) \*Float](#Float.Sqrt) * [func (x \*Float) String() string](#Float.String) * [func (z \*Float) Sub(x, y \*Float) \*Float](#Float.Sub) * [func (x \*Float) Text(format byte, prec int) string](#Float.Text) * [func (x \*Float) Uint64() (uint64, Accuracy)](#Float.Uint64) * [func (z \*Float) UnmarshalText(text []byte) error](#Float.UnmarshalText) * [type Int](#Int) * [func NewInt(x int64) \*Int](#NewInt) * [func (z \*Int) Abs(x \*Int) \*Int](#Int.Abs) * [func (z \*Int) Add(x, 
y \*Int) \*Int](#Int.Add) * [func (z \*Int) And(x, y \*Int) \*Int](#Int.And) * [func (z \*Int) AndNot(x, y \*Int) \*Int](#Int.AndNot) * [func (x \*Int) Append(buf []byte, base int) []byte](#Int.Append) * [func (z \*Int) Binomial(n, k int64) \*Int](#Int.Binomial) * [func (x \*Int) Bit(i int) uint](#Int.Bit) * [func (x \*Int) BitLen() int](#Int.BitLen) * [func (x \*Int) Bits() []Word](#Int.Bits) * [func (x \*Int) Bytes() []byte](#Int.Bytes) * [func (x \*Int) Cmp(y \*Int) (r int)](#Int.Cmp) * [func (x \*Int) CmpAbs(y \*Int) int](#Int.CmpAbs) * [func (z \*Int) Div(x, y \*Int) \*Int](#Int.Div) * [func (z \*Int) DivMod(x, y, m \*Int) (\*Int, \*Int)](#Int.DivMod) * [func (z \*Int) Exp(x, y, m \*Int) \*Int](#Int.Exp) * [func (x \*Int) FillBytes(buf []byte) []byte](#Int.FillBytes) * [func (x \*Int) Format(s fmt.State, ch rune)](#Int.Format) * [func (z \*Int) GCD(x, y, a, b \*Int) \*Int](#Int.GCD) * [func (z \*Int) GobDecode(buf []byte) error](#Int.GobDecode) * [func (x \*Int) GobEncode() ([]byte, error)](#Int.GobEncode) * [func (x \*Int) Int64() int64](#Int.Int64) * [func (x \*Int) IsInt64() bool](#Int.IsInt64) * [func (x \*Int) IsUint64() bool](#Int.IsUint64) * [func (z \*Int) Lsh(x \*Int, n uint) \*Int](#Int.Lsh) * [func (x \*Int) MarshalJSON() ([]byte, error)](#Int.MarshalJSON) * [func (x \*Int) MarshalText() (text []byte, err error)](#Int.MarshalText) * [func (z \*Int) Mod(x, y \*Int) \*Int](#Int.Mod) * [func (z \*Int) ModInverse(g, n \*Int) \*Int](#Int.ModInverse) * [func (z \*Int) ModSqrt(x, p \*Int) \*Int](#Int.ModSqrt) * [func (z \*Int) Mul(x, y \*Int) \*Int](#Int.Mul) * [func (z \*Int) MulRange(a, b int64) \*Int](#Int.MulRange) * [func (z \*Int) Neg(x \*Int) \*Int](#Int.Neg) * [func (z \*Int) Not(x \*Int) \*Int](#Int.Not) * [func (z \*Int) Or(x, y \*Int) \*Int](#Int.Or) * [func (x \*Int) ProbablyPrime(n int) bool](#Int.ProbablyPrime) * [func (z \*Int) Quo(x, y \*Int) \*Int](#Int.Quo) * [func (z \*Int) QuoRem(x, y, r \*Int) (\*Int, \*Int)](#Int.QuoRem) * [func (z \*Int) Rand(rnd \*rand.Rand, n \*Int) \*Int](#Int.Rand) * [func (z \*Int) Rem(x, y \*Int) \*Int](#Int.Rem) * [func (z \*Int) Rsh(x \*Int, n uint) \*Int](#Int.Rsh) * [func (z \*Int) Scan(s fmt.ScanState, ch rune) error](#Int.Scan) * [func (z \*Int) Set(x \*Int) \*Int](#Int.Set) * [func (z \*Int) SetBit(x \*Int, i int, b uint) \*Int](#Int.SetBit) * [func (z \*Int) SetBits(abs []Word) \*Int](#Int.SetBits) * [func (z \*Int) SetBytes(buf []byte) \*Int](#Int.SetBytes) * [func (z \*Int) SetInt64(x int64) \*Int](#Int.SetInt64) * [func (z \*Int) SetString(s string, base int) (\*Int, bool)](#Int.SetString) * [func (z \*Int) SetUint64(x uint64) \*Int](#Int.SetUint64) * [func (x \*Int) Sign() int](#Int.Sign) * [func (z \*Int) Sqrt(x \*Int) \*Int](#Int.Sqrt) * [func (x \*Int) String() string](#Int.String) * [func (z \*Int) Sub(x, y \*Int) \*Int](#Int.Sub) * [func (x \*Int) Text(base int) string](#Int.Text) * [func (x \*Int) TrailingZeroBits() uint](#Int.TrailingZeroBits) * [func (x \*Int) Uint64() uint64](#Int.Uint64) * [func (z \*Int) UnmarshalJSON(text []byte) error](#Int.UnmarshalJSON) * [func (z \*Int) UnmarshalText(text []byte) error](#Int.UnmarshalText) * [func (z \*Int) Xor(x, y \*Int) \*Int](#Int.Xor) * [type Rat](#Rat) * [func NewRat(a, b int64) \*Rat](#NewRat) * [func (z \*Rat) Abs(x \*Rat) \*Rat](#Rat.Abs) * [func (z \*Rat) Add(x, y \*Rat) \*Rat](#Rat.Add) * [func (x \*Rat) Cmp(y \*Rat) int](#Rat.Cmp) * [func (x \*Rat) Denom() \*Int](#Rat.Denom) * [func (x \*Rat) Float32() (f float32, exact bool)](#Rat.Float32) * [func (x \*Rat) 
Float64() (f float64, exact bool)](#Rat.Float64) * [func (x \*Rat) FloatString(prec int) string](#Rat.FloatString) * [func (z \*Rat) GobDecode(buf []byte) error](#Rat.GobDecode) * [func (x \*Rat) GobEncode() ([]byte, error)](#Rat.GobEncode) * [func (z \*Rat) Inv(x \*Rat) \*Rat](#Rat.Inv) * [func (x \*Rat) IsInt() bool](#Rat.IsInt) * [func (x \*Rat) MarshalText() (text []byte, err error)](#Rat.MarshalText) * [func (z \*Rat) Mul(x, y \*Rat) \*Rat](#Rat.Mul) * [func (z \*Rat) Neg(x \*Rat) \*Rat](#Rat.Neg) * [func (x \*Rat) Num() \*Int](#Rat.Num) * [func (z \*Rat) Quo(x, y \*Rat) \*Rat](#Rat.Quo) * [func (x \*Rat) RatString() string](#Rat.RatString) * [func (z \*Rat) Scan(s fmt.ScanState, ch rune) error](#Rat.Scan) * [func (z \*Rat) Set(x \*Rat) \*Rat](#Rat.Set) * [func (z \*Rat) SetFloat64(f float64) \*Rat](#Rat.SetFloat64) * [func (z \*Rat) SetFrac(a, b \*Int) \*Rat](#Rat.SetFrac) * [func (z \*Rat) SetFrac64(a, b int64) \*Rat](#Rat.SetFrac64) * [func (z \*Rat) SetInt(x \*Int) \*Rat](#Rat.SetInt) * [func (z \*Rat) SetInt64(x int64) \*Rat](#Rat.SetInt64) * [func (z \*Rat) SetString(s string) (\*Rat, bool)](#Rat.SetString) * [func (z \*Rat) SetUint64(x uint64) \*Rat](#Rat.SetUint64) * [func (x \*Rat) Sign() int](#Rat.Sign) * [func (x \*Rat) String() string](#Rat.String) * [func (z \*Rat) Sub(x, y \*Rat) \*Rat](#Rat.Sub) * [func (z \*Rat) UnmarshalText(text []byte) error](#Rat.UnmarshalText) * [type RoundingMode](#RoundingMode) * [func (i RoundingMode) String() string](#RoundingMode.String) * [type Word](#Word) ### Examples [Float.Add](#example_Float_Add) [Float.Cmp](#example_Float_Cmp) [Float.Scan](#example_Float_Scan) [Float.SetString](#example_Float_SetString) [Float (Shift)](#example_Float_shift) [Int.Scan](#example_Int_Scan) [Int.SetString](#example_Int_SetString) [Rat.Scan](#example_Rat_Scan) [Rat.SetString](#example_Rat_SetString) [RoundingMode](#example_RoundingMode) [Package (EConvergents)](#example__eConvergents) [Package (Fibonacci)](#example__fibonacci) [Package (Sqrt2)](#example__sqrt2) ### Package files accuracy\_string.go arith.go arith\_amd64.go arith\_decl.go decimal.go doc.go float.go floatconv.go floatmarsh.go ftoa.go int.go intconv.go intmarsh.go nat.go natconv.go natdiv.go prime.go rat.go ratconv.go ratmarsh.go roundingmode\_string.go sqrt.go Constants --------- Exponent and precision limits. ``` const ( MaxExp = math.MaxInt32 // largest supported exponent MinExp = math.MinInt32 // smallest supported exponent MaxPrec = math.MaxUint32 // largest (theoretically) supported precision; likely memory-limited ) ``` MaxBase is the largest number base accepted for string conversions. ``` const MaxBase = 10 + ('z' - 'a' + 1) + ('Z' - 'A' + 1) ``` func Jacobi 1.5 --------------- ``` func Jacobi(x, y *Int) int ``` Jacobi returns the Jacobi symbol (x/y), either +1, -1, or 0. The y argument must be an odd integer. type Accuracy 1.5 ----------------- Accuracy describes the rounding error produced by the most recent operation that generated a Float value, relative to the exact value. ``` type Accuracy int8 ``` Constants describing the Accuracy of a Float. ``` const ( Below Accuracy = -1 Exact Accuracy = 0 Above Accuracy = +1 ) ``` ### func (Accuracy) String 1.5 ``` func (i Accuracy) String() string ``` type ErrNaN 1.5 --------------- An ErrNaN panic is raised by a Float operation that would lead to a NaN under IEEE-754 rules. An ErrNaN implements the error interface. 
``` type ErrNaN struct { // contains filtered or unexported fields } ``` ### func (ErrNaN) Error 1.5 ``` func (err ErrNaN) Error() string ``` type Float 1.5 -------------- A nonzero finite Float represents a multi-precision floating point number ``` sign × mantissa × 2**exponent ``` with 0.5 <= mantissa < 1.0, and MinExp <= exponent <= MaxExp. A Float may also be zero (+0, -0) or infinite (+Inf, -Inf). All Floats are ordered, and the ordering of two Floats x and y is defined by x.Cmp(y). Each Float value also has a precision, rounding mode, and accuracy. The precision is the maximum number of mantissa bits available to represent the value. The rounding mode specifies how a result should be rounded to fit into the mantissa bits, and accuracy describes the rounding error with respect to the exact result. Unless specified otherwise, all operations (including setters) that specify a \*Float variable for the result (usually via the receiver with the exception of MantExp), round the numeric result according to the precision and rounding mode of the result variable. If the provided result precision is 0 (see below), it is set to the precision of the argument with the largest precision value before any rounding takes place, and the rounding mode remains unchanged. Thus, uninitialized Floats provided as result arguments will have their precision set to a reasonable value determined by the operands, and their mode is the zero value for RoundingMode (ToNearestEven). By setting the desired precision to 24 or 53 and using matching rounding mode (typically ToNearestEven), Float operations produce the same results as the corresponding float32 or float64 IEEE-754 arithmetic for operands that correspond to normal (i.e., not denormal) float32 or float64 numbers. Exponent underflow and overflow lead to a 0 or an Infinity for different values than IEEE-754 because Float exponents have a much larger range. The zero (uninitialized) value for a Float is ready to use and represents the number +0.0 exactly, with precision 0 and rounding mode ToNearestEven. Operations always take pointer arguments (\*Float) rather than Float values, and each unique Float value requires its own unique \*Float pointer. To "copy" a Float value, an existing (or newly allocated) Float must be set to a new value using the Float.Set method; shallow copies of Floats are not supported and may lead to errors. ``` type Float struct { // contains filtered or unexported fields } ``` #### Example (Shift) Code: ``` // Implement Float "shift" by modifying the (binary) exponents directly. for s := -5; s <= 5; s++ { x := big.NewFloat(0.5) x.SetMantExp(x, x.MantExp(nil)+s) // shift x by s fmt.Println(x) } ``` Output: ``` 0.015625 0.03125 0.0625 0.125 0.25 0.5 1 2 4 8 16 ``` ### func NewFloat 1.5 ``` func NewFloat(x float64) *Float ``` NewFloat allocates and returns a new Float set to x, with precision 53 and rounding mode ToNearestEven. NewFloat panics with ErrNaN if x is a NaN. ### func ParseFloat 1.5 ``` func ParseFloat(s string, base int, prec uint, mode RoundingMode) (f *Float, b int, err error) ``` ParseFloat is like f.Parse(s, base) with f set to the given precision and rounding mode. ### func (\*Float) Abs 1.5 ``` func (z *Float) Abs(x *Float) *Float ``` Abs sets z to the (possibly rounded) value |x| (the absolute value of x) and returns z. ### func (\*Float) Acc 1.5 ``` func (x *Float) Acc() Accuracy ``` Acc returns the accuracy of x produced by the most recent operation, unless explicitly documented otherwise by that operation. 
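The following is a minimal, illustrative sketch (not one of the package's own examples) of how a receiver's precision and the reported Accuracy interact: a quotient that is not exactly representable is rounded to the receiver's precision, and Acc reports the direction of the rounding error.

```
package main

import (
	"fmt"
	"math/big"
)

func main() {
	// 1/3 has no finite binary representation, so rounding it to the
	// receiver's 64-bit precision leaves a nonzero rounding error.
	q := new(big.Float).SetPrec(64)
	q.Quo(big.NewFloat(1), big.NewFloat(3))
	fmt.Println(q.Acc()) // Above or Below, never Exact

	// 0.5 is exactly representable, so no rounding occurs.
	z := new(big.Float).SetPrec(64).SetFloat64(0.5)
	fmt.Println(z.Acc()) // Exact
}
```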
### func (\*Float) Add 1.5 ``` func (z *Float) Add(x, y *Float) *Float ``` Add sets z to the rounded sum x+y and returns z. If z's precision is 0, it is changed to the larger of x's or y's precision before the operation. Rounding is performed according to z's precision and rounding mode; and z's accuracy reports the result error relative to the exact (not rounded) result. Add panics with ErrNaN if x and y are infinities with opposite signs. The value of z is undefined in that case. #### Example Code: ``` // Operate on numbers of different precision. var x, y, z big.Float x.SetInt64(1000) // x is automatically set to 64bit precision y.SetFloat64(2.718281828) // y is automatically set to 53bit precision z.SetPrec(32) z.Add(&x, &y) fmt.Printf("x = %.10g (%s, prec = %d, acc = %s)\n", &x, x.Text('p', 0), x.Prec(), x.Acc()) fmt.Printf("y = %.10g (%s, prec = %d, acc = %s)\n", &y, y.Text('p', 0), y.Prec(), y.Acc()) fmt.Printf("z = %.10g (%s, prec = %d, acc = %s)\n", &z, z.Text('p', 0), z.Prec(), z.Acc()) ``` Output: ``` x = 1000 (0x.fap+10, prec = 64, acc = Exact) y = 2.718281828 (0x.adf85458248cd8p+2, prec = 53, acc = Exact) z = 1002.718282 (0x.faadf854p+10, prec = 32, acc = Below) ``` ### func (\*Float) Append 1.5 ``` func (x *Float) Append(buf []byte, fmt byte, prec int) []byte ``` Append appends to buf the string form of the floating-point number x, as generated by x.Text, and returns the extended buffer. ### func (\*Float) Cmp 1.5 ``` func (x *Float) Cmp(y *Float) int ``` Cmp compares x and y and returns: ``` -1 if x < y 0 if x == y (incl. -0 == 0, -Inf == -Inf, and +Inf == +Inf) +1 if x > y ``` #### Example Code: ``` inf := math.Inf(1) zero := 0.0 operands := []float64{-inf, -1.2, -zero, 0, +1.2, +inf} fmt.Println(" x y cmp") fmt.Println("---------------") for _, x64 := range operands { x := big.NewFloat(x64) for _, y64 := range operands { y := big.NewFloat(y64) fmt.Printf("%4g %4g %3d\n", x, y, x.Cmp(y)) } fmt.Println() } ``` Output: ``` x y cmp --------------- -Inf -Inf 0 -Inf -1.2 -1 -Inf -0 -1 -Inf 0 -1 -Inf 1.2 -1 -Inf +Inf -1 -1.2 -Inf 1 -1.2 -1.2 0 -1.2 -0 -1 -1.2 0 -1 -1.2 1.2 -1 -1.2 +Inf -1 -0 -Inf 1 -0 -1.2 1 -0 -0 0 -0 0 0 -0 1.2 -1 -0 +Inf -1 0 -Inf 1 0 -1.2 1 0 -0 0 0 0 0 0 1.2 -1 0 +Inf -1 1.2 -Inf 1 1.2 -1.2 1 1.2 -0 1 1.2 0 1 1.2 1.2 0 1.2 +Inf -1 +Inf -Inf 1 +Inf -1.2 1 +Inf -0 1 +Inf 0 1 +Inf 1.2 1 +Inf +Inf 0 ``` ### func (\*Float) Copy 1.5 ``` func (z *Float) Copy(x *Float) *Float ``` Copy sets z to x, with the same precision, rounding mode, and accuracy as x, and returns z. x is not changed even if z and x are the same. ### func (\*Float) Float32 1.5 ``` func (x *Float) Float32() (float32, Accuracy) ``` Float32 returns the float32 value nearest to x. If x is too small to be represented by a float32 (|x| < math.SmallestNonzeroFloat32), the result is (0, Below) or (-0, Above), respectively, depending on the sign of x. If x is too large to be represented by a float32 (|x| > math.MaxFloat32), the result is (+Inf, Above) or (-Inf, Below), depending on the sign of x. ### func (\*Float) Float64 1.5 ``` func (x *Float) Float64() (float64, Accuracy) ``` Float64 returns the float64 value nearest to x. If x is too small to be represented by a float64 (|x| < math.SmallestNonzeroFloat64), the result is (0, Below) or (-0, Above), respectively, depending on the sign of x. If x is too large to be represented by a float64 (|x| > math.MaxFloat64), the result is (+Inf, Above) or (-Inf, Below), depending on the sign of x. 
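As an illustrative sketch (not an official example), the Accuracy returned by Float64 tells whether the conversion lost information relative to the Float value:

```
package main

import (
	"fmt"
	"math/big"
)

func main() {
	// A 200-bit approximation of 1/3 cannot be represented as a float64,
	// so the conversion reports a rounding error.
	third := new(big.Float).SetPrec(200)
	third.Quo(big.NewFloat(1), big.NewFloat(3))
	f, acc := third.Float64()
	fmt.Println(f, acc) // 0.3333333333333333 with Above or Below

	// 2.5 is a float64 value, so the conversion is exact.
	g, acc2 := big.NewFloat(2.5).Float64()
	fmt.Println(g, acc2) // 2.5 Exact
}
```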
### func (\*Float) Format 1.5 ``` func (x *Float) Format(s fmt.State, format rune) ``` Format implements fmt.Formatter. It accepts all the regular formats for floating-point numbers ('b', 'e', 'E', 'f', 'F', 'g', 'G', 'x') as well as 'p' and 'v'. See (\*Float).Text for the interpretation of 'p'. The 'v' format is handled like 'g'. Format also supports specification of the minimum precision in digits, the output field width, as well as the format flags '+' and ' ' for sign control, '0' for space or zero padding, and '-' for left or right justification. See the fmt package for details. ### func (\*Float) GobDecode 1.7 ``` func (z *Float) GobDecode(buf []byte) error ``` GobDecode implements the gob.GobDecoder interface. The result is rounded per the precision and rounding mode of z unless z's precision is 0, in which case z is set exactly to the decoded value. ### func (\*Float) GobEncode 1.7 ``` func (x *Float) GobEncode() ([]byte, error) ``` GobEncode implements the gob.GobEncoder interface. The Float value and all its attributes (precision, rounding mode, accuracy) are marshaled. ### func (\*Float) Int 1.5 ``` func (x *Float) Int(z *Int) (*Int, Accuracy) ``` Int returns the result of truncating x towards zero; or nil if x is an infinity. The result is Exact if x.IsInt(); otherwise it is Below for x > 0, and Above for x < 0. If a non-nil \*Int argument z is provided, Int stores the result in z instead of allocating a new Int. ### func (\*Float) Int64 1.5 ``` func (x *Float) Int64() (int64, Accuracy) ``` Int64 returns the integer resulting from truncating x towards zero. If math.MinInt64 <= x <= math.MaxInt64, the result is Exact if x is an integer, and Above (x < 0) or Below (x > 0) otherwise. The result is (math.MinInt64, Above) for x < math.MinInt64, and (math.MaxInt64, Below) for x > math.MaxInt64. ### func (\*Float) IsInf 1.5 ``` func (x *Float) IsInf() bool ``` IsInf reports whether x is +Inf or -Inf. ### func (\*Float) IsInt 1.5 ``` func (x *Float) IsInt() bool ``` IsInt reports whether x is an integer. ±Inf values are not integers. ### func (\*Float) MantExp 1.5 ``` func (x *Float) MantExp(mant *Float) (exp int) ``` MantExp breaks x into its mantissa and exponent components and returns the exponent. If a non-nil mant argument is provided its value is set to the mantissa of x, with the same precision and rounding mode as x. The components satisfy x == mant × 2\*\*exp, with 0.5 <= |mant| < 1.0. Calling MantExp with a nil argument is an efficient way to get the exponent of the receiver. Special cases are: ``` ( ±0).MantExp(mant) = 0, with mant set to ±0 (±Inf).MantExp(mant) = 0, with mant set to ±Inf ``` x and mant may be the same in which case x is set to its mantissa value. ### func (\*Float) MarshalText 1.6 ``` func (x *Float) MarshalText() (text []byte, err error) ``` MarshalText implements the encoding.TextMarshaler interface. Only the Float value is marshaled (in full precision), other attributes such as precision or accuracy are ignored. ### func (\*Float) MinPrec 1.5 ``` func (x *Float) MinPrec() uint ``` MinPrec returns the minimum precision required to represent x exactly (i.e., the smallest prec before x.SetPrec(prec) would start rounding x). The result is 0 for |x| == 0 and |x| == Inf. ### func (\*Float) Mode 1.5 ``` func (x *Float) Mode() RoundingMode ``` Mode returns the rounding mode of x. ### func (\*Float) Mul 1.5 ``` func (z *Float) Mul(x, y *Float) *Float ``` Mul sets z to the rounded product x\*y and returns z. 
Precision, rounding, and accuracy reporting are as for Add. Mul panics with ErrNaN if one operand is zero and the other operand an infinity. The value of z is undefined in that case. ### func (\*Float) Neg 1.5 ``` func (z *Float) Neg(x *Float) *Float ``` Neg sets z to the (possibly rounded) value of x with its sign negated, and returns z. ### func (\*Float) Parse 1.5 ``` func (z *Float) Parse(s string, base int) (f *Float, b int, err error) ``` Parse parses s which must contain a text representation of a floating-point number with a mantissa in the given conversion base (the exponent is always a decimal number), or a string representing an infinite value. For base 0, an underscore character “\_” may appear between a base prefix and an adjacent digit, and between successive digits; such underscores do not change the value of the number, or the returned digit count. Incorrect placement of underscores is reported as an error if there are no other errors. If base != 0, underscores are not recognized and thus terminate scanning like any other character that is not a valid radix point or digit. It sets z to the (possibly rounded) value of the corresponding floating-point value, and returns z, the actual base b, and an error err, if any. The entire string (not just a prefix) must be consumed for success. If z's precision is 0, it is changed to 64 before rounding takes effect. The number must be of the form: ``` number = [ sign ] ( float | "inf" | "Inf" ) . sign = "+" | "-" . float = ( mantissa | prefix pmantissa ) [ exponent ] . prefix = "0" [ "b" | "B" | "o" | "O" | "x" | "X" ] . mantissa = digits "." [ digits ] | digits | "." digits . pmantissa = [ "_" ] digits "." [ digits ] | [ "_" ] digits | "." digits . exponent = ( "e" | "E" | "p" | "P" ) [ sign ] digits . digits = digit { [ "_" ] digit } . digit = "0" ... "9" | "a" ... "z" | "A" ... "Z" . ``` The base argument must be 0, 2, 8, 10, or 16. Providing an invalid base argument will lead to a run-time panic. For base 0, the number prefix determines the actual base: A prefix of “0b” or “0B” selects base 2, “0o” or “0O” selects base 8, and “0x” or “0X” selects base 16. Otherwise, the actual base is 10 and no prefix is accepted. The octal prefix "0" is not supported (a leading "0" is simply considered a "0"). A "p" or "P" exponent indicates a base 2 (rather than base 10) exponent; for instance, "0x1.fffffffffffffp1023" (using base 0) represents the maximum float64 value. For hexadecimal mantissae, the exponent character must be one of 'p' or 'P', if present (an "e" or "E" exponent indicator cannot be distinguished from a mantissa digit). The returned \*Float f is nil and the value of z is valid but not defined if an error is reported. ### func (\*Float) Prec 1.5 ``` func (x *Float) Prec() uint ``` Prec returns the mantissa precision of x in bits. The result may be 0 for |x| == 0 and |x| == Inf. ### func (\*Float) Quo 1.5 ``` func (z *Float) Quo(x, y *Float) *Float ``` Quo sets z to the rounded quotient x/y and returns z. Precision, rounding, and accuracy reporting are as for Add. Quo panics with ErrNaN if both operands are zero or infinities. The value of z is undefined in that case. ### func (\*Float) Rat 1.5 ``` func (x *Float) Rat(z *Rat) (*Rat, Accuracy) ``` Rat returns the rational number corresponding to x; or nil if x is an infinity. The result is Exact if x is not an Inf. If a non-nil \*Rat argument z is provided, Rat stores the result in z instead of allocating a new Rat.
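As a small illustrative sketch (not from the package's examples): every finite Float is a binary fraction, so converting it with Rat is always exact.

```
package main

import (
	"fmt"
	"math/big"
)

func main() {
	// 0.75 is the binary fraction 3/4; the conversion is Exact.
	r, acc := big.NewFloat(0.75).Rat(nil)
	fmt.Println(r.RatString(), acc) // 3/4 Exact

	// A provided *Rat receives the result instead of a new allocation.
	z := new(big.Rat)
	big.NewFloat(2.5).Rat(z)
	fmt.Println(z.RatString()) // 5/2
}
```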
### func (\*Float) Scan 1.8 ``` func (z *Float) Scan(s fmt.ScanState, ch rune) error ``` Scan is a support routine for fmt.Scanner; it sets z to the value of the scanned number. It accepts formats whose verbs are supported by fmt.Scan for floating point values, which are: 'b' (binary), 'e', 'E', 'f', 'F', 'g' and 'G'. Scan doesn't handle ±Inf. #### Example Code: ``` // The Scan function is rarely used directly; // the fmt package recognizes it as an implementation of fmt.Scanner. f := new(big.Float) _, err := fmt.Sscan("1.19282e99", f) if err != nil { log.Println("error scanning value:", err) } else { fmt.Println(f) } ``` Output: ``` 1.19282e+99 ``` ### func (\*Float) Set 1.5 ``` func (z *Float) Set(x *Float) *Float ``` Set sets z to the (possibly rounded) value of x and returns z. If z's precision is 0, it is changed to the precision of x before setting z (and rounding will have no effect). Rounding is performed according to z's precision and rounding mode; and z's accuracy reports the result error relative to the exact (not rounded) result. ### func (\*Float) SetFloat64 1.5 ``` func (z *Float) SetFloat64(x float64) *Float ``` SetFloat64 sets z to the (possibly rounded) value of x and returns z. If z's precision is 0, it is changed to 53 (and rounding will have no effect). SetFloat64 panics with ErrNaN if x is a NaN. ### func (\*Float) SetInf 1.5 ``` func (z *Float) SetInf(signbit bool) *Float ``` SetInf sets z to the infinite Float -Inf if signbit is set, or +Inf if signbit is not set, and returns z. The precision of z is unchanged and the result is always Exact. ### func (\*Float) SetInt 1.5 ``` func (z *Float) SetInt(x *Int) *Float ``` SetInt sets z to the (possibly rounded) value of x and returns z. If z's precision is 0, it is changed to the larger of x.BitLen() or 64 (and rounding will have no effect). ### func (\*Float) SetInt64 1.5 ``` func (z *Float) SetInt64(x int64) *Float ``` SetInt64 sets z to the (possibly rounded) value of x and returns z. If z's precision is 0, it is changed to 64 (and rounding will have no effect). ### func (\*Float) SetMantExp 1.5 ``` func (z *Float) SetMantExp(mant *Float, exp int) *Float ``` SetMantExp sets z to mant × 2\*\*exp and returns z. The result z has the same precision and rounding mode as mant. SetMantExp is an inverse of MantExp but does not require 0.5 <= |mant| < 1.0. Specifically, for a given x of type \*Float, SetMantExp relates to MantExp as follows: ``` mant := new(Float) new(Float).SetMantExp(mant, x.MantExp(mant)).Cmp(x) == 0 ``` Special cases are: ``` z.SetMantExp( ±0, exp) = ±0 z.SetMantExp(±Inf, exp) = ±Inf ``` z and mant may be the same in which case z's exponent is set to exp. ### func (\*Float) SetMode 1.5 ``` func (z *Float) SetMode(mode RoundingMode) *Float ``` SetMode sets z's rounding mode to mode and returns an exact z. z remains unchanged otherwise. z.SetMode(z.Mode()) is a cheap way to set z's accuracy to Exact. ### func (\*Float) SetPrec 1.5 ``` func (z *Float) SetPrec(prec uint) *Float ``` SetPrec sets z's precision to prec and returns the (possibly) rounded value of z. Rounding occurs according to z's rounding mode if the mantissa cannot be represented in prec bits without loss of precision. SetPrec(0) maps all finite values to ±0; infinite values remain unchanged. If prec > MaxPrec, it is set to MaxPrec. ### func (\*Float) SetRat 1.5 ``` func (z *Float) SetRat(x *Rat) *Float ``` SetRat sets z to the (possibly rounded) value of x and returns z. 
If z's precision is 0, it is changed to the largest of a.BitLen(), b.BitLen(), or 64; with x = a/b. ### func (\*Float) SetString 1.5 ``` func (z *Float) SetString(s string) (*Float, bool) ``` SetString sets z to the value of s and returns z and a boolean indicating success. s must be a floating-point number of the same format as accepted by Parse, with base argument 0. The entire string (not just a prefix) must be valid for success. If the operation failed, the value of z is undefined but the returned value is nil. #### Example Code: ``` f := new(big.Float) f.SetString("3.14159") fmt.Println(f) ``` Output: ``` 3.14159 ``` ### func (\*Float) SetUint64 1.5 ``` func (z *Float) SetUint64(x uint64) *Float ``` SetUint64 sets z to the (possibly rounded) value of x and returns z. If z's precision is 0, it is changed to 64 (and rounding will have no effect). ### func (\*Float) Sign 1.5 ``` func (x *Float) Sign() int ``` Sign returns: ``` -1 if x < 0 0 if x is ±0 +1 if x > 0 ``` ### func (\*Float) Signbit 1.5 ``` func (x *Float) Signbit() bool ``` Signbit reports whether x is negative or negative zero. ### func (\*Float) Sqrt 1.10 ``` func (z *Float) Sqrt(x *Float) *Float ``` Sqrt sets z to the rounded square root of x, and returns it. If z's precision is 0, it is changed to x's precision before the operation. Rounding is performed according to z's precision and rounding mode, but z's accuracy is not computed. Specifically, the result of z.Acc() is undefined. The function panics if x < 0. The value of z is undefined in that case. ### func (\*Float) String 1.5 ``` func (x *Float) String() string ``` String formats x like x.Text('g', 10). (String must be called explicitly, Float.Format does not support %s verb.) ### func (\*Float) Sub 1.5 ``` func (z *Float) Sub(x, y *Float) *Float ``` Sub sets z to the rounded difference x-y and returns z. Precision, rounding, and accuracy reporting are as for Add. Sub panics with ErrNaN if x and y are infinities with equal signs. The value of z is undefined in that case. ### func (\*Float) Text 1.5 ``` func (x *Float) Text(format byte, prec int) string ``` Text converts the floating-point number x to a string according to the given format and precision prec. The format is one of: ``` 'e' -d.dddde±dd, decimal exponent, at least two (possibly 0) exponent digits 'E' -d.ddddE±dd, decimal exponent, at least two (possibly 0) exponent digits 'f' -ddddd.dddd, no exponent 'g' like 'e' for large exponents, like 'f' otherwise 'G' like 'E' for large exponents, like 'f' otherwise 'x' -0xd.dddddp±dd, hexadecimal mantissa, decimal power of two exponent 'p' -0x.dddp±dd, hexadecimal mantissa, decimal power of two exponent (non-standard) 'b' -ddddddp±dd, decimal mantissa, decimal power of two exponent (non-standard) ``` For the power-of-two exponent formats, the mantissa is printed in normalized form: ``` 'x' hexadecimal mantissa in [1, 2), or 0 'p' hexadecimal mantissa in [½, 1), or 0 'b' decimal integer mantissa using x.Prec() bits, or 0 ``` Note that the 'x' form is the one used by most other languages and libraries. If format is a different character, Text returns a "%" followed by the unrecognized format character. The precision prec controls the number of digits (excluding the exponent) printed by the 'e', 'E', 'f', 'g', 'G', and 'x' formats. For 'e', 'E', 'f', and 'x', it is the number of digits after the decimal point. For 'g' and 'G' it is the total number of digits.
A negative precision selects the smallest number of decimal digits necessary to identify the value x uniquely using x.Prec() mantissa bits. The prec value is ignored for the 'b' and 'p' formats. ### func (\*Float) Uint64 1.5 ``` func (x *Float) Uint64() (uint64, Accuracy) ``` Uint64 returns the unsigned integer resulting from truncating x towards zero. If 0 <= x <= math.MaxUint64, the result is Exact if x is an integer and Below otherwise. The result is (0, Above) for x < 0, and (math.MaxUint64, Below) for x > math.MaxUint64. ### func (\*Float) UnmarshalText 1.6 ``` func (z *Float) UnmarshalText(text []byte) error ``` UnmarshalText implements the encoding.TextUnmarshaler interface. The result is rounded per the precision and rounding mode of z. If z's precision is 0, it is changed to 64 before rounding takes effect. type Int -------- An Int represents a signed multi-precision integer. The zero value for an Int represents the value 0. Operations always take pointer arguments (\*Int) rather than Int values, and each unique Int value requires its own unique \*Int pointer. To "copy" an Int value, an existing (or newly allocated) Int must be set to a new value using the Int.Set method; shallow copies of Ints are not supported and may lead to errors. ``` type Int struct { // contains filtered or unexported fields } ``` ### func NewInt ``` func NewInt(x int64) *Int ``` NewInt allocates and returns a new Int set to x. ### func (\*Int) Abs ``` func (z *Int) Abs(x *Int) *Int ``` Abs sets z to |x| (the absolute value of x) and returns z. ### func (\*Int) Add ``` func (z *Int) Add(x, y *Int) *Int ``` Add sets z to the sum x+y and returns z. ### func (\*Int) And ``` func (z *Int) And(x, y *Int) *Int ``` And sets z = x & y and returns z. ### func (\*Int) AndNot ``` func (z *Int) AndNot(x, y *Int) *Int ``` AndNot sets z = x &^ y and returns z. ### func (\*Int) Append 1.6 ``` func (x *Int) Append(buf []byte, base int) []byte ``` Append appends the string representation of x, as generated by x.Text(base), to buf and returns the extended buffer. ### func (\*Int) Binomial ``` func (z *Int) Binomial(n, k int64) *Int ``` Binomial sets z to the binomial coefficient C(n, k) and returns z. ### func (\*Int) Bit ``` func (x *Int) Bit(i int) uint ``` Bit returns the value of the i'th bit of x. That is, it returns (x>>i)&1. The bit index i must be >= 0. ### func (\*Int) BitLen ``` func (x *Int) BitLen() int ``` BitLen returns the length of the absolute value of x in bits. The bit length of 0 is 0. ### func (\*Int) Bits ``` func (x *Int) Bits() []Word ``` Bits provides raw (unchecked but fast) access to x by returning its absolute value as a little-endian Word slice. The result and x share the same underlying array. Bits is intended to support implementation of missing low-level Int functionality outside this package; it should be avoided otherwise. ### func (\*Int) Bytes ``` func (x *Int) Bytes() []byte ``` Bytes returns the absolute value of x as a big-endian byte slice. To use a fixed length slice, or a preallocated one, use FillBytes. ### func (\*Int) Cmp ``` func (x *Int) Cmp(y *Int) (r int) ``` Cmp compares x and y and returns: ``` -1 if x < y 0 if x == y +1 if x > y ``` ### func (\*Int) CmpAbs 1.10 ``` func (x *Int) CmpAbs(y *Int) int ``` CmpAbs compares the absolute values of x and y and returns: ``` -1 if |x| < |y| 0 if |x| == |y| +1 if |x| > |y| ``` ### func (\*Int) Div ``` func (z *Int) Div(x, y *Int) *Int ``` Div sets z to the quotient x/y for y != 0 and returns z. 
If y == 0, a division-by-zero run-time panic occurs. Div implements Euclidean division (unlike Go); see DivMod for more details. ### func (\*Int) DivMod ``` func (z *Int) DivMod(x, y, m *Int) (*Int, *Int) ``` DivMod sets z to the quotient x div y and m to the modulus x mod y and returns the pair (z, m) for y != 0. If y == 0, a division-by-zero run-time panic occurs. DivMod implements Euclidean division and modulus (unlike Go): ``` q = x div y such that m = x - y*q with 0 <= m < |y| ``` (See Raymond T. Boute, “The Euclidean definition of the functions div and mod”. ACM Transactions on Programming Languages and Systems (TOPLAS), 14(2):127-144, New York, NY, USA, 4/1992. ACM press.) See QuoRem for T-division and modulus (like Go). ### func (\*Int) Exp ``` func (z *Int) Exp(x, y, m *Int) *Int ``` Exp sets z = x\*\*y mod |m| (i.e. the sign of m is ignored), and returns z. If m == nil or m == 0, z = x\*\*y unless y <= 0 then z = 1. If m != 0, y < 0, and x and m are not relatively prime, z is unchanged and nil is returned. Modular exponentiation of inputs of a particular size is not a cryptographically constant-time operation. ### func (\*Int) FillBytes 1.15 ``` func (x *Int) FillBytes(buf []byte) []byte ``` FillBytes sets buf to the absolute value of x, storing it as a zero-extended big-endian byte slice, and returns buf. If the absolute value of x doesn't fit in buf, FillBytes will panic. ### func (\*Int) Format ``` func (x *Int) Format(s fmt.State, ch rune) ``` Format implements fmt.Formatter. It accepts the formats 'b' (binary), 'o' (octal with 0 prefix), 'O' (octal with 0o prefix), 'd' (decimal), 'x' (lowercase hexadecimal), and 'X' (uppercase hexadecimal). Also supported are the full suite of package fmt's format flags for integral types, including '+' and ' ' for sign control, '#' for leading zero in octal and for hexadecimal, a leading "0x" or "0X" for "%#x" and "%#X" respectively, specification of minimum digits precision, output field width, space or zero padding, and '-' for left or right justification. ### func (\*Int) GCD ``` func (z *Int) GCD(x, y, a, b *Int) *Int ``` GCD sets z to the greatest common divisor of a and b and returns z. If x or y are not nil, GCD sets their value such that z = a\*x + b\*y. a and b may be positive, zero or negative. (Before Go 1.14 both had to be > 0.) Regardless of the signs of a and b, z is always >= 0. If a == b == 0, GCD sets z = x = y = 0. If a == 0 and b != 0, GCD sets z = |b|, x = 0, y = sign(b) \* 1. If a != 0 and b == 0, GCD sets z = |a|, x = sign(a) \* 1, y = 0. ### func (\*Int) GobDecode ``` func (z *Int) GobDecode(buf []byte) error ``` GobDecode implements the gob.GobDecoder interface. ### func (\*Int) GobEncode ``` func (x *Int) GobEncode() ([]byte, error) ``` GobEncode implements the gob.GobEncoder interface. ### func (\*Int) Int64 ``` func (x *Int) Int64() int64 ``` Int64 returns the int64 representation of x. If x cannot be represented in an int64, the result is undefined. ### func (\*Int) IsInt64 1.9 ``` func (x *Int) IsInt64() bool ``` IsInt64 reports whether x can be represented as an int64. ### func (\*Int) IsUint64 1.9 ``` func (x *Int) IsUint64() bool ``` IsUint64 reports whether x can be represented as a uint64. ### func (\*Int) Lsh ``` func (z *Int) Lsh(x *Int, n uint) *Int ``` Lsh sets z = x << n and returns z. ### func (\*Int) MarshalJSON 1.1 ``` func (x *Int) MarshalJSON() ([]byte, error) ``` MarshalJSON implements the json.Marshaler interface. 
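Because \*Int implements json.Marshaler (and json.Unmarshaler, described below), values round-trip through encoding/json as plain JSON numbers with no loss of precision. The following sketch is illustrative only; the Account type and its field are made up for the example.

```
package main

import (
	"encoding/json"
	"fmt"
	"math/big"
)

// Account is a hypothetical type used only for this illustration.
type Account struct {
	Balance *big.Int `json:"balance"`
}

func main() {
	in := Account{Balance: new(big.Int).Lsh(big.NewInt(1), 100)} // 2**100
	b, err := json.Marshal(in)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b)) // {"balance":1267650600228229401496703205376}

	var out Account
	if err := json.Unmarshal(b, &out); err != nil {
		panic(err)
	}
	fmt.Println(in.Balance.Cmp(out.Balance) == 0) // true
}
```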
### func (\*Int) MarshalText 1.3 ``` func (x *Int) MarshalText() (text []byte, err error) ``` MarshalText implements the encoding.TextMarshaler interface. ### func (\*Int) Mod ``` func (z *Int) Mod(x, y *Int) *Int ``` Mod sets z to the modulus x%y for y != 0 and returns z. If y == 0, a division-by-zero run-time panic occurs. Mod implements Euclidean modulus (unlike Go); see DivMod for more details. ### func (\*Int) ModInverse ``` func (z *Int) ModInverse(g, n *Int) *Int ``` ModInverse sets z to the multiplicative inverse of g in the ring ℤ/nℤ and returns z. If g and n are not relatively prime, g has no multiplicative inverse in the ring ℤ/nℤ. In this case, z is unchanged and the return value is nil. If n == 0, a division-by-zero run-time panic occurs. ### func (\*Int) ModSqrt 1.5 ``` func (z *Int) ModSqrt(x, p *Int) *Int ``` ModSqrt sets z to a square root of x mod p if such a square root exists, and returns z. The modulus p must be an odd prime. If x is not a square mod p, ModSqrt leaves z unchanged and returns nil. This function panics if p is not an odd integer, its behavior is undefined if p is odd but not prime. ### func (\*Int) Mul ``` func (z *Int) Mul(x, y *Int) *Int ``` Mul sets z to the product x\*y and returns z. ### func (\*Int) MulRange ``` func (z *Int) MulRange(a, b int64) *Int ``` MulRange sets z to the product of all integers in the range [a, b] inclusively and returns z. If a > b (empty range), the result is 1. ### func (\*Int) Neg ``` func (z *Int) Neg(x *Int) *Int ``` Neg sets z to -x and returns z. ### func (\*Int) Not ``` func (z *Int) Not(x *Int) *Int ``` Not sets z = ^x and returns z. ### func (\*Int) Or ``` func (z *Int) Or(x, y *Int) *Int ``` Or sets z = x | y and returns z. ### func (\*Int) ProbablyPrime ``` func (x *Int) ProbablyPrime(n int) bool ``` ProbablyPrime reports whether x is probably prime, applying the Miller-Rabin test with n pseudorandomly chosen bases as well as a Baillie-PSW test. If x is prime, ProbablyPrime returns true. If x is chosen randomly and not prime, ProbablyPrime probably returns false. The probability of returning true for a randomly chosen non-prime is at most ¼ⁿ. ProbablyPrime is 100% accurate for inputs less than 2⁶⁴. See Menezes et al., Handbook of Applied Cryptography, 1997, pp. 145-149, and FIPS 186-4 Appendix F for further discussion of the error probabilities. ProbablyPrime is not suitable for judging primes that an adversary may have crafted to fool the test. As of Go 1.8, ProbablyPrime(0) is allowed and applies only a Baillie-PSW test. Before Go 1.8, ProbablyPrime applied only the Miller-Rabin tests, and ProbablyPrime(0) panicked. ### func (\*Int) Quo ``` func (z *Int) Quo(x, y *Int) *Int ``` Quo sets z to the quotient x/y for y != 0 and returns z. If y == 0, a division-by-zero run-time panic occurs. Quo implements truncated division (like Go); see QuoRem for more details. ### func (\*Int) QuoRem ``` func (z *Int) QuoRem(x, y, r *Int) (*Int, *Int) ``` QuoRem sets z to the quotient x/y and r to the remainder x%y and returns the pair (z, r) for y != 0. If y == 0, a division-by-zero run-time panic occurs. QuoRem implements T-division and modulus (like Go): ``` q = x/y with the result truncated to zero r = x - y*q ``` (See Daan Leijen, “Division and Modulus for Computer Scientists”.) See DivMod for Euclidean division and modulus (unlike Go). ### func (\*Int) Rand ``` func (z *Int) Rand(rnd *rand.Rand, n *Int) *Int ``` Rand sets z to a pseudo-random number in [0, n) and returns z. 
As this uses the math/rand package, it must not be used for security-sensitive work. Use crypto/rand.Int instead. ### func (\*Int) Rem ``` func (z *Int) Rem(x, y *Int) *Int ``` Rem sets z to the remainder x%y for y != 0 and returns z. If y == 0, a division-by-zero run-time panic occurs. Rem implements truncated modulus (like Go); see QuoRem for more details. ### func (\*Int) Rsh ``` func (z *Int) Rsh(x *Int, n uint) *Int ``` Rsh sets z = x >> n and returns z. ### func (\*Int) Scan ``` func (z *Int) Scan(s fmt.ScanState, ch rune) error ``` Scan is a support routine for fmt.Scanner; it sets z to the value of the scanned number. It accepts the formats 'b' (binary), 'o' (octal), 'd' (decimal), 'x' (lowercase hexadecimal), and 'X' (uppercase hexadecimal). #### Example Code: ``` // The Scan function is rarely used directly; // the fmt package recognizes it as an implementation of fmt.Scanner. i := new(big.Int) _, err := fmt.Sscan("18446744073709551617", i) if err != nil { log.Println("error scanning value:", err) } else { fmt.Println(i) } ``` Output: ``` 18446744073709551617 ``` ### func (\*Int) Set ``` func (z *Int) Set(x *Int) *Int ``` Set sets z to x and returns z. ### func (\*Int) SetBit ``` func (z *Int) SetBit(x *Int, i int, b uint) *Int ``` SetBit sets z to x, with x's i'th bit set to b (0 or 1). That is, if b is 1 SetBit sets z = x | (1 << i); if b is 0 SetBit sets z = x &^ (1 << i). If b is not 0 or 1, SetBit will panic. ### func (\*Int) SetBits ``` func (z *Int) SetBits(abs []Word) *Int ``` SetBits provides raw (unchecked but fast) access to z by setting its value to abs, interpreted as a little-endian Word slice, and returning z. The result and abs share the same underlying array. SetBits is intended to support implementation of missing low-level Int functionality outside this package; it should be avoided otherwise. ### func (\*Int) SetBytes ``` func (z *Int) SetBytes(buf []byte) *Int ``` SetBytes interprets buf as the bytes of a big-endian unsigned integer, sets z to that value, and returns z. ### func (\*Int) SetInt64 ``` func (z *Int) SetInt64(x int64) *Int ``` SetInt64 sets z to x and returns z. ### func (\*Int) SetString ``` func (z *Int) SetString(s string, base int) (*Int, bool) ``` SetString sets z to the value of s, interpreted in the given base, and returns z and a boolean indicating success. The entire string (not just a prefix) must be valid for success. If SetString fails, the value of z is undefined but the returned value is nil. The base argument must be 0 or a value between 2 and MaxBase. For base 0, the number prefix determines the actual base: A prefix of “0b” or “0B” selects base 2, “0”, “0o” or “0O” selects base 8, and “0x” or “0X” selects base 16. Otherwise, the selected base is 10 and no prefix is accepted. For bases <= 36, lower and upper case letters are considered the same: The letters 'a' to 'z' and 'A' to 'Z' represent digit values 10 to 35. For bases > 36, the upper case letters 'A' to 'Z' represent the digit values 36 to 61. For base 0, an underscore character “\_” may appear between a base prefix and an adjacent digit, and between successive digits; such underscores do not change the value of the number. Incorrect placement of underscores is reported as an error if there are no other errors. If base != 0, underscores are not recognized and act like any other character that is not a valid digit. 
#### Example Code: ``` i := new(big.Int) i.SetString("644", 8) // octal fmt.Println(i) ``` Output: ``` 420 ``` ### func (\*Int) SetUint64 1.1 ``` func (z *Int) SetUint64(x uint64) *Int ``` SetUint64 sets z to x and returns z. ### func (\*Int) Sign ``` func (x *Int) Sign() int ``` Sign returns: ``` -1 if x < 0 0 if x == 0 +1 if x > 0 ``` ### func (\*Int) Sqrt 1.8 ``` func (z *Int) Sqrt(x *Int) *Int ``` Sqrt sets z to ⌊√x⌋, the largest integer such that z² ≤ x, and returns z. It panics if x is negative. ### func (\*Int) String ``` func (x *Int) String() string ``` String returns the decimal representation of x as generated by x.Text(10). ### func (\*Int) Sub ``` func (z *Int) Sub(x, y *Int) *Int ``` Sub sets z to the difference x-y and returns z. ### func (\*Int) Text 1.6 ``` func (x *Int) Text(base int) string ``` Text returns the string representation of x in the given base. Base must be between 2 and 62, inclusive. The result uses the lower-case letters 'a' to 'z' for digit values 10 to 35, and the upper-case letters 'A' to 'Z' for digit values 36 to 61. No prefix (such as "0x") is added to the string. If x is a nil pointer it returns "<nil>". ### func (\*Int) TrailingZeroBits 1.13 ``` func (x *Int) TrailingZeroBits() uint ``` TrailingZeroBits returns the number of consecutive least significant zero bits of |x|. ### func (\*Int) Uint64 1.1 ``` func (x *Int) Uint64() uint64 ``` Uint64 returns the uint64 representation of x. If x cannot be represented in a uint64, the result is undefined. ### func (\*Int) UnmarshalJSON 1.1 ``` func (z *Int) UnmarshalJSON(text []byte) error ``` UnmarshalJSON implements the json.Unmarshaler interface. ### func (\*Int) UnmarshalText 1.3 ``` func (z *Int) UnmarshalText(text []byte) error ``` UnmarshalText implements the encoding.TextUnmarshaler interface. ### func (\*Int) Xor ``` func (z *Int) Xor(x, y *Int) *Int ``` Xor sets z = x ^ y and returns z. type Rat -------- A Rat represents a quotient a/b of arbitrary precision. The zero value for a Rat represents the value 0. Operations always take pointer arguments (\*Rat) rather than Rat values, and each unique Rat value requires its own unique \*Rat pointer. To "copy" a Rat value, an existing (or newly allocated) Rat must be set to a new value using the Rat.Set method; shallow copies of Rats are not supported and may lead to errors. ``` type Rat struct { // contains filtered or unexported fields } ``` ### func NewRat ``` func NewRat(a, b int64) *Rat ``` NewRat creates a new Rat with numerator a and denominator b. ### func (\*Rat) Abs ``` func (z *Rat) Abs(x *Rat) *Rat ``` Abs sets z to |x| (the absolute value of x) and returns z. ### func (\*Rat) Add ``` func (z *Rat) Add(x, y *Rat) *Rat ``` Add sets z to the sum x+y and returns z. ### func (\*Rat) Cmp ``` func (x *Rat) Cmp(y *Rat) int ``` Cmp compares x and y and returns: ``` -1 if x < y 0 if x == y +1 if x > y ``` ### func (\*Rat) Denom ``` func (x *Rat) Denom() *Int ``` Denom returns the denominator of x; it is always > 0. The result is a reference to x's denominator, unless x is an uninitialized (zero value) Rat, in which case the result is a new Int of value 1. (To initialize x, any operation that sets x will do, including x.Set(x).) If the result is a reference to x's denominator it may change if a new value is assigned to x, and vice versa. ### func (\*Rat) Float32 1.4 ``` func (x *Rat) Float32() (f float32, exact bool) ``` Float32 returns the nearest float32 value for x and a bool indicating whether f represents x exactly. 
If the magnitude of x is too large to be represented by a float32, f is an infinity and exact is false. The sign of f always matches the sign of x, even if f == 0. ### func (\*Rat) Float64 1.1 ``` func (x *Rat) Float64() (f float64, exact bool) ``` Float64 returns the nearest float64 value for x and a bool indicating whether f represents x exactly. If the magnitude of x is too large to be represented by a float64, f is an infinity and exact is false. The sign of f always matches the sign of x, even if f == 0. ### func (\*Rat) FloatString ``` func (x *Rat) FloatString(prec int) string ``` FloatString returns a string representation of x in decimal form with prec digits of precision after the radix point. The last digit is rounded to nearest, with halves rounded away from zero. ### func (\*Rat) GobDecode ``` func (z *Rat) GobDecode(buf []byte) error ``` GobDecode implements the gob.GobDecoder interface. ### func (\*Rat) GobEncode ``` func (x *Rat) GobEncode() ([]byte, error) ``` GobEncode implements the gob.GobEncoder interface. ### func (\*Rat) Inv ``` func (z *Rat) Inv(x *Rat) *Rat ``` Inv sets z to 1/x and returns z. If x == 0, Inv panics. ### func (\*Rat) IsInt ``` func (x *Rat) IsInt() bool ``` IsInt reports whether the denominator of x is 1. ### func (\*Rat) MarshalText 1.3 ``` func (x *Rat) MarshalText() (text []byte, err error) ``` MarshalText implements the encoding.TextMarshaler interface. ### func (\*Rat) Mul ``` func (z *Rat) Mul(x, y *Rat) *Rat ``` Mul sets z to the product x\*y and returns z. ### func (\*Rat) Neg ``` func (z *Rat) Neg(x *Rat) *Rat ``` Neg sets z to -x and returns z. ### func (\*Rat) Num ``` func (x *Rat) Num() *Int ``` Num returns the numerator of x; it may be <= 0. The result is a reference to x's numerator; it may change if a new value is assigned to x, and vice versa. The sign of the numerator corresponds to the sign of x. ### func (\*Rat) Quo ``` func (z *Rat) Quo(x, y *Rat) *Rat ``` Quo sets z to the quotient x/y and returns z. If y == 0, Quo panics. ### func (\*Rat) RatString ``` func (x *Rat) RatString() string ``` RatString returns a string representation of x in the form "a/b" if b != 1, and in the form "a" if b == 1. ### func (\*Rat) Scan ``` func (z *Rat) Scan(s fmt.ScanState, ch rune) error ``` Scan is a support routine for fmt.Scanner. It accepts the formats 'e', 'E', 'f', 'F', 'g', 'G', and 'v'. All formats are equivalent. #### Example Code: ``` // The Scan function is rarely used directly; // the fmt package recognizes it as an implementation of fmt.Scanner. r := new(big.Rat) _, err := fmt.Sscan("1.5000", r) if err != nil { log.Println("error scanning value:", err) } else { fmt.Println(r) } ``` Output: ``` 3/2 ``` ### func (\*Rat) Set ``` func (z *Rat) Set(x *Rat) *Rat ``` Set sets z to x (by making a copy of x) and returns z. ### func (\*Rat) SetFloat64 1.1 ``` func (z *Rat) SetFloat64(f float64) *Rat ``` SetFloat64 sets z to exactly f and returns z. If f is not finite, SetFloat64 returns nil. ### func (\*Rat) SetFrac ``` func (z *Rat) SetFrac(a, b *Int) *Rat ``` SetFrac sets z to a/b and returns z. If b == 0, SetFrac panics. ### func (\*Rat) SetFrac64 ``` func (z *Rat) SetFrac64(a, b int64) *Rat ``` SetFrac64 sets z to a/b and returns z. If b == 0, SetFrac64 panics. ### func (\*Rat) SetInt ``` func (z *Rat) SetInt(x *Int) *Rat ``` SetInt sets z to x (by making a copy of x) and returns z. ### func (\*Rat) SetInt64 ``` func (z *Rat) SetInt64(x int64) *Rat ``` SetInt64 sets z to x and returns z.
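As an illustrative sketch (not one of the package's examples), Rat arithmetic stays exact where float64 arithmetic accumulates rounding error:

```
package main

import (
	"fmt"
	"math/big"
)

func main() {
	// Adding 1/10 ten times with Rat yields exactly 1.
	sum := new(big.Rat)
	tenth := big.NewRat(1, 10)
	for i := 0; i < 10; i++ {
		sum.Add(sum, tenth)
	}
	fmt.Println(sum.RatString(), sum.Cmp(big.NewRat(1, 1)) == 0) // 1 true

	// The same loop over float64 does not reach 1.0 exactly.
	f := 0.0
	for i := 0; i < 10; i++ {
		f += 0.1
	}
	fmt.Println(f == 1.0) // false
}
```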
### func (\*Rat) SetString ``` func (z *Rat) SetString(s string) (*Rat, bool) ``` SetString sets z to the value of s and returns z and a boolean indicating success. s can be given as a (possibly signed) fraction "a/b", or as a floating-point number optionally followed by an exponent. If a fraction is provided, both the dividend and the divisor may be a decimal integer or independently use a prefix of “0b”, “0” or “0o”, or “0x” (or their upper-case variants) to denote a binary, octal, or hexadecimal integer, respectively. The divisor may not be signed. If a floating-point number is provided, it may be in decimal form or use any of the same prefixes as above but for “0” to denote a non-decimal mantissa. A leading “0” is considered a decimal leading 0; it does not indicate octal representation in this case. An optional base-10 “e” or base-2 “p” (or their upper-case variants) exponent may be provided as well, except for hexadecimal floats which only accept an (optional) “p” exponent (because an “e” or “E” cannot be distinguished from a mantissa digit). If the exponent's absolute value is too large, the operation may fail. The entire string, not just a prefix, must be valid for success. If the operation failed, the value of z is undefined but the returned value is nil. #### Example Code: ``` r := new(big.Rat) r.SetString("355/113") fmt.Println(r.FloatString(3)) ``` Output: ``` 3.142 ``` ### func (\*Rat) SetUint64 1.13 ``` func (z *Rat) SetUint64(x uint64) *Rat ``` SetUint64 sets z to x and returns z. ### func (\*Rat) Sign ``` func (x *Rat) Sign() int ``` Sign returns: ``` -1 if x < 0 0 if x == 0 +1 if x > 0 ``` ### func (\*Rat) String ``` func (x *Rat) String() string ``` String returns a string representation of x in the form "a/b" (even if b == 1). ### func (\*Rat) Sub ``` func (z *Rat) Sub(x, y *Rat) *Rat ``` Sub sets z to the difference x-y and returns z. ### func (\*Rat) UnmarshalText 1.3 ``` func (z *Rat) UnmarshalText(text []byte) error ``` UnmarshalText implements the encoding.TextUnmarshaler interface. type RoundingMode 1.5 --------------------- RoundingMode determines how a Float value is rounded to the desired precision. Rounding may change the Float value; the rounding error is described by the Float's Accuracy. ``` type RoundingMode byte ``` These constants define supported rounding modes. 
``` const ( ToNearestEven RoundingMode = iota // == IEEE 754-2008 roundTiesToEven ToNearestAway // == IEEE 754-2008 roundTiesToAway ToZero // == IEEE 754-2008 roundTowardZero AwayFromZero // no IEEE 754-2008 equivalent ToNegativeInf // == IEEE 754-2008 roundTowardNegative ToPositiveInf // == IEEE 754-2008 roundTowardPositive ) ``` #### Example Code: ``` operands := []float64{2.6, 2.5, 2.1, -2.1, -2.5, -2.6} fmt.Print(" x") for mode := big.ToNearestEven; mode <= big.ToPositiveInf; mode++ { fmt.Printf(" %s", mode) } fmt.Println() for _, f64 := range operands { fmt.Printf("%4g", f64) for mode := big.ToNearestEven; mode <= big.ToPositiveInf; mode++ { // sample operands above require 2 bits to represent mantissa // set binary precision to 2 to round them to integer values f := new(big.Float).SetPrec(2).SetMode(mode).SetFloat64(f64) fmt.Printf(" %*g", len(mode.String()), f) } fmt.Println() } ``` Output: ``` x ToNearestEven ToNearestAway ToZero AwayFromZero ToNegativeInf ToPositiveInf 2.6 3 3 2 3 2 3 2.5 2 3 2 3 2 3 2.1 2 2 2 3 2 3 -2.1 -2 -2 -2 -3 -3 -2 -2.5 -2 -3 -2 -3 -3 -2 -2.6 -3 -3 -2 -3 -3 -2 ``` ### func (RoundingMode) String 1.5 ``` func (i RoundingMode) String() string ``` type Word --------- A Word represents a single digit of a multi-precision unsigned integer. ``` type Word uint ```
go Package bits Package bits ============= * `import "math/bits"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package bits implements bit counting and manipulation functions for the predeclared unsigned integer types. Functions in this package may be implemented directly by the compiler, for better performance. For those functions the code in this package will not be used. Which functions are implemented by the compiler depends on the architecture and the Go release. Index ----- * [Constants](#pkg-constants) * [func Add(x, y, carry uint) (sum, carryOut uint)](#Add) * [func Add32(x, y, carry uint32) (sum, carryOut uint32)](#Add32) * [func Add64(x, y, carry uint64) (sum, carryOut uint64)](#Add64) * [func Div(hi, lo, y uint) (quo, rem uint)](#Div) * [func Div32(hi, lo, y uint32) (quo, rem uint32)](#Div32) * [func Div64(hi, lo, y uint64) (quo, rem uint64)](#Div64) * [func LeadingZeros(x uint) int](#LeadingZeros) * [func LeadingZeros16(x uint16) int](#LeadingZeros16) * [func LeadingZeros32(x uint32) int](#LeadingZeros32) * [func LeadingZeros64(x uint64) int](#LeadingZeros64) * [func LeadingZeros8(x uint8) int](#LeadingZeros8) * [func Len(x uint) int](#Len) * [func Len16(x uint16) (n int)](#Len16) * [func Len32(x uint32) (n int)](#Len32) * [func Len64(x uint64) (n int)](#Len64) * [func Len8(x uint8) int](#Len8) * [func Mul(x, y uint) (hi, lo uint)](#Mul) * [func Mul32(x, y uint32) (hi, lo uint32)](#Mul32) * [func Mul64(x, y uint64) (hi, lo uint64)](#Mul64) * [func OnesCount(x uint) int](#OnesCount) * [func OnesCount16(x uint16) int](#OnesCount16) * [func OnesCount32(x uint32) int](#OnesCount32) * [func OnesCount64(x uint64) int](#OnesCount64) * [func OnesCount8(x uint8) int](#OnesCount8) * [func Rem(hi, lo, y uint) uint](#Rem) * [func Rem32(hi, lo, y uint32) uint32](#Rem32) * [func Rem64(hi, lo, y uint64) uint64](#Rem64) * [func Reverse(x uint) uint](#Reverse) * [func Reverse16(x uint16) uint16](#Reverse16) * [func Reverse32(x uint32) uint32](#Reverse32) * [func Reverse64(x uint64) uint64](#Reverse64) * [func Reverse8(x uint8) uint8](#Reverse8) * [func ReverseBytes(x uint) uint](#ReverseBytes) * [func ReverseBytes16(x uint16) uint16](#ReverseBytes16) * [func ReverseBytes32(x uint32) uint32](#ReverseBytes32) * [func ReverseBytes64(x uint64) uint64](#ReverseBytes64) * [func RotateLeft(x uint, k int) uint](#RotateLeft) * [func RotateLeft16(x uint16, k int) uint16](#RotateLeft16) * [func RotateLeft32(x uint32, k int) uint32](#RotateLeft32) * [func RotateLeft64(x uint64, k int) uint64](#RotateLeft64) * [func RotateLeft8(x uint8, k int) uint8](#RotateLeft8) * [func Sub(x, y, borrow uint) (diff, borrowOut uint)](#Sub) * [func Sub32(x, y, borrow uint32) (diff, borrowOut uint32)](#Sub32) * [func Sub64(x, y, borrow uint64) (diff, borrowOut uint64)](#Sub64) * [func TrailingZeros(x uint) int](#TrailingZeros) * [func TrailingZeros16(x uint16) int](#TrailingZeros16) * [func TrailingZeros32(x uint32) int](#TrailingZeros32) * [func TrailingZeros64(x uint64) int](#TrailingZeros64) * [func TrailingZeros8(x uint8) int](#TrailingZeros8) ### Examples [Add32](#example_Add32) [Add64](#example_Add64) [Div32](#example_Div32) [Div64](#example_Div64) [LeadingZeros16](#example_LeadingZeros16) [LeadingZeros32](#example_LeadingZeros32) [LeadingZeros64](#example_LeadingZeros64) [LeadingZeros8](#example_LeadingZeros8) [Len16](#example_Len16) [Len32](#example_Len32) [Len64](#example_Len64) [Len8](#example_Len8) [Mul32](#example_Mul32) [Mul64](#example_Mul64) 
[OnesCount](#example_OnesCount) [OnesCount16](#example_OnesCount16) [OnesCount32](#example_OnesCount32) [OnesCount64](#example_OnesCount64) [OnesCount8](#example_OnesCount8) [Reverse16](#example_Reverse16) [Reverse32](#example_Reverse32) [Reverse64](#example_Reverse64) [Reverse8](#example_Reverse8) [ReverseBytes16](#example_ReverseBytes16) [ReverseBytes32](#example_ReverseBytes32) [ReverseBytes64](#example_ReverseBytes64) [RotateLeft16](#example_RotateLeft16) [RotateLeft32](#example_RotateLeft32) [RotateLeft64](#example_RotateLeft64) [RotateLeft8](#example_RotateLeft8) [Sub32](#example_Sub32) [Sub64](#example_Sub64) [TrailingZeros16](#example_TrailingZeros16) [TrailingZeros32](#example_TrailingZeros32) [TrailingZeros64](#example_TrailingZeros64) [TrailingZeros8](#example_TrailingZeros8) ### Package files bits.go bits\_errors.go bits\_tables.go Constants --------- UintSize is the size of a uint in bits. ``` const UintSize = uintSize ``` func Add 1.12 ------------- ``` func Add(x, y, carry uint) (sum, carryOut uint) ``` Add returns the sum with carry of x, y and carry: sum = x + y + carry. The carry input must be 0 or 1; otherwise the behavior is undefined. The carryOut output is guaranteed to be 0 or 1. This function's execution time does not depend on the inputs. func Add32 1.12 --------------- ``` func Add32(x, y, carry uint32) (sum, carryOut uint32) ``` Add32 returns the sum with carry of x, y and carry: sum = x + y + carry. The carry input must be 0 or 1; otherwise the behavior is undefined. The carryOut output is guaranteed to be 0 or 1. This function's execution time does not depend on the inputs. #### Example Code: ``` // First number is 33<<32 + 12 n1 := []uint32{33, 12} // Second number is 21<<32 + 23 n2 := []uint32{21, 23} // Add them together without producing carry. d1, carry := bits.Add32(n1[1], n2[1], 0) d0, _ := bits.Add32(n1[0], n2[0], carry) nsum := []uint32{d0, d1} fmt.Printf("%v + %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) // First number is 1<<32 + 2147483648 n1 = []uint32{1, 0x80000000} // Second number is 1<<32 + 2147483648 n2 = []uint32{1, 0x80000000} // Add them together producing carry. d1, carry = bits.Add32(n1[1], n2[1], 0) d0, _ = bits.Add32(n1[0], n2[0], carry) nsum = []uint32{d0, d1} fmt.Printf("%v + %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) ``` Output: ``` [33 12] + [21 23] = [54 35] (carry bit was 0) [1 2147483648] + [1 2147483648] = [3 0] (carry bit was 1) ``` func Add64 1.12 --------------- ``` func Add64(x, y, carry uint64) (sum, carryOut uint64) ``` Add64 returns the sum with carry of x, y and carry: sum = x + y + carry. The carry input must be 0 or 1; otherwise the behavior is undefined. The carryOut output is guaranteed to be 0 or 1. This function's execution time does not depend on the inputs. #### Example Code: ``` // First number is 33<<64 + 12 n1 := []uint64{33, 12} // Second number is 21<<64 + 23 n2 := []uint64{21, 23} // Add them together without producing carry. d1, carry := bits.Add64(n1[1], n2[1], 0) d0, _ := bits.Add64(n1[0], n2[0], carry) nsum := []uint64{d0, d1} fmt.Printf("%v + %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) // First number is 1<<64 + 9223372036854775808 n1 = []uint64{1, 0x8000000000000000} // Second number is 1<<64 + 9223372036854775808 n2 = []uint64{1, 0x8000000000000000} // Add them together producing carry. 
d1, carry = bits.Add64(n1[1], n2[1], 0) d0, _ = bits.Add64(n1[0], n2[0], carry) nsum = []uint64{d0, d1} fmt.Printf("%v + %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) ``` Output: ``` [33 12] + [21 23] = [54 35] (carry bit was 0) [1 9223372036854775808] + [1 9223372036854775808] = [3 0] (carry bit was 1) ``` func Div 1.12 ------------- ``` func Div(hi, lo, y uint) (quo, rem uint) ``` Div returns the quotient and remainder of (hi, lo) divided by y: quo = (hi, lo)/y, rem = (hi, lo)%y with the dividend bits' upper half in parameter hi and the lower half in parameter lo. Div panics for y == 0 (division by zero) or y <= hi (quotient overflow). func Div32 1.12 --------------- ``` func Div32(hi, lo, y uint32) (quo, rem uint32) ``` Div32 returns the quotient and remainder of (hi, lo) divided by y: quo = (hi, lo)/y, rem = (hi, lo)%y with the dividend bits' upper half in parameter hi and the lower half in parameter lo. Div32 panics for y == 0 (division by zero) or y <= hi (quotient overflow). #### Example Code: ``` // First number is 0<<32 + 6 n1 := []uint32{0, 6} // Second number is 0<<32 + 3 n2 := []uint32{0, 3} // Divide them together. quo, rem := bits.Div32(n1[0], n1[1], n2[1]) nsum := []uint32{quo, rem} fmt.Printf("[%v %v] / %v = %v\n", n1[0], n1[1], n2[1], nsum) // First number is 2<<32 + 2147483648 n1 = []uint32{2, 0x80000000} // Second number is 0<<32 + 2147483648 n2 = []uint32{0, 0x80000000} // Divide them together. quo, rem = bits.Div32(n1[0], n1[1], n2[1]) nsum = []uint32{quo, rem} fmt.Printf("[%v %v] / %v = %v\n", n1[0], n1[1], n2[1], nsum) ``` Output: ``` [0 6] / 3 = [2 0] [2 2147483648] / 2147483648 = [5 0] ``` func Div64 1.12 --------------- ``` func Div64(hi, lo, y uint64) (quo, rem uint64) ``` Div64 returns the quotient and remainder of (hi, lo) divided by y: quo = (hi, lo)/y, rem = (hi, lo)%y with the dividend bits' upper half in parameter hi and the lower half in parameter lo. Div64 panics for y == 0 (division by zero) or y <= hi (quotient overflow). #### Example Code: ``` // First number is 0<<64 + 6 n1 := []uint64{0, 6} // Second number is 0<<64 + 3 n2 := []uint64{0, 3} // Divide them together. quo, rem := bits.Div64(n1[0], n1[1], n2[1]) nsum := []uint64{quo, rem} fmt.Printf("[%v %v] / %v = %v\n", n1[0], n1[1], n2[1], nsum) // First number is 2<<64 + 9223372036854775808 n1 = []uint64{2, 0x8000000000000000} // Second number is 0<<64 + 9223372036854775808 n2 = []uint64{0, 0x8000000000000000} // Divide them together. quo, rem = bits.Div64(n1[0], n1[1], n2[1]) nsum = []uint64{quo, rem} fmt.Printf("[%v %v] / %v = %v\n", n1[0], n1[1], n2[1], nsum) ``` Output: ``` [0 6] / 3 = [2 0] [2 9223372036854775808] / 9223372036854775808 = [5 0] ``` func LeadingZeros 1.9 --------------------- ``` func LeadingZeros(x uint) int ``` LeadingZeros returns the number of leading zero bits in x; the result is UintSize for x == 0. func LeadingZeros16 1.9 ----------------------- ``` func LeadingZeros16(x uint16) int ``` LeadingZeros16 returns the number of leading zero bits in x; the result is 16 for x == 0. #### Example Code: ``` fmt.Printf("LeadingZeros16(%016b) = %d\n", 1, bits.LeadingZeros16(1)) ``` Output: ``` LeadingZeros16(0000000000000001) = 15 ``` func LeadingZeros32 1.9 ----------------------- ``` func LeadingZeros32(x uint32) int ``` LeadingZeros32 returns the number of leading zero bits in x; the result is 32 for x == 0. 
#### Example Code: ``` fmt.Printf("LeadingZeros32(%032b) = %d\n", 1, bits.LeadingZeros32(1)) ``` Output: ``` LeadingZeros32(00000000000000000000000000000001) = 31 ``` func LeadingZeros64 1.9 ----------------------- ``` func LeadingZeros64(x uint64) int ``` LeadingZeros64 returns the number of leading zero bits in x; the result is 64 for x == 0. #### Example Code: ``` fmt.Printf("LeadingZeros64(%064b) = %d\n", 1, bits.LeadingZeros64(1)) ``` Output: ``` LeadingZeros64(0000000000000000000000000000000000000000000000000000000000000001) = 63 ``` func LeadingZeros8 1.9 ---------------------- ``` func LeadingZeros8(x uint8) int ``` LeadingZeros8 returns the number of leading zero bits in x; the result is 8 for x == 0. #### Example Code: ``` fmt.Printf("LeadingZeros8(%08b) = %d\n", 1, bits.LeadingZeros8(1)) ``` Output: ``` LeadingZeros8(00000001) = 7 ``` func Len 1.9 ------------ ``` func Len(x uint) int ``` Len returns the minimum number of bits required to represent x; the result is 0 for x == 0. func Len16 1.9 -------------- ``` func Len16(x uint16) (n int) ``` Len16 returns the minimum number of bits required to represent x; the result is 0 for x == 0. #### Example Code: ``` fmt.Printf("Len16(%016b) = %d\n", 8, bits.Len16(8)) ``` Output: ``` Len16(0000000000001000) = 4 ``` func Len32 1.9 -------------- ``` func Len32(x uint32) (n int) ``` Len32 returns the minimum number of bits required to represent x; the result is 0 for x == 0. #### Example Code: ``` fmt.Printf("Len32(%032b) = %d\n", 8, bits.Len32(8)) ``` Output: ``` Len32(00000000000000000000000000001000) = 4 ``` func Len64 1.9 -------------- ``` func Len64(x uint64) (n int) ``` Len64 returns the minimum number of bits required to represent x; the result is 0 for x == 0. #### Example Code: ``` fmt.Printf("Len64(%064b) = %d\n", 8, bits.Len64(8)) ``` Output: ``` Len64(0000000000000000000000000000000000000000000000000000000000001000) = 4 ``` func Len8 1.9 ------------- ``` func Len8(x uint8) int ``` Len8 returns the minimum number of bits required to represent x; the result is 0 for x == 0. #### Example Code: ``` fmt.Printf("Len8(%08b) = %d\n", 8, bits.Len8(8)) ``` Output: ``` Len8(00001000) = 4 ``` func Mul 1.12 ------------- ``` func Mul(x, y uint) (hi, lo uint) ``` Mul returns the full-width product of x and y: (hi, lo) = x \* y with the product bits' upper half returned in hi and the lower half returned in lo. This function's execution time does not depend on the inputs. func Mul32 1.12 --------------- ``` func Mul32(x, y uint32) (hi, lo uint32) ``` Mul32 returns the 64-bit product of x and y: (hi, lo) = x \* y with the product bits' upper half returned in hi and the lower half returned in lo. This function's execution time does not depend on the inputs. #### Example Code: ``` // First number is 0<<32 + 12 n1 := []uint32{0, 12} // Second number is 0<<32 + 12 n2 := []uint32{0, 12} // Multiply them together without producing overflow. hi, lo := bits.Mul32(n1[1], n2[1]) nsum := []uint32{hi, lo} fmt.Printf("%v * %v = %v\n", n1[1], n2[1], nsum) // First number is 0<<32 + 2147483648 n1 = []uint32{0, 0x80000000} // Second number is 0<<32 + 2 n2 = []uint32{0, 2} // Multiply them together producing overflow. 
hi, lo = bits.Mul32(n1[1], n2[1]) nsum = []uint32{hi, lo} fmt.Printf("%v * %v = %v\n", n1[1], n2[1], nsum) ``` Output: ``` 12 * 12 = [0 144] 2147483648 * 2 = [1 0] ``` func Mul64 1.12 --------------- ``` func Mul64(x, y uint64) (hi, lo uint64) ``` Mul64 returns the 128-bit product of x and y: (hi, lo) = x \* y with the product bits' upper half returned in hi and the lower half returned in lo. This function's execution time does not depend on the inputs. #### Example Code: ``` // First number is 0<<64 + 12 n1 := []uint64{0, 12} // Second number is 0<<64 + 12 n2 := []uint64{0, 12} // Multiply them together without producing overflow. hi, lo := bits.Mul64(n1[1], n2[1]) nsum := []uint64{hi, lo} fmt.Printf("%v * %v = %v\n", n1[1], n2[1], nsum) // First number is 0<<64 + 9223372036854775808 n1 = []uint64{0, 0x8000000000000000} // Second number is 0<<64 + 2 n2 = []uint64{0, 2} // Multiply them together producing overflow. hi, lo = bits.Mul64(n1[1], n2[1]) nsum = []uint64{hi, lo} fmt.Printf("%v * %v = %v\n", n1[1], n2[1], nsum) ``` Output: ``` 12 * 12 = [0 144] 9223372036854775808 * 2 = [1 0] ``` func OnesCount 1.9 ------------------ ``` func OnesCount(x uint) int ``` OnesCount returns the number of one bits ("population count") in x. #### Example Code: ``` fmt.Printf("OnesCount(%b) = %d\n", 14, bits.OnesCount(14)) ``` Output: ``` OnesCount(1110) = 3 ``` func OnesCount16 1.9 -------------------- ``` func OnesCount16(x uint16) int ``` OnesCount16 returns the number of one bits ("population count") in x. #### Example Code: ``` fmt.Printf("OnesCount16(%016b) = %d\n", 14, bits.OnesCount16(14)) ``` Output: ``` OnesCount16(0000000000001110) = 3 ``` func OnesCount32 1.9 -------------------- ``` func OnesCount32(x uint32) int ``` OnesCount32 returns the number of one bits ("population count") in x. #### Example Code: ``` fmt.Printf("OnesCount32(%032b) = %d\n", 14, bits.OnesCount32(14)) ``` Output: ``` OnesCount32(00000000000000000000000000001110) = 3 ``` func OnesCount64 1.9 -------------------- ``` func OnesCount64(x uint64) int ``` OnesCount64 returns the number of one bits ("population count") in x. #### Example Code: ``` fmt.Printf("OnesCount64(%064b) = %d\n", 14, bits.OnesCount64(14)) ``` Output: ``` OnesCount64(0000000000000000000000000000000000000000000000000000000000001110) = 3 ``` func OnesCount8 1.9 ------------------- ``` func OnesCount8(x uint8) int ``` OnesCount8 returns the number of one bits ("population count") in x. #### Example Code: ``` fmt.Printf("OnesCount8(%08b) = %d\n", 14, bits.OnesCount8(14)) ``` Output: ``` OnesCount8(00001110) = 3 ``` func Rem 1.14 ------------- ``` func Rem(hi, lo, y uint) uint ``` Rem returns the remainder of (hi, lo) divided by y. Rem panics for y == 0 (division by zero) but, unlike Div, it doesn't panic on a quotient overflow. func Rem32 1.14 --------------- ``` func Rem32(hi, lo, y uint32) uint32 ``` Rem32 returns the remainder of (hi, lo) divided by y. Rem32 panics for y == 0 (division by zero) but, unlike Div32, it doesn't panic on a quotient overflow. func Rem64 1.14 --------------- ``` func Rem64(hi, lo, y uint64) uint64 ``` Rem64 returns the remainder of (hi, lo) divided by y. Rem64 panics for y == 0 (division by zero) but, unlike Div64, it doesn't panic on a quotient overflow. func Reverse 1.9 ---------------- ``` func Reverse(x uint) uint ``` Reverse returns the value of x with its bits in reversed order. func Reverse16 1.9 ------------------ ``` func Reverse16(x uint16) uint16 ``` Reverse16 returns the value of x with its bits in reversed order. 
#### Example Code: ``` fmt.Printf("%016b\n", 19) fmt.Printf("%016b\n", bits.Reverse16(19)) ``` Output: ``` 0000000000010011 1100100000000000 ``` func Reverse32 1.9 ------------------ ``` func Reverse32(x uint32) uint32 ``` Reverse32 returns the value of x with its bits in reversed order. #### Example Code: ``` fmt.Printf("%032b\n", 19) fmt.Printf("%032b\n", bits.Reverse32(19)) ``` Output: ``` 00000000000000000000000000010011 11001000000000000000000000000000 ``` func Reverse64 1.9 ------------------ ``` func Reverse64(x uint64) uint64 ``` Reverse64 returns the value of x with its bits in reversed order. #### Example Code: ``` fmt.Printf("%064b\n", 19) fmt.Printf("%064b\n", bits.Reverse64(19)) ``` Output: ``` 0000000000000000000000000000000000000000000000000000000000010011 1100100000000000000000000000000000000000000000000000000000000000 ``` func Reverse8 1.9 ----------------- ``` func Reverse8(x uint8) uint8 ``` Reverse8 returns the value of x with its bits in reversed order. #### Example Code: ``` fmt.Printf("%08b\n", 19) fmt.Printf("%08b\n", bits.Reverse8(19)) ``` Output: ``` 00010011 11001000 ``` func ReverseBytes 1.9 --------------------- ``` func ReverseBytes(x uint) uint ``` ReverseBytes returns the value of x with its bytes in reversed order. This function's execution time does not depend on the inputs. func ReverseBytes16 1.9 ----------------------- ``` func ReverseBytes16(x uint16) uint16 ``` ReverseBytes16 returns the value of x with its bytes in reversed order. This function's execution time does not depend on the inputs. #### Example Code: ``` fmt.Printf("%016b\n", 15) fmt.Printf("%016b\n", bits.ReverseBytes16(15)) ``` Output: ``` 0000000000001111 0000111100000000 ``` func ReverseBytes32 1.9 ----------------------- ``` func ReverseBytes32(x uint32) uint32 ``` ReverseBytes32 returns the value of x with its bytes in reversed order. This function's execution time does not depend on the inputs. #### Example Code: ``` fmt.Printf("%032b\n", 15) fmt.Printf("%032b\n", bits.ReverseBytes32(15)) ``` Output: ``` 00000000000000000000000000001111 00001111000000000000000000000000 ``` func ReverseBytes64 1.9 ----------------------- ``` func ReverseBytes64(x uint64) uint64 ``` ReverseBytes64 returns the value of x with its bytes in reversed order. This function's execution time does not depend on the inputs. #### Example Code: ``` fmt.Printf("%064b\n", 15) fmt.Printf("%064b\n", bits.ReverseBytes64(15)) ``` Output: ``` 0000000000000000000000000000000000000000000000000000000000001111 0000111100000000000000000000000000000000000000000000000000000000 ``` func RotateLeft 1.9 ------------------- ``` func RotateLeft(x uint, k int) uint ``` RotateLeft returns the value of x rotated left by (k mod UintSize) bits. To rotate x right by k bits, call RotateLeft(x, -k). This function's execution time does not depend on the inputs. func RotateLeft16 1.9 --------------------- ``` func RotateLeft16(x uint16, k int) uint16 ``` RotateLeft16 returns the value of x rotated left by (k mod 16) bits. To rotate x right by k bits, call RotateLeft16(x, -k). This function's execution time does not depend on the inputs. #### Example Code: ``` fmt.Printf("%016b\n", 15) fmt.Printf("%016b\n", bits.RotateLeft16(15, 2)) fmt.Printf("%016b\n", bits.RotateLeft16(15, -2)) ``` Output: ``` 0000000000001111 0000000000111100 1100000000000011 ``` func RotateLeft32 1.9 --------------------- ``` func RotateLeft32(x uint32, k int) uint32 ``` RotateLeft32 returns the value of x rotated left by (k mod 32) bits. 
To rotate x right by k bits, call RotateLeft32(x, -k). This function's execution time does not depend on the inputs. #### Example Code: ``` fmt.Printf("%032b\n", 15) fmt.Printf("%032b\n", bits.RotateLeft32(15, 2)) fmt.Printf("%032b\n", bits.RotateLeft32(15, -2)) ``` Output: ``` 00000000000000000000000000001111 00000000000000000000000000111100 11000000000000000000000000000011 ``` func RotateLeft64 1.9 --------------------- ``` func RotateLeft64(x uint64, k int) uint64 ``` RotateLeft64 returns the value of x rotated left by (k mod 64) bits. To rotate x right by k bits, call RotateLeft64(x, -k). This function's execution time does not depend on the inputs. #### Example Code: ``` fmt.Printf("%064b\n", 15) fmt.Printf("%064b\n", bits.RotateLeft64(15, 2)) fmt.Printf("%064b\n", bits.RotateLeft64(15, -2)) ``` Output: ``` 0000000000000000000000000000000000000000000000000000000000001111 0000000000000000000000000000000000000000000000000000000000111100 1100000000000000000000000000000000000000000000000000000000000011 ``` func RotateLeft8 1.9 -------------------- ``` func RotateLeft8(x uint8, k int) uint8 ``` RotateLeft8 returns the value of x rotated left by (k mod 8) bits. To rotate x right by k bits, call RotateLeft8(x, -k). This function's execution time does not depend on the inputs. #### Example Code: ``` fmt.Printf("%08b\n", 15) fmt.Printf("%08b\n", bits.RotateLeft8(15, 2)) fmt.Printf("%08b\n", bits.RotateLeft8(15, -2)) ``` Output: ``` 00001111 00111100 11000011 ``` func Sub 1.12 ------------- ``` func Sub(x, y, borrow uint) (diff, borrowOut uint) ``` Sub returns the difference of x, y and borrow: diff = x - y - borrow. The borrow input must be 0 or 1; otherwise the behavior is undefined. The borrowOut output is guaranteed to be 0 or 1. This function's execution time does not depend on the inputs. func Sub32 1.12 --------------- ``` func Sub32(x, y, borrow uint32) (diff, borrowOut uint32) ``` Sub32 returns the difference of x, y and borrow, diff = x - y - borrow. The borrow input must be 0 or 1; otherwise the behavior is undefined. The borrowOut output is guaranteed to be 0 or 1. This function's execution time does not depend on the inputs. #### Example Code: ``` // First number is 33<<32 + 23 n1 := []uint32{33, 23} // Second number is 21<<32 + 12 n2 := []uint32{21, 12} // Sub them together without producing carry. d1, carry := bits.Sub32(n1[1], n2[1], 0) d0, _ := bits.Sub32(n1[0], n2[0], carry) nsum := []uint32{d0, d1} fmt.Printf("%v - %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) // First number is 3<<32 + 2147483647 n1 = []uint32{3, 0x7fffffff} // Second number is 1<<32 + 2147483648 n2 = []uint32{1, 0x80000000} // Sub them together producing carry. d1, carry = bits.Sub32(n1[1], n2[1], 0) d0, _ = bits.Sub32(n1[0], n2[0], carry) nsum = []uint32{d0, d1} fmt.Printf("%v - %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) ``` Output: ``` [33 23] - [21 12] = [12 11] (carry bit was 0) [3 2147483647] - [1 2147483648] = [1 4294967295] (carry bit was 1) ``` func Sub64 1.12 --------------- ``` func Sub64(x, y, borrow uint64) (diff, borrowOut uint64) ``` Sub64 returns the difference of x, y and borrow: diff = x - y - borrow. The borrow input must be 0 or 1; otherwise the behavior is undefined. The borrowOut output is guaranteed to be 0 or 1. This function's execution time does not depend on the inputs. #### Example Code: ``` // First number is 33<<64 + 23 n1 := []uint64{33, 23} // Second number is 21<<64 + 12 n2 := []uint64{21, 12} // Sub them together without producing carry. 
d1, carry := bits.Sub64(n1[1], n2[1], 0) d0, _ := bits.Sub64(n1[0], n2[0], carry) nsum := []uint64{d0, d1} fmt.Printf("%v - %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) // First number is 3<<64 + 9223372036854775807 n1 = []uint64{3, 0x7fffffffffffffff} // Second number is 1<<64 + 9223372036854775808 n2 = []uint64{1, 0x8000000000000000} // Sub them together producing carry. d1, carry = bits.Sub64(n1[1], n2[1], 0) d0, _ = bits.Sub64(n1[0], n2[0], carry) nsum = []uint64{d0, d1} fmt.Printf("%v - %v = %v (carry bit was %v)\n", n1, n2, nsum, carry) ``` Output: ``` [33 23] - [21 12] = [12 11] (carry bit was 0) [3 9223372036854775807] - [1 9223372036854775808] = [1 18446744073709551615] (carry bit was 1) ``` func TrailingZeros 1.9 ---------------------- ``` func TrailingZeros(x uint) int ``` TrailingZeros returns the number of trailing zero bits in x; the result is UintSize for x == 0. func TrailingZeros16 1.9 ------------------------ ``` func TrailingZeros16(x uint16) int ``` TrailingZeros16 returns the number of trailing zero bits in x; the result is 16 for x == 0. #### Example Code: ``` fmt.Printf("TrailingZeros16(%016b) = %d\n", 14, bits.TrailingZeros16(14)) ``` Output: ``` TrailingZeros16(0000000000001110) = 1 ``` func TrailingZeros32 1.9 ------------------------ ``` func TrailingZeros32(x uint32) int ``` TrailingZeros32 returns the number of trailing zero bits in x; the result is 32 for x == 0. #### Example Code: ``` fmt.Printf("TrailingZeros32(%032b) = %d\n", 14, bits.TrailingZeros32(14)) ``` Output: ``` TrailingZeros32(00000000000000000000000000001110) = 1 ``` func TrailingZeros64 1.9 ------------------------ ``` func TrailingZeros64(x uint64) int ``` TrailingZeros64 returns the number of trailing zero bits in x; the result is 64 for x == 0. #### Example Code: ``` fmt.Printf("TrailingZeros64(%064b) = %d\n", 14, bits.TrailingZeros64(14)) ``` Output: ``` TrailingZeros64(0000000000000000000000000000000000000000000000000000000000001110) = 1 ``` func TrailingZeros8 1.9 ----------------------- ``` func TrailingZeros8(x uint8) int ``` TrailingZeros8 returns the number of trailing zero bits in x; the result is 8 for x == 0. #### Example Code: ``` fmt.Printf("TrailingZeros8(%08b) = %d\n", 14, bits.TrailingZeros8(14)) ``` Output: ``` TrailingZeros8(00001110) = 1 ```
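The Rem functions above are documented without examples. A minimal sketch, using arbitrarily chosen values, of how Rem64 differs from Div64 when the quotient would overflow:

```go
package main

import (
	"fmt"
	"math/bits"
)

func main() {
	// (hi, lo) = 1<<64 + 10 divided by 3. Div64 requires y > hi, which
	// holds here (3 > 1), so it returns both quotient and remainder.
	quo, rem := bits.Div64(1, 10, 3)
	fmt.Println(quo, rem) // 6148914691236517208 2

	// For (hi, lo) = 5<<64 + 10 the quotient does not fit in a uint64,
	// so Div64 would panic; Rem64 still returns the remainder.
	fmt.Println(bits.Rem64(5, 10, 3)) // 0
}
```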
go Package rand Package rand ============= * `import "math/rand"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package rand implements pseudo-random number generators unsuitable for security-sensitive work. Random numbers are generated by a [Source](#Source), usually wrapped in a [Rand](#Rand). Both types should be used by a single goroutine at a time: sharing among multiple goroutines requires some kind of synchronization. Top-level functions, such as [Float64](#Float64) and [Int](#Int), are safe for concurrent use by multiple goroutines. This package's outputs might be easily predictable regardless of how it's seeded. For random numbers suitable for security-sensitive work, see the crypto/rand package. #### Example Code: ``` answers := []string{ "It is certain", "It is decidedly so", "Without a doubt", "Yes definitely", "You may rely on it", "As I see it yes", "Most likely", "Outlook good", "Yes", "Signs point to yes", "Reply hazy try again", "Ask again later", "Better not tell you now", "Cannot predict now", "Concentrate and ask again", "Don't count on it", "My reply is no", "My sources say no", "Outlook not so good", "Very doubtful", } fmt.Println("Magic 8-Ball says:", answers[rand.Intn(len(answers))]) ``` #### Example (Rand) This example shows the use of each of the methods on a \*Rand. The use of the global functions is the same, without the receiver. Code: ``` // Create and seed the generator. // Typically a non-fixed seed should be used, such as time.Now().UnixNano(). // Using a fixed seed will produce the same output on every run. r := rand.New(rand.NewSource(99)) // The tabwriter here helps us generate aligned output. w := tabwriter.NewWriter(os.Stdout, 1, 1, 1, ' ', 0) defer w.Flush() show := func(name string, v1, v2, v3 any) { fmt.Fprintf(w, "%s\t%v\t%v\t%v\n", name, v1, v2, v3) } // Float32 and Float64 values are in [0, 1). show("Float32", r.Float32(), r.Float32(), r.Float32()) show("Float64", r.Float64(), r.Float64(), r.Float64()) // ExpFloat64 values have an average of 1 but decay exponentially. show("ExpFloat64", r.ExpFloat64(), r.ExpFloat64(), r.ExpFloat64()) // NormFloat64 values have an average of 0 and a standard deviation of 1. show("NormFloat64", r.NormFloat64(), r.NormFloat64(), r.NormFloat64()) // Int31, Int63, and Uint32 generate values of the given width. // The Int method (not shown) is like either Int31 or Int63 // depending on the size of 'int'. show("Int31", r.Int31(), r.Int31(), r.Int31()) show("Int63", r.Int63(), r.Int63(), r.Int63()) show("Uint32", r.Uint32(), r.Uint32(), r.Uint32()) // Intn, Int31n, and Int63n limit their output to be < n. // They do so more carefully than using r.Int()%n. show("Intn(10)", r.Intn(10), r.Intn(10), r.Intn(10)) show("Int31n(10)", r.Int31n(10), r.Int31n(10), r.Int31n(10)) show("Int63n(10)", r.Int63n(10), r.Int63n(10), r.Int63n(10)) // Perm generates a random permutation of the numbers [0, n). 
show("Perm", r.Perm(5), r.Perm(5), r.Perm(5)) ``` Output: ``` Float32 0.2635776 0.6358173 0.6718283 Float64 0.628605430454327 0.4504798828572669 0.9562755949377957 ExpFloat64 0.3362240648200941 1.4256072328483647 0.24354758816173044 NormFloat64 0.17233959114940064 1.577014951434847 0.04259129641113857 Int31 1501292890 1486668269 182840835 Int63 3546343826724305832 5724354148158589552 5239846799706671610 Uint32 2760229429 296659907 1922395059 Intn(10) 1 2 5 Int31n(10) 4 7 8 Int63n(10) 7 6 3 Perm [1 4 2 3 0] [4 2 1 3 0] [1 2 4 0 3] ``` Index ----- * [func ExpFloat64() float64](#ExpFloat64) * [func Float32() float32](#Float32) * [func Float64() float64](#Float64) * [func Int() int](#Int) * [func Int31() int32](#Int31) * [func Int31n(n int32) int32](#Int31n) * [func Int63() int64](#Int63) * [func Int63n(n int64) int64](#Int63n) * [func Intn(n int) int](#Intn) * [func NormFloat64() float64](#NormFloat64) * [func Perm(n int) []int](#Perm) * [func Read(p []byte) (n int, err error)](#Read) * [func Seed(seed int64)](#Seed) * [func Shuffle(n int, swap func(i, j int))](#Shuffle) * [func Uint32() uint32](#Uint32) * [func Uint64() uint64](#Uint64) * [type Rand](#Rand) * [func New(src Source) \*Rand](#New) * [func (r \*Rand) ExpFloat64() float64](#Rand.ExpFloat64) * [func (r \*Rand) Float32() float32](#Rand.Float32) * [func (r \*Rand) Float64() float64](#Rand.Float64) * [func (r \*Rand) Int() int](#Rand.Int) * [func (r \*Rand) Int31() int32](#Rand.Int31) * [func (r \*Rand) Int31n(n int32) int32](#Rand.Int31n) * [func (r \*Rand) Int63() int64](#Rand.Int63) * [func (r \*Rand) Int63n(n int64) int64](#Rand.Int63n) * [func (r \*Rand) Intn(n int) int](#Rand.Intn) * [func (r \*Rand) NormFloat64() float64](#Rand.NormFloat64) * [func (r \*Rand) Perm(n int) []int](#Rand.Perm) * [func (r \*Rand) Read(p []byte) (n int, err error)](#Rand.Read) * [func (r \*Rand) Seed(seed int64)](#Rand.Seed) * [func (r \*Rand) Shuffle(n int, swap func(i, j int))](#Rand.Shuffle) * [func (r \*Rand) Uint32() uint32](#Rand.Uint32) * [func (r \*Rand) Uint64() uint64](#Rand.Uint64) * [type Source](#Source) * [func NewSource(seed int64) Source](#NewSource) * [type Source64](#Source64) * [type Zipf](#Zipf) * [func NewZipf(r \*Rand, s float64, v float64, imax uint64) \*Zipf](#NewZipf) * [func (z \*Zipf) Uint64() uint64](#Zipf.Uint64) ### Examples [Package](#example_) [Intn](#example_Intn) [Perm](#example_Perm) [Shuffle](#example_Shuffle) [Shuffle (SlicesInUnison)](#example_Shuffle_slicesInUnison) [Package (Rand)](#example__rand) ### Package files exp.go normal.go rand.go rng.go zipf.go func ExpFloat64 --------------- ``` func ExpFloat64() float64 ``` ExpFloat64 returns an exponentially distributed float64 in the range (0, +math.MaxFloat64] with an exponential distribution whose rate parameter (lambda) is 1 and whose mean is 1/lambda (1) from the default Source. To produce a distribution with a different rate parameter, callers can adjust the output using: ``` sample = ExpFloat64() / desiredRateParameter ``` func Float32 ------------ ``` func Float32() float32 ``` Float32 returns, as a float32, a pseudo-random number in the half-open interval [0.0,1.0) from the default Source. func Float64 ------------ ``` func Float64() float64 ``` Float64 returns, as a float64, a pseudo-random number in the half-open interval [0.0,1.0) from the default Source. func Int -------- ``` func Int() int ``` Int returns a non-negative pseudo-random int from the default Source. 
func Int31 ---------- ``` func Int31() int32 ``` Int31 returns a non-negative pseudo-random 31-bit integer as an int32 from the default Source. func Int31n ----------- ``` func Int31n(n int32) int32 ``` Int31n returns, as an int32, a non-negative pseudo-random number in the half-open interval [0,n) from the default Source. It panics if n <= 0. func Int63 ---------- ``` func Int63() int64 ``` Int63 returns a non-negative pseudo-random 63-bit integer as an int64 from the default Source. func Int63n ----------- ``` func Int63n(n int64) int64 ``` Int63n returns, as an int64, a non-negative pseudo-random number in the half-open interval [0,n) from the default Source. It panics if n <= 0. func Intn --------- ``` func Intn(n int) int ``` Intn returns, as an int, a non-negative pseudo-random number in the half-open interval [0,n) from the default Source. It panics if n <= 0. #### Example Code: ``` fmt.Println(rand.Intn(100)) fmt.Println(rand.Intn(100)) fmt.Println(rand.Intn(100)) ``` func NormFloat64 ---------------- ``` func NormFloat64() float64 ``` NormFloat64 returns a normally distributed float64 in the range [-math.MaxFloat64, +math.MaxFloat64] with standard normal distribution (mean = 0, stddev = 1) from the default Source. To produce a different normal distribution, callers can adjust the output using: ``` sample = NormFloat64() * desiredStdDev + desiredMean ``` func Perm --------- ``` func Perm(n int) []int ``` Perm returns, as a slice of n ints, a pseudo-random permutation of the integers in the half-open interval [0,n) from the default Source. #### Example Code: ``` for _, value := range rand.Perm(3) { fmt.Println(value) } ``` Output: ``` 1 2 0 ``` func Read 1.6 ------------- ``` func Read(p []byte) (n int, err error) ``` Read generates len(p) random bytes from the default Source and writes them into p. It always returns len(p) and a nil error. Read, unlike the Rand.Read method, is safe for concurrent use. Deprecated: For almost all use cases, crypto/rand.Read is more appropriate. func Seed --------- ``` func Seed(seed int64) ``` Seed uses the provided seed value to initialize the default Source to a deterministic state. Seed values that have the same remainder when divided by 2³¹-1 generate the same pseudo-random sequence. Seed, unlike the Rand.Seed method, is safe for concurrent use. If Seed is not called, the generator is seeded randomly at program startup. Prior to Go 1.20, the generator was seeded like Seed(1) at program startup. To force the old behavior, call Seed(1) at program startup. Alternately, set GODEBUG=randautoseed=0 in the environment before making any calls to functions in this package. Deprecated: Programs that call Seed and then expect a specific sequence of results from the global random source (using functions such as Int) can be broken when a dependency changes how much it consumes from the global random source. To avoid such breakages, programs that need a specific result sequence should use New(NewSource(seed)) to obtain a random generator that other packages cannot access. func Shuffle 1.10 ----------------- ``` func Shuffle(n int, swap func(i, j int)) ``` Shuffle pseudo-randomizes the order of elements using the default Source. n is the number of elements. Shuffle panics if n < 0. swap swaps the elements with indexes i and j.
#### Example Code: ``` words := strings.Fields("ink runs from the corners of my mouth") rand.Shuffle(len(words), func(i, j int) { words[i], words[j] = words[j], words[i] }) fmt.Println(words) ``` #### Example (SlicesInUnison) Code: ``` numbers := []byte("12345") letters := []byte("ABCDE") // Shuffle numbers, swapping corresponding entries in letters at the same time. rand.Shuffle(len(numbers), func(i, j int) { numbers[i], numbers[j] = numbers[j], numbers[i] letters[i], letters[j] = letters[j], letters[i] }) for i := range numbers { fmt.Printf("%c: %c\n", letters[i], numbers[i]) } ``` func Uint32 ----------- ``` func Uint32() uint32 ``` Uint32 returns a pseudo-random 32-bit value as a uint32 from the default Source. func Uint64 1.8 --------------- ``` func Uint64() uint64 ``` Uint64 returns a pseudo-random 64-bit value as a uint64 from the default Source. type Rand --------- A Rand is a source of random numbers. ``` type Rand struct { // contains filtered or unexported fields } ``` ### func New ``` func New(src Source) *Rand ``` New returns a new Rand that uses random values from src to generate other random values. ### func (\*Rand) ExpFloat64 ``` func (r *Rand) ExpFloat64() float64 ``` ExpFloat64 returns an exponentially distributed float64 in the range (0, +math.MaxFloat64] with an exponential distribution whose rate parameter (lambda) is 1 and whose mean is 1/lambda (1). To produce a distribution with a different rate parameter, callers can adjust the output using: ``` sample = ExpFloat64() / desiredRateParameter ``` ### func (\*Rand) Float32 ``` func (r *Rand) Float32() float32 ``` Float32 returns, as a float32, a pseudo-random number in the half-open interval [0.0,1.0). ### func (\*Rand) Float64 ``` func (r *Rand) Float64() float64 ``` Float64 returns, as a float64, a pseudo-random number in the half-open interval [0.0,1.0). ### func (\*Rand) Int ``` func (r *Rand) Int() int ``` Int returns a non-negative pseudo-random int. ### func (\*Rand) Int31 ``` func (r *Rand) Int31() int32 ``` Int31 returns a non-negative pseudo-random 31-bit integer as an int32. ### func (\*Rand) Int31n ``` func (r *Rand) Int31n(n int32) int32 ``` Int31n returns, as an int32, a non-negative pseudo-random number in the half-open interval [0,n). It panics if n <= 0. ### func (\*Rand) Int63 ``` func (r *Rand) Int63() int64 ``` Int63 returns a non-negative pseudo-random 63-bit integer as an int64. ### func (\*Rand) Int63n ``` func (r *Rand) Int63n(n int64) int64 ``` Int63n returns, as an int64, a non-negative pseudo-random number in the half-open interval [0,n). It panics if n <= 0. ### func (\*Rand) Intn ``` func (r *Rand) Intn(n int) int ``` Intn returns, as an int, a non-negative pseudo-random number in the half-open interval [0,n). It panics if n <= 0. ### func (\*Rand) NormFloat64 ``` func (r *Rand) NormFloat64() float64 ``` NormFloat64 returns a normally distributed float64 in the range -math.MaxFloat64 through +math.MaxFloat64 inclusive, with standard normal distribution (mean = 0, stddev = 1). To produce a different normal distribution, callers can adjust the output using: ``` sample = NormFloat64() * desiredStdDev + desiredMean ``` ### func (\*Rand) Perm ``` func (r *Rand) Perm(n int) []int ``` Perm returns, as a slice of n ints, a pseudo-random permutation of the integers in the half-open interval [0,n). ### func (\*Rand) Read 1.6 ``` func (r *Rand) Read(p []byte) (n int, err error) ``` Read generates len(p) random bytes and writes them into p. It always returns len(p) and a nil error. 
Read should not be called concurrently with any other Rand method. ### func (\*Rand) Seed ``` func (r *Rand) Seed(seed int64) ``` Seed uses the provided seed value to initialize the generator to a deterministic state. Seed should not be called concurrently with any other Rand method. ### func (\*Rand) Shuffle 1.10 ``` func (r *Rand) Shuffle(n int, swap func(i, j int)) ``` Shuffle pseudo-randomizes the order of elements. n is the number of elements. Shuffle panics if n < 0. swap swaps the elements with indexes i and j. ### func (\*Rand) Uint32 ``` func (r *Rand) Uint32() uint32 ``` Uint32 returns a pseudo-random 32-bit value as a uint32. ### func (\*Rand) Uint64 1.8 ``` func (r *Rand) Uint64() uint64 ``` Uint64 returns a pseudo-random 64-bit value as a uint64. type Source ----------- A Source represents a source of uniformly-distributed pseudo-random int64 values in the range [0, 1<<63). A Source is not safe for concurrent use by multiple goroutines. ``` type Source interface { Int63() int64 Seed(seed int64) } ``` ### func NewSource ``` func NewSource(seed int64) Source ``` NewSource returns a new pseudo-random Source seeded with the given value. Unlike the default Source used by top-level functions, this source is not safe for concurrent use by multiple goroutines. The returned Source implements Source64. type Source64 1.8 ----------------- A Source64 is a Source that can also generate uniformly-distributed pseudo-random uint64 values in the range [0, 1<<64) directly. If a Rand r's underlying Source s implements Source64, then r.Uint64 returns the result of one call to s.Uint64 instead of making two calls to s.Int63. ``` type Source64 interface { Source Uint64() uint64 } ``` type Zipf --------- A Zipf generates Zipf distributed variates. ``` type Zipf struct { // contains filtered or unexported fields } ``` ### func NewZipf ``` func NewZipf(r *Rand, s float64, v float64, imax uint64) *Zipf ``` NewZipf returns a Zipf variate generator. The generator generates values k ∈ [0, imax] such that P(k) is proportional to (v + k) \*\* (-s). Requirements: s > 1 and v >= 1. ### func (\*Zipf) Uint64 ``` func (z *Zipf) Uint64() uint64 ``` Uint64 returns a value drawn from the Zipf distribution described by the Zipf object. go Package image Package image ============== * `import "image"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package image implements a basic 2-D image library. The fundamental interface is called Image. An Image contains colors, which are described in the image/color package. Values of the Image interface are created either by calling functions such as NewRGBA and NewPaletted, or by calling Decode on an io.Reader containing image data in a format such as GIF, JPEG or PNG. Decoding any particular image format requires the prior registration of a decoder function. Registration is typically automatic as a side effect of initializing that format's package so that, to decode a PNG image, it suffices to have ``` import _ "image/png" ``` in a program's main package. The \_ means to import a package purely for its initialization side effects. See "The Go image package" for more details: <https://golang.org/doc/articles/image_package.html> #### Example Code: ``` // Decode the JPEG data. 
If reading from file, create a reader with // // reader, err := os.Open("testdata/video-001.q50.420.jpeg") // if err != nil { // log.Fatal(err) // } // defer reader.Close() reader := base64.NewDecoder(base64.StdEncoding, strings.NewReader(data)) m, _, err := image.Decode(reader) if err != nil { log.Fatal(err) } bounds := m.Bounds() // Calculate a 16-bin histogram for m's red, green, blue and alpha components. // // An image's bounds do not necessarily start at (0, 0), so the two loops start // at bounds.Min.Y and bounds.Min.X. Looping over Y first and X second is more // likely to result in better memory access patterns than X first and Y second. var histogram [16][4]int for y := bounds.Min.Y; y < bounds.Max.Y; y++ { for x := bounds.Min.X; x < bounds.Max.X; x++ { r, g, b, a := m.At(x, y).RGBA() // A color's RGBA method returns values in the range [0, 65535]. // Shifting by 12 reduces this to the range [0, 15]. histogram[r>>12][0]++ histogram[g>>12][1]++ histogram[b>>12][2]++ histogram[a>>12][3]++ } } // Print the results. fmt.Printf("%-14s %6s %6s %6s %6s\n", "bin", "red", "green", "blue", "alpha") for i, x := range histogram { fmt.Printf("0x%04x-0x%04x: %6d %6d %6d %6d\n", i<<12, (i+1)<<12-1, x[0], x[1], x[2], x[3]) } ``` Output: ``` bin red green blue alpha 0x0000-0x0fff: 364 790 7242 0 0x1000-0x1fff: 645 2967 1039 0 0x2000-0x2fff: 1072 2299 979 0 0x3000-0x3fff: 820 2266 980 0 0x4000-0x4fff: 537 1305 541 0 0x5000-0x5fff: 319 962 261 0 0x6000-0x6fff: 322 375 177 0 0x7000-0x7fff: 601 279 214 0 0x8000-0x8fff: 3478 227 273 0 0x9000-0x9fff: 2260 234 329 0 0xa000-0xafff: 921 282 373 0 0xb000-0xbfff: 321 335 397 0 0xc000-0xcfff: 229 388 298 0 0xd000-0xdfff: 260 414 277 0 0xe000-0xefff: 516 428 298 0 0xf000-0xffff: 2785 1899 1772 15450 ``` #### Example (DecodeConfig) Code: ``` reader := base64.NewDecoder(base64.StdEncoding, strings.NewReader(data)) config, format, err := image.DecodeConfig(reader) if err != nil { log.Fatal(err) } fmt.Println("Width:", config.Width, "Height:", config.Height, "Format:", format) ``` Index ----- * [Variables](#pkg-variables) * [func RegisterFormat(name, magic string, decode func(io.Reader) (Image, error), decodeConfig func(io.Reader) (Config, error))](#RegisterFormat) * [type Alpha](#Alpha) * [func NewAlpha(r Rectangle) \*Alpha](#NewAlpha) * [func (p \*Alpha) AlphaAt(x, y int) color.Alpha](#Alpha.AlphaAt) * [func (p \*Alpha) At(x, y int) color.Color](#Alpha.At) * [func (p \*Alpha) Bounds() Rectangle](#Alpha.Bounds) * [func (p \*Alpha) ColorModel() color.Model](#Alpha.ColorModel) * [func (p \*Alpha) Opaque() bool](#Alpha.Opaque) * [func (p \*Alpha) PixOffset(x, y int) int](#Alpha.PixOffset) * [func (p \*Alpha) RGBA64At(x, y int) color.RGBA64](#Alpha.RGBA64At) * [func (p \*Alpha) Set(x, y int, c color.Color)](#Alpha.Set) * [func (p \*Alpha) SetAlpha(x, y int, c color.Alpha)](#Alpha.SetAlpha) * [func (p \*Alpha) SetRGBA64(x, y int, c color.RGBA64)](#Alpha.SetRGBA64) * [func (p \*Alpha) SubImage(r Rectangle) Image](#Alpha.SubImage) * [type Alpha16](#Alpha16) * [func NewAlpha16(r Rectangle) \*Alpha16](#NewAlpha16) * [func (p \*Alpha16) Alpha16At(x, y int) color.Alpha16](#Alpha16.Alpha16At) * [func (p \*Alpha16) At(x, y int) color.Color](#Alpha16.At) * [func (p \*Alpha16) Bounds() Rectangle](#Alpha16.Bounds) * [func (p \*Alpha16) ColorModel() color.Model](#Alpha16.ColorModel) * [func (p \*Alpha16) Opaque() bool](#Alpha16.Opaque) * [func (p \*Alpha16) PixOffset(x, y int) int](#Alpha16.PixOffset) * [func (p \*Alpha16) RGBA64At(x, y int) color.RGBA64](#Alpha16.RGBA64At) * 
[func (p \*Alpha16) Set(x, y int, c color.Color)](#Alpha16.Set) * [func (p \*Alpha16) SetAlpha16(x, y int, c color.Alpha16)](#Alpha16.SetAlpha16) * [func (p \*Alpha16) SetRGBA64(x, y int, c color.RGBA64)](#Alpha16.SetRGBA64) * [func (p \*Alpha16) SubImage(r Rectangle) Image](#Alpha16.SubImage) * [type CMYK](#CMYK) * [func NewCMYK(r Rectangle) \*CMYK](#NewCMYK) * [func (p \*CMYK) At(x, y int) color.Color](#CMYK.At) * [func (p \*CMYK) Bounds() Rectangle](#CMYK.Bounds) * [func (p \*CMYK) CMYKAt(x, y int) color.CMYK](#CMYK.CMYKAt) * [func (p \*CMYK) ColorModel() color.Model](#CMYK.ColorModel) * [func (p \*CMYK) Opaque() bool](#CMYK.Opaque) * [func (p \*CMYK) PixOffset(x, y int) int](#CMYK.PixOffset) * [func (p \*CMYK) RGBA64At(x, y int) color.RGBA64](#CMYK.RGBA64At) * [func (p \*CMYK) Set(x, y int, c color.Color)](#CMYK.Set) * [func (p \*CMYK) SetCMYK(x, y int, c color.CMYK)](#CMYK.SetCMYK) * [func (p \*CMYK) SetRGBA64(x, y int, c color.RGBA64)](#CMYK.SetRGBA64) * [func (p \*CMYK) SubImage(r Rectangle) Image](#CMYK.SubImage) * [type Config](#Config) * [func DecodeConfig(r io.Reader) (Config, string, error)](#DecodeConfig) * [type Gray](#Gray) * [func NewGray(r Rectangle) \*Gray](#NewGray) * [func (p \*Gray) At(x, y int) color.Color](#Gray.At) * [func (p \*Gray) Bounds() Rectangle](#Gray.Bounds) * [func (p \*Gray) ColorModel() color.Model](#Gray.ColorModel) * [func (p \*Gray) GrayAt(x, y int) color.Gray](#Gray.GrayAt) * [func (p \*Gray) Opaque() bool](#Gray.Opaque) * [func (p \*Gray) PixOffset(x, y int) int](#Gray.PixOffset) * [func (p \*Gray) RGBA64At(x, y int) color.RGBA64](#Gray.RGBA64At) * [func (p \*Gray) Set(x, y int, c color.Color)](#Gray.Set) * [func (p \*Gray) SetGray(x, y int, c color.Gray)](#Gray.SetGray) * [func (p \*Gray) SetRGBA64(x, y int, c color.RGBA64)](#Gray.SetRGBA64) * [func (p \*Gray) SubImage(r Rectangle) Image](#Gray.SubImage) * [type Gray16](#Gray16) * [func NewGray16(r Rectangle) \*Gray16](#NewGray16) * [func (p \*Gray16) At(x, y int) color.Color](#Gray16.At) * [func (p \*Gray16) Bounds() Rectangle](#Gray16.Bounds) * [func (p \*Gray16) ColorModel() color.Model](#Gray16.ColorModel) * [func (p \*Gray16) Gray16At(x, y int) color.Gray16](#Gray16.Gray16At) * [func (p \*Gray16) Opaque() bool](#Gray16.Opaque) * [func (p \*Gray16) PixOffset(x, y int) int](#Gray16.PixOffset) * [func (p \*Gray16) RGBA64At(x, y int) color.RGBA64](#Gray16.RGBA64At) * [func (p \*Gray16) Set(x, y int, c color.Color)](#Gray16.Set) * [func (p \*Gray16) SetGray16(x, y int, c color.Gray16)](#Gray16.SetGray16) * [func (p \*Gray16) SetRGBA64(x, y int, c color.RGBA64)](#Gray16.SetRGBA64) * [func (p \*Gray16) SubImage(r Rectangle) Image](#Gray16.SubImage) * [type Image](#Image) * [func Decode(r io.Reader) (Image, string, error)](#Decode) * [type NRGBA](#NRGBA) * [func NewNRGBA(r Rectangle) \*NRGBA](#NewNRGBA) * [func (p \*NRGBA) At(x, y int) color.Color](#NRGBA.At) * [func (p \*NRGBA) Bounds() Rectangle](#NRGBA.Bounds) * [func (p \*NRGBA) ColorModel() color.Model](#NRGBA.ColorModel) * [func (p \*NRGBA) NRGBAAt(x, y int) color.NRGBA](#NRGBA.NRGBAAt) * [func (p \*NRGBA) Opaque() bool](#NRGBA.Opaque) * [func (p \*NRGBA) PixOffset(x, y int) int](#NRGBA.PixOffset) * [func (p \*NRGBA) RGBA64At(x, y int) color.RGBA64](#NRGBA.RGBA64At) * [func (p \*NRGBA) Set(x, y int, c color.Color)](#NRGBA.Set) * [func (p \*NRGBA) SetNRGBA(x, y int, c color.NRGBA)](#NRGBA.SetNRGBA) * [func (p \*NRGBA) SetRGBA64(x, y int, c color.RGBA64)](#NRGBA.SetRGBA64) * [func (p \*NRGBA) SubImage(r Rectangle) Image](#NRGBA.SubImage) * [type 
NRGBA64](#NRGBA64) * [func NewNRGBA64(r Rectangle) \*NRGBA64](#NewNRGBA64) * [func (p \*NRGBA64) At(x, y int) color.Color](#NRGBA64.At) * [func (p \*NRGBA64) Bounds() Rectangle](#NRGBA64.Bounds) * [func (p \*NRGBA64) ColorModel() color.Model](#NRGBA64.ColorModel) * [func (p \*NRGBA64) NRGBA64At(x, y int) color.NRGBA64](#NRGBA64.NRGBA64At) * [func (p \*NRGBA64) Opaque() bool](#NRGBA64.Opaque) * [func (p \*NRGBA64) PixOffset(x, y int) int](#NRGBA64.PixOffset) * [func (p \*NRGBA64) RGBA64At(x, y int) color.RGBA64](#NRGBA64.RGBA64At) * [func (p \*NRGBA64) Set(x, y int, c color.Color)](#NRGBA64.Set) * [func (p \*NRGBA64) SetNRGBA64(x, y int, c color.NRGBA64)](#NRGBA64.SetNRGBA64) * [func (p \*NRGBA64) SetRGBA64(x, y int, c color.RGBA64)](#NRGBA64.SetRGBA64) * [func (p \*NRGBA64) SubImage(r Rectangle) Image](#NRGBA64.SubImage) * [type NYCbCrA](#NYCbCrA) * [func NewNYCbCrA(r Rectangle, subsampleRatio YCbCrSubsampleRatio) \*NYCbCrA](#NewNYCbCrA) * [func (p \*NYCbCrA) AOffset(x, y int) int](#NYCbCrA.AOffset) * [func (p \*NYCbCrA) At(x, y int) color.Color](#NYCbCrA.At) * [func (p \*NYCbCrA) ColorModel() color.Model](#NYCbCrA.ColorModel) * [func (p \*NYCbCrA) NYCbCrAAt(x, y int) color.NYCbCrA](#NYCbCrA.NYCbCrAAt) * [func (p \*NYCbCrA) Opaque() bool](#NYCbCrA.Opaque) * [func (p \*NYCbCrA) RGBA64At(x, y int) color.RGBA64](#NYCbCrA.RGBA64At) * [func (p \*NYCbCrA) SubImage(r Rectangle) Image](#NYCbCrA.SubImage) * [type Paletted](#Paletted) * [func NewPaletted(r Rectangle, p color.Palette) \*Paletted](#NewPaletted) * [func (p \*Paletted) At(x, y int) color.Color](#Paletted.At) * [func (p \*Paletted) Bounds() Rectangle](#Paletted.Bounds) * [func (p \*Paletted) ColorIndexAt(x, y int) uint8](#Paletted.ColorIndexAt) * [func (p \*Paletted) ColorModel() color.Model](#Paletted.ColorModel) * [func (p \*Paletted) Opaque() bool](#Paletted.Opaque) * [func (p \*Paletted) PixOffset(x, y int) int](#Paletted.PixOffset) * [func (p \*Paletted) RGBA64At(x, y int) color.RGBA64](#Paletted.RGBA64At) * [func (p \*Paletted) Set(x, y int, c color.Color)](#Paletted.Set) * [func (p \*Paletted) SetColorIndex(x, y int, index uint8)](#Paletted.SetColorIndex) * [func (p \*Paletted) SetRGBA64(x, y int, c color.RGBA64)](#Paletted.SetRGBA64) * [func (p \*Paletted) SubImage(r Rectangle) Image](#Paletted.SubImage) * [type PalettedImage](#PalettedImage) * [type Point](#Point) * [func Pt(X, Y int) Point](#Pt) * [func (p Point) Add(q Point) Point](#Point.Add) * [func (p Point) Div(k int) Point](#Point.Div) * [func (p Point) Eq(q Point) bool](#Point.Eq) * [func (p Point) In(r Rectangle) bool](#Point.In) * [func (p Point) Mod(r Rectangle) Point](#Point.Mod) * [func (p Point) Mul(k int) Point](#Point.Mul) * [func (p Point) String() string](#Point.String) * [func (p Point) Sub(q Point) Point](#Point.Sub) * [type RGBA](#RGBA) * [func NewRGBA(r Rectangle) \*RGBA](#NewRGBA) * [func (p \*RGBA) At(x, y int) color.Color](#RGBA.At) * [func (p \*RGBA) Bounds() Rectangle](#RGBA.Bounds) * [func (p \*RGBA) ColorModel() color.Model](#RGBA.ColorModel) * [func (p \*RGBA) Opaque() bool](#RGBA.Opaque) * [func (p \*RGBA) PixOffset(x, y int) int](#RGBA.PixOffset) * [func (p \*RGBA) RGBA64At(x, y int) color.RGBA64](#RGBA.RGBA64At) * [func (p \*RGBA) RGBAAt(x, y int) color.RGBA](#RGBA.RGBAAt) * [func (p \*RGBA) Set(x, y int, c color.Color)](#RGBA.Set) * [func (p \*RGBA) SetRGBA(x, y int, c color.RGBA)](#RGBA.SetRGBA) * [func (p \*RGBA) SetRGBA64(x, y int, c color.RGBA64)](#RGBA.SetRGBA64) * [func (p \*RGBA) SubImage(r Rectangle) Image](#RGBA.SubImage) * [type 
RGBA64](#RGBA64) * [func NewRGBA64(r Rectangle) \*RGBA64](#NewRGBA64) * [func (p \*RGBA64) At(x, y int) color.Color](#RGBA64.At) * [func (p \*RGBA64) Bounds() Rectangle](#RGBA64.Bounds) * [func (p \*RGBA64) ColorModel() color.Model](#RGBA64.ColorModel) * [func (p \*RGBA64) Opaque() bool](#RGBA64.Opaque) * [func (p \*RGBA64) PixOffset(x, y int) int](#RGBA64.PixOffset) * [func (p \*RGBA64) RGBA64At(x, y int) color.RGBA64](#RGBA64.RGBA64At) * [func (p \*RGBA64) Set(x, y int, c color.Color)](#RGBA64.Set) * [func (p \*RGBA64) SetRGBA64(x, y int, c color.RGBA64)](#RGBA64.SetRGBA64) * [func (p \*RGBA64) SubImage(r Rectangle) Image](#RGBA64.SubImage) * [type RGBA64Image](#RGBA64Image) * [type Rectangle](#Rectangle) * [func Rect(x0, y0, x1, y1 int) Rectangle](#Rect) * [func (r Rectangle) Add(p Point) Rectangle](#Rectangle.Add) * [func (r Rectangle) At(x, y int) color.Color](#Rectangle.At) * [func (r Rectangle) Bounds() Rectangle](#Rectangle.Bounds) * [func (r Rectangle) Canon() Rectangle](#Rectangle.Canon) * [func (r Rectangle) ColorModel() color.Model](#Rectangle.ColorModel) * [func (r Rectangle) Dx() int](#Rectangle.Dx) * [func (r Rectangle) Dy() int](#Rectangle.Dy) * [func (r Rectangle) Empty() bool](#Rectangle.Empty) * [func (r Rectangle) Eq(s Rectangle) bool](#Rectangle.Eq) * [func (r Rectangle) In(s Rectangle) bool](#Rectangle.In) * [func (r Rectangle) Inset(n int) Rectangle](#Rectangle.Inset) * [func (r Rectangle) Intersect(s Rectangle) Rectangle](#Rectangle.Intersect) * [func (r Rectangle) Overlaps(s Rectangle) bool](#Rectangle.Overlaps) * [func (r Rectangle) RGBA64At(x, y int) color.RGBA64](#Rectangle.RGBA64At) * [func (r Rectangle) Size() Point](#Rectangle.Size) * [func (r Rectangle) String() string](#Rectangle.String) * [func (r Rectangle) Sub(p Point) Rectangle](#Rectangle.Sub) * [func (r Rectangle) Union(s Rectangle) Rectangle](#Rectangle.Union) * [type Uniform](#Uniform) * [func NewUniform(c color.Color) \*Uniform](#NewUniform) * [func (c \*Uniform) At(x, y int) color.Color](#Uniform.At) * [func (c \*Uniform) Bounds() Rectangle](#Uniform.Bounds) * [func (c \*Uniform) ColorModel() color.Model](#Uniform.ColorModel) * [func (c \*Uniform) Convert(color.Color) color.Color](#Uniform.Convert) * [func (c \*Uniform) Opaque() bool](#Uniform.Opaque) * [func (c \*Uniform) RGBA() (r, g, b, a uint32)](#Uniform.RGBA) * [func (c \*Uniform) RGBA64At(x, y int) color.RGBA64](#Uniform.RGBA64At) * [type YCbCr](#YCbCr) * [func NewYCbCr(r Rectangle, subsampleRatio YCbCrSubsampleRatio) \*YCbCr](#NewYCbCr) * [func (p \*YCbCr) At(x, y int) color.Color](#YCbCr.At) * [func (p \*YCbCr) Bounds() Rectangle](#YCbCr.Bounds) * [func (p \*YCbCr) COffset(x, y int) int](#YCbCr.COffset) * [func (p \*YCbCr) ColorModel() color.Model](#YCbCr.ColorModel) * [func (p \*YCbCr) Opaque() bool](#YCbCr.Opaque) * [func (p \*YCbCr) RGBA64At(x, y int) color.RGBA64](#YCbCr.RGBA64At) * [func (p \*YCbCr) SubImage(r Rectangle) Image](#YCbCr.SubImage) * [func (p \*YCbCr) YCbCrAt(x, y int) color.YCbCr](#YCbCr.YCbCrAt) * [func (p \*YCbCr) YOffset(x, y int) int](#YCbCr.YOffset) * [type YCbCrSubsampleRatio](#YCbCrSubsampleRatio) * [func (s YCbCrSubsampleRatio) String() string](#YCbCrSubsampleRatio.String) ### Examples [Package](#example_) [Package (DecodeConfig)](#example__decodeConfig) ### Package files format.go geom.go image.go names.go ycbcr.go Variables --------- ``` var ( // Black is an opaque black uniform image. Black = NewUniform(color.Black) // White is an opaque white uniform image. 
White = NewUniform(color.White) // Transparent is a fully transparent uniform image. Transparent = NewUniform(color.Transparent) // Opaque is a fully opaque uniform image. Opaque = NewUniform(color.Opaque) ) ``` ErrFormat indicates that decoding encountered an unknown format. ``` var ErrFormat = errors.New("image: unknown format") ``` func RegisterFormat ------------------- ``` func RegisterFormat(name, magic string, decode func(io.Reader) (Image, error), decodeConfig func(io.Reader) (Config, error)) ``` RegisterFormat registers an image format for use by Decode. Name is the name of the format, like "jpeg" or "png". Magic is the magic prefix that identifies the format's encoding. The magic string can contain "?" wildcards that each match any one byte. Decode is the function that decodes the encoded image. DecodeConfig is the function that decodes just its configuration. type Alpha ---------- Alpha is an in-memory image whose At method returns color.Alpha values. ``` type Alpha struct { // Pix holds the image's pixels, as alpha values. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*1]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewAlpha ``` func NewAlpha(r Rectangle) *Alpha ``` NewAlpha returns a new Alpha image with the given bounds. ### func (\*Alpha) AlphaAt 1.4 ``` func (p *Alpha) AlphaAt(x, y int) color.Alpha ``` ### func (\*Alpha) At ``` func (p *Alpha) At(x, y int) color.Color ``` ### func (\*Alpha) Bounds ``` func (p *Alpha) Bounds() Rectangle ``` ### func (\*Alpha) ColorModel ``` func (p *Alpha) ColorModel() color.Model ``` ### func (\*Alpha) Opaque ``` func (p *Alpha) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*Alpha) PixOffset ``` func (p *Alpha) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*Alpha) RGBA64At 1.17 ``` func (p *Alpha) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*Alpha) Set ``` func (p *Alpha) Set(x, y int, c color.Color) ``` ### func (\*Alpha) SetAlpha ``` func (p *Alpha) SetAlpha(x, y int, c color.Alpha) ``` ### func (\*Alpha) SetRGBA64 1.17 ``` func (p *Alpha) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*Alpha) SubImage ``` func (p *Alpha) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type Alpha16 ------------ Alpha16 is an in-memory image whose At method returns color.Alpha16 values. ``` type Alpha16 struct { // Pix holds the image's pixels, as alpha values in big-endian format. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*2]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewAlpha16 ``` func NewAlpha16(r Rectangle) *Alpha16 ``` NewAlpha16 returns a new Alpha16 image with the given bounds. 
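The Pix/Stride layout described for Alpha and Alpha16 above, and the index computed by PixOffset, can be sketched as follows (the bounds, coordinates, and alpha value are arbitrary example choices):

```go
package main

import (
	"fmt"
	"image"
	"image/color"
)

func main() {
	// A 4x3 Alpha image whose bounds start at (10, 20), to show that
	// pixel offsets are relative to Rect.Min rather than to (0, 0).
	p := image.NewAlpha(image.Rect(10, 20, 14, 23))

	// PixOffset(x, y) is (y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*1.
	x, y := 12, 21
	off := p.PixOffset(x, y)
	fmt.Println(off == (y-p.Rect.Min.Y)*p.Stride+(x-p.Rect.Min.X)*1) // true

	// Setting a pixel stores its alpha value at that offset in Pix.
	p.SetAlpha(x, y, color.Alpha{A: 0x7f})
	fmt.Println(p.Pix[off]) // 127
}
```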
### func (\*Alpha16) Alpha16At 1.4 ``` func (p *Alpha16) Alpha16At(x, y int) color.Alpha16 ``` ### func (\*Alpha16) At ``` func (p *Alpha16) At(x, y int) color.Color ``` ### func (\*Alpha16) Bounds ``` func (p *Alpha16) Bounds() Rectangle ``` ### func (\*Alpha16) ColorModel ``` func (p *Alpha16) ColorModel() color.Model ``` ### func (\*Alpha16) Opaque ``` func (p *Alpha16) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*Alpha16) PixOffset ``` func (p *Alpha16) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*Alpha16) RGBA64At 1.17 ``` func (p *Alpha16) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*Alpha16) Set ``` func (p *Alpha16) Set(x, y int, c color.Color) ``` ### func (\*Alpha16) SetAlpha16 ``` func (p *Alpha16) SetAlpha16(x, y int, c color.Alpha16) ``` ### func (\*Alpha16) SetRGBA64 1.17 ``` func (p *Alpha16) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*Alpha16) SubImage ``` func (p *Alpha16) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type CMYK 1.5 ------------- CMYK is an in-memory image whose At method returns color.CMYK values. ``` type CMYK struct { // Pix holds the image's pixels, in C, M, Y, K order. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*4]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewCMYK 1.5 ``` func NewCMYK(r Rectangle) *CMYK ``` NewCMYK returns a new CMYK image with the given bounds. ### func (\*CMYK) At 1.5 ``` func (p *CMYK) At(x, y int) color.Color ``` ### func (\*CMYK) Bounds 1.5 ``` func (p *CMYK) Bounds() Rectangle ``` ### func (\*CMYK) CMYKAt 1.5 ``` func (p *CMYK) CMYKAt(x, y int) color.CMYK ``` ### func (\*CMYK) ColorModel 1.5 ``` func (p *CMYK) ColorModel() color.Model ``` ### func (\*CMYK) Opaque 1.5 ``` func (p *CMYK) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*CMYK) PixOffset 1.5 ``` func (p *CMYK) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*CMYK) RGBA64At 1.17 ``` func (p *CMYK) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*CMYK) Set 1.5 ``` func (p *CMYK) Set(x, y int, c color.Color) ``` ### func (\*CMYK) SetCMYK 1.5 ``` func (p *CMYK) SetCMYK(x, y int, c color.CMYK) ``` ### func (\*CMYK) SetRGBA64 1.17 ``` func (p *CMYK) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*CMYK) SubImage 1.5 ``` func (p *CMYK) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type Config ----------- Config holds an image's color model and dimensions. ``` type Config struct { ColorModel color.Model Width, Height int } ``` ### func DecodeConfig ``` func DecodeConfig(r io.Reader) (Config, string, error) ``` DecodeConfig decodes the color model and dimensions of an image that has been encoded in a registered format. The string returned is the format name used during format registration. Format registration is typically done by an init function in the codec-specific package. type Gray --------- Gray is an in-memory image whose At method returns color.Gray values. 
``` type Gray struct { // Pix holds the image's pixels, as gray values. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*1]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewGray ``` func NewGray(r Rectangle) *Gray ``` NewGray returns a new Gray image with the given bounds. ### func (\*Gray) At ``` func (p *Gray) At(x, y int) color.Color ``` ### func (\*Gray) Bounds ``` func (p *Gray) Bounds() Rectangle ``` ### func (\*Gray) ColorModel ``` func (p *Gray) ColorModel() color.Model ``` ### func (\*Gray) GrayAt 1.4 ``` func (p *Gray) GrayAt(x, y int) color.Gray ``` ### func (\*Gray) Opaque ``` func (p *Gray) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*Gray) PixOffset ``` func (p *Gray) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*Gray) RGBA64At 1.17 ``` func (p *Gray) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*Gray) Set ``` func (p *Gray) Set(x, y int, c color.Color) ``` ### func (\*Gray) SetGray ``` func (p *Gray) SetGray(x, y int, c color.Gray) ``` ### func (\*Gray) SetRGBA64 1.17 ``` func (p *Gray) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*Gray) SubImage ``` func (p *Gray) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type Gray16 ----------- Gray16 is an in-memory image whose At method returns color.Gray16 values. ``` type Gray16 struct { // Pix holds the image's pixels, as gray values in big-endian format. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*2]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewGray16 ``` func NewGray16(r Rectangle) *Gray16 ``` NewGray16 returns a new Gray16 image with the given bounds. ### func (\*Gray16) At ``` func (p *Gray16) At(x, y int) color.Color ``` ### func (\*Gray16) Bounds ``` func (p *Gray16) Bounds() Rectangle ``` ### func (\*Gray16) ColorModel ``` func (p *Gray16) ColorModel() color.Model ``` ### func (\*Gray16) Gray16At 1.4 ``` func (p *Gray16) Gray16At(x, y int) color.Gray16 ``` ### func (\*Gray16) Opaque ``` func (p *Gray16) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*Gray16) PixOffset ``` func (p *Gray16) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*Gray16) RGBA64At 1.17 ``` func (p *Gray16) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*Gray16) Set ``` func (p *Gray16) Set(x, y int, c color.Color) ``` ### func (\*Gray16) SetGray16 ``` func (p *Gray16) SetGray16(x, y int, c color.Gray16) ``` ### func (\*Gray16) SetRGBA64 1.17 ``` func (p *Gray16) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*Gray16) SubImage ``` func (p *Gray16) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type Image ---------- Image is a finite rectangular grid of color.Color values taken from a color model. ``` type Image interface { // ColorModel returns the Image's color model. 
ColorModel() color.Model // Bounds returns the domain for which At can return non-zero color. // The bounds do not necessarily contain the point (0, 0). Bounds() Rectangle // At returns the color of the pixel at (x, y). // At(Bounds().Min.X, Bounds().Min.Y) returns the upper-left pixel of the grid. // At(Bounds().Max.X-1, Bounds().Max.Y-1) returns the lower-right one. At(x, y int) color.Color } ``` ### func Decode ``` func Decode(r io.Reader) (Image, string, error) ``` Decode decodes an image that has been encoded in a registered format. The string returned is the format name used during format registration. Format registration is typically done by an init function in the codec- specific package. type NRGBA ---------- NRGBA is an in-memory image whose At method returns color.NRGBA values. ``` type NRGBA struct { // Pix holds the image's pixels, in R, G, B, A order. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*4]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewNRGBA ``` func NewNRGBA(r Rectangle) *NRGBA ``` NewNRGBA returns a new NRGBA image with the given bounds. ### func (\*NRGBA) At ``` func (p *NRGBA) At(x, y int) color.Color ``` ### func (\*NRGBA) Bounds ``` func (p *NRGBA) Bounds() Rectangle ``` ### func (\*NRGBA) ColorModel ``` func (p *NRGBA) ColorModel() color.Model ``` ### func (\*NRGBA) NRGBAAt 1.4 ``` func (p *NRGBA) NRGBAAt(x, y int) color.NRGBA ``` ### func (\*NRGBA) Opaque ``` func (p *NRGBA) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*NRGBA) PixOffset ``` func (p *NRGBA) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*NRGBA) RGBA64At 1.17 ``` func (p *NRGBA) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*NRGBA) Set ``` func (p *NRGBA) Set(x, y int, c color.Color) ``` ### func (\*NRGBA) SetNRGBA ``` func (p *NRGBA) SetNRGBA(x, y int, c color.NRGBA) ``` ### func (\*NRGBA) SetRGBA64 1.17 ``` func (p *NRGBA) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*NRGBA) SubImage ``` func (p *NRGBA) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type NRGBA64 ------------ NRGBA64 is an in-memory image whose At method returns color.NRGBA64 values. ``` type NRGBA64 struct { // Pix holds the image's pixels, in R, G, B, A order and big-endian format. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*8]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewNRGBA64 ``` func NewNRGBA64(r Rectangle) *NRGBA64 ``` NewNRGBA64 returns a new NRGBA64 image with the given bounds. ### func (\*NRGBA64) At ``` func (p *NRGBA64) At(x, y int) color.Color ``` ### func (\*NRGBA64) Bounds ``` func (p *NRGBA64) Bounds() Rectangle ``` ### func (\*NRGBA64) ColorModel ``` func (p *NRGBA64) ColorModel() color.Model ``` ### func (\*NRGBA64) NRGBA64At 1.4 ``` func (p *NRGBA64) NRGBA64At(x, y int) color.NRGBA64 ``` ### func (\*NRGBA64) Opaque ``` func (p *NRGBA64) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. 
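The non-alpha-premultiplied storage used by NRGBA and NRGBA64 is easy to confuse with the premultiplied values reported by a color.Color's RGBA method; a minimal sketch (standard library only):

```
package main

import (
	"fmt"
	"image"
	"image/color"
)

func main() {
	// NRGBA stores non-alpha-premultiplied components, but the color.Color
	// returned by At still reports alpha-premultiplied 16-bit values via RGBA.
	m := image.NewNRGBA(image.Rect(0, 0, 1, 1))
	m.SetNRGBA(0, 0, color.NRGBA{R: 0xff, G: 0x00, B: 0x00, A: 0x80})

	fmt.Println(m.NRGBAAt(0, 0)) // {255 0 0 128}: the stored, non-premultiplied pixel

	r, _, _, a := m.At(0, 0).RGBA()
	fmt.Printf("%#x %#x\n", r, a) // 0x8080 0x8080: red scaled (premultiplied) by alpha
}
```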
### func (\*NRGBA64) PixOffset ``` func (p *NRGBA64) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*NRGBA64) RGBA64At 1.17 ``` func (p *NRGBA64) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*NRGBA64) Set ``` func (p *NRGBA64) Set(x, y int, c color.Color) ``` ### func (\*NRGBA64) SetNRGBA64 ``` func (p *NRGBA64) SetNRGBA64(x, y int, c color.NRGBA64) ``` ### func (\*NRGBA64) SetRGBA64 1.17 ``` func (p *NRGBA64) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*NRGBA64) SubImage ``` func (p *NRGBA64) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type NYCbCrA 1.6 ---------------- NYCbCrA is an in-memory image of non-alpha-premultiplied Y'CbCr-with-alpha colors. A and AStride are analogous to the Y and YStride fields of the embedded YCbCr. ``` type NYCbCrA struct { YCbCr A []uint8 AStride int } ``` ### func NewNYCbCrA 1.6 ``` func NewNYCbCrA(r Rectangle, subsampleRatio YCbCrSubsampleRatio) *NYCbCrA ``` NewNYCbCrA returns a new NYCbCrA image with the given bounds and subsample ratio. ### func (\*NYCbCrA) AOffset 1.6 ``` func (p *NYCbCrA) AOffset(x, y int) int ``` AOffset returns the index of the first element of A that corresponds to the pixel at (x, y). ### func (\*NYCbCrA) At 1.6 ``` func (p *NYCbCrA) At(x, y int) color.Color ``` ### func (\*NYCbCrA) ColorModel 1.6 ``` func (p *NYCbCrA) ColorModel() color.Model ``` ### func (\*NYCbCrA) NYCbCrAAt 1.6 ``` func (p *NYCbCrA) NYCbCrAAt(x, y int) color.NYCbCrA ``` ### func (\*NYCbCrA) Opaque 1.6 ``` func (p *NYCbCrA) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*NYCbCrA) RGBA64At 1.17 ``` func (p *NYCbCrA) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*NYCbCrA) SubImage 1.6 ``` func (p *NYCbCrA) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type Paletted ------------- Paletted is an in-memory image of uint8 indices into a given palette. ``` type Paletted struct { // Pix holds the image's pixels, as palette indices. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*1]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle // Palette is the image's palette. Palette color.Palette } ``` ### func NewPaletted ``` func NewPaletted(r Rectangle, p color.Palette) *Paletted ``` NewPaletted returns a new Paletted image with the given width, height and palette. ### func (\*Paletted) At ``` func (p *Paletted) At(x, y int) color.Color ``` ### func (\*Paletted) Bounds ``` func (p *Paletted) Bounds() Rectangle ``` ### func (\*Paletted) ColorIndexAt ``` func (p *Paletted) ColorIndexAt(x, y int) uint8 ``` ### func (\*Paletted) ColorModel ``` func (p *Paletted) ColorModel() color.Model ``` ### func (\*Paletted) Opaque ``` func (p *Paletted) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*Paletted) PixOffset ``` func (p *Paletted) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). 
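A minimal sketch of the index-based storage described for Paletted (standard library only; the two-entry palette is purely illustrative):

```
package main

import (
	"fmt"
	"image"
	"image/color"
)

func main() {
	// A Paletted image stores one palette index per pixel; At looks the
	// index up in the image's Palette.
	pal := color.Palette{
		color.RGBA{0x00, 0x00, 0x00, 0xff}, // index 0: black
		color.RGBA{0xff, 0xff, 0xff, 0xff}, // index 1: white
	}
	m := image.NewPaletted(image.Rect(0, 0, 2, 2), pal)
	m.SetColorIndex(1, 1, 1)

	fmt.Println(m.ColorIndexAt(1, 1)) // 1
	fmt.Println(m.At(1, 1))           // {255 255 255 255}, i.e. pal[1]
}
```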
### func (\*Paletted) RGBA64At 1.17 ``` func (p *Paletted) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*Paletted) Set ``` func (p *Paletted) Set(x, y int, c color.Color) ``` ### func (\*Paletted) SetColorIndex ``` func (p *Paletted) SetColorIndex(x, y int, index uint8) ``` ### func (\*Paletted) SetRGBA64 1.17 ``` func (p *Paletted) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*Paletted) SubImage ``` func (p *Paletted) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type PalettedImage ------------------ PalettedImage is an image whose colors may come from a limited palette. If m is a PalettedImage and m.ColorModel() returns a color.Palette p, then m.At(x, y) should be equivalent to p[m.ColorIndexAt(x, y)]. If m's color model is not a color.Palette, then ColorIndexAt's behavior is undefined. ``` type PalettedImage interface { // ColorIndexAt returns the palette index of the pixel at (x, y). ColorIndexAt(x, y int) uint8 Image } ``` type Point ---------- A Point is an X, Y coordinate pair. The axes increase right and down. ``` type Point struct { X, Y int } ``` ZP is the zero Point. Deprecated: Use a literal image.Point{} instead. ``` var ZP Point ``` ### func Pt ``` func Pt(X, Y int) Point ``` Pt is shorthand for Point{X, Y}. ### func (Point) Add ``` func (p Point) Add(q Point) Point ``` Add returns the vector p+q. ### func (Point) Div ``` func (p Point) Div(k int) Point ``` Div returns the vector p/k. ### func (Point) Eq ``` func (p Point) Eq(q Point) bool ``` Eq reports whether p and q are equal. ### func (Point) In ``` func (p Point) In(r Rectangle) bool ``` In reports whether p is in r. ### func (Point) Mod ``` func (p Point) Mod(r Rectangle) Point ``` Mod returns the point q in r such that p.X-q.X is a multiple of r's width and p.Y-q.Y is a multiple of r's height. ### func (Point) Mul ``` func (p Point) Mul(k int) Point ``` Mul returns the vector p\*k. ### func (Point) String ``` func (p Point) String() string ``` String returns a string representation of p like "(3,4)". ### func (Point) Sub ``` func (p Point) Sub(q Point) Point ``` Sub returns the vector p-q. type RGBA --------- RGBA is an in-memory image whose At method returns color.RGBA values. ``` type RGBA struct { // Pix holds the image's pixels, in R, G, B, A order. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*4]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewRGBA ``` func NewRGBA(r Rectangle) *RGBA ``` NewRGBA returns a new RGBA image with the given bounds. ### func (\*RGBA) At ``` func (p *RGBA) At(x, y int) color.Color ``` ### func (\*RGBA) Bounds ``` func (p *RGBA) Bounds() Rectangle ``` ### func (\*RGBA) ColorModel ``` func (p *RGBA) ColorModel() color.Model ``` ### func (\*RGBA) Opaque ``` func (p *RGBA) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*RGBA) PixOffset ``` func (p *RGBA) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). 
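SubImage, documented for each concrete type in this package, shares pixels with its parent rather than copying them; a minimal sketch using RGBA (standard library only):

```
package main

import (
	"fmt"
	"image"
	"image/color"
)

func main() {
	// SubImage shares pixels with the original image, so writes through the
	// sub-image are visible in the parent and vice versa.
	m := image.NewRGBA(image.Rect(0, 0, 10, 10))
	sub := m.SubImage(image.Rect(2, 2, 6, 6)).(*image.RGBA)

	sub.SetRGBA(3, 3, color.RGBA{R: 0xff, A: 0xff})
	fmt.Println(m.RGBAAt(3, 3)) // {255 0 0 255}: the same underlying pixel
	fmt.Println(sub.Bounds())   // (2,2)-(6,6)
}
```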
### func (\*RGBA) RGBA64At 1.17 ``` func (p *RGBA) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*RGBA) RGBAAt 1.4 ``` func (p *RGBA) RGBAAt(x, y int) color.RGBA ``` ### func (\*RGBA) Set ``` func (p *RGBA) Set(x, y int, c color.Color) ``` ### func (\*RGBA) SetRGBA ``` func (p *RGBA) SetRGBA(x, y int, c color.RGBA) ``` ### func (\*RGBA) SetRGBA64 1.17 ``` func (p *RGBA) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*RGBA) SubImage ``` func (p *RGBA) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type RGBA64 ----------- RGBA64 is an in-memory image whose At method returns color.RGBA64 values. ``` type RGBA64 struct { // Pix holds the image's pixels, in R, G, B, A order and big-endian format. The pixel at // (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*8]. Pix []uint8 // Stride is the Pix stride (in bytes) between vertically adjacent pixels. Stride int // Rect is the image's bounds. Rect Rectangle } ``` ### func NewRGBA64 ``` func NewRGBA64(r Rectangle) *RGBA64 ``` NewRGBA64 returns a new RGBA64 image with the given bounds. ### func (\*RGBA64) At ``` func (p *RGBA64) At(x, y int) color.Color ``` ### func (\*RGBA64) Bounds ``` func (p *RGBA64) Bounds() Rectangle ``` ### func (\*RGBA64) ColorModel ``` func (p *RGBA64) ColorModel() color.Model ``` ### func (\*RGBA64) Opaque ``` func (p *RGBA64) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. ### func (\*RGBA64) PixOffset ``` func (p *RGBA64) PixOffset(x, y int) int ``` PixOffset returns the index of the first element of Pix that corresponds to the pixel at (x, y). ### func (\*RGBA64) RGBA64At 1.4 ``` func (p *RGBA64) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*RGBA64) Set ``` func (p *RGBA64) Set(x, y int, c color.Color) ``` ### func (\*RGBA64) SetRGBA64 ``` func (p *RGBA64) SetRGBA64(x, y int, c color.RGBA64) ``` ### func (\*RGBA64) SubImage ``` func (p *RGBA64) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. type RGBA64Image 1.17 --------------------- RGBA64Image is an Image whose pixels can be converted directly to a color.RGBA64. ``` type RGBA64Image interface { // RGBA64At returns the RGBA64 color of the pixel at (x, y). It is // equivalent to calling At(x, y).RGBA() and converting the resulting // 32-bit return values to a color.RGBA64, but it can avoid allocations // from converting concrete color types to the color.Color interface type. RGBA64At(x, y int) color.RGBA64 Image } ``` type Rectangle -------------- A Rectangle contains the points with Min.X <= X < Max.X, Min.Y <= Y < Max.Y. It is well-formed if Min.X <= Max.X and likewise for Y. Points are always well-formed. A rectangle's methods always return well-formed outputs for well-formed inputs. A Rectangle is also an Image whose bounds are the rectangle itself. At returns color.Opaque for points in the rectangle and color.Transparent otherwise. ``` type Rectangle struct { Min, Max Point } ``` ZR is the zero Rectangle. Deprecated: Use a literal image.Rectangle{} instead. ``` var ZR Rectangle ``` ### func Rect ``` func Rect(x0, y0, x1, y1 int) Rectangle ``` Rect is shorthand for Rectangle{Pt(x0, y0), Pt(x1, y1)}. The returned rectangle has minimum and maximum coordinates swapped if necessary so that it is well-formed. 
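A minimal sketch of Point and Rectangle arithmetic, including the coordinate swapping performed by Rect; Intersect, Union and the other Rectangle methods are documented just below (standard library only):

```
package main

import (
	"fmt"
	"image"
)

func main() {
	r := image.Rect(0, 0, 4, 3)
	s := image.Rect(6, 5, 2, 1) // same as image.Rect(2, 1, 6, 5): Rect swaps coordinates

	fmt.Println(r.Dx(), r.Dy())       // 4 3
	fmt.Println(r.Intersect(s))       // (2,1)-(4,3)
	fmt.Println(r.Union(s))           // (0,0)-(6,5)
	fmt.Println(image.Pt(3, 2).In(r)) // true
}
```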
### func (Rectangle) Add ``` func (r Rectangle) Add(p Point) Rectangle ``` Add returns the rectangle r translated by p. ### func (Rectangle) At 1.5 ``` func (r Rectangle) At(x, y int) color.Color ``` At implements the Image interface. ### func (Rectangle) Bounds 1.5 ``` func (r Rectangle) Bounds() Rectangle ``` Bounds implements the Image interface. ### func (Rectangle) Canon ``` func (r Rectangle) Canon() Rectangle ``` Canon returns the canonical version of r. The returned rectangle has minimum and maximum coordinates swapped if necessary so that it is well-formed. ### func (Rectangle) ColorModel 1.5 ``` func (r Rectangle) ColorModel() color.Model ``` ColorModel implements the Image interface. ### func (Rectangle) Dx ``` func (r Rectangle) Dx() int ``` Dx returns r's width. ### func (Rectangle) Dy ``` func (r Rectangle) Dy() int ``` Dy returns r's height. ### func (Rectangle) Empty ``` func (r Rectangle) Empty() bool ``` Empty reports whether the rectangle contains no points. ### func (Rectangle) Eq ``` func (r Rectangle) Eq(s Rectangle) bool ``` Eq reports whether r and s contain the same set of points. All empty rectangles are considered equal. ### func (Rectangle) In ``` func (r Rectangle) In(s Rectangle) bool ``` In reports whether every point in r is in s. ### func (Rectangle) Inset ``` func (r Rectangle) Inset(n int) Rectangle ``` Inset returns the rectangle r inset by n, which may be negative. If either of r's dimensions is less than 2\*n then an empty rectangle near the center of r will be returned. ### func (Rectangle) Intersect ``` func (r Rectangle) Intersect(s Rectangle) Rectangle ``` Intersect returns the largest rectangle contained by both r and s. If the two rectangles do not overlap then the zero rectangle will be returned. ### func (Rectangle) Overlaps ``` func (r Rectangle) Overlaps(s Rectangle) bool ``` Overlaps reports whether r and s have a non-empty intersection. ### func (Rectangle) RGBA64At 1.17 ``` func (r Rectangle) RGBA64At(x, y int) color.RGBA64 ``` RGBA64At implements the RGBA64Image interface. ### func (Rectangle) Size ``` func (r Rectangle) Size() Point ``` Size returns r's width and height. ### func (Rectangle) String ``` func (r Rectangle) String() string ``` String returns a string representation of r like "(3,4)-(6,5)". ### func (Rectangle) Sub ``` func (r Rectangle) Sub(p Point) Rectangle ``` Sub returns the rectangle r translated by -p. ### func (Rectangle) Union ``` func (r Rectangle) Union(s Rectangle) Rectangle ``` Union returns the smallest rectangle that contains both r and s. type Uniform ------------ Uniform is an infinite-sized Image of uniform color. It implements the color.Color, color.Model, and Image interfaces. ``` type Uniform struct { C color.Color } ``` ### func NewUniform ``` func NewUniform(c color.Color) *Uniform ``` NewUniform returns a new Uniform image of the given color. ### func (\*Uniform) At ``` func (c *Uniform) At(x, y int) color.Color ``` ### func (\*Uniform) Bounds ``` func (c *Uniform) Bounds() Rectangle ``` ### func (\*Uniform) ColorModel ``` func (c *Uniform) ColorModel() color.Model ``` ### func (\*Uniform) Convert ``` func (c *Uniform) Convert(color.Color) color.Color ``` ### func (\*Uniform) Opaque ``` func (c *Uniform) Opaque() bool ``` Opaque scans the entire image and reports whether it is fully opaque. 
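A minimal sketch of Uniform as an unbounded, single-color source (standard library only; its reported bounds are a very large rectangle rather than true infinity):

```
package main

import (
	"fmt"
	"image"
	"image/color"
)

func main() {
	// Every point of a Uniform yields the same color, which makes it a
	// convenient fill or mask source.
	u := image.NewUniform(color.RGBA{R: 0xff, A: 0xff})

	fmt.Println(u.At(-1000, 1000)) // {255 0 0 255}, regardless of coordinates
	fmt.Println(u.Bounds())        // a very large rectangle standing in for an infinite extent
	fmt.Println(u.Opaque())        // true, because the color's alpha is 0xff
}
```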
### func (\*Uniform) RGBA ``` func (c *Uniform) RGBA() (r, g, b, a uint32) ``` ### func (\*Uniform) RGBA64At 1.17 ``` func (c *Uniform) RGBA64At(x, y int) color.RGBA64 ``` type YCbCr ---------- YCbCr is an in-memory image of Y'CbCr colors. There is one Y sample per pixel, but each Cb and Cr sample can span one or more pixels. YStride is the Y slice index delta between vertically adjacent pixels. CStride is the Cb and Cr slice index delta between vertically adjacent pixels that map to separate chroma samples. It is not an absolute requirement, but YStride and len(Y) are typically multiples of 8, and: ``` For 4:4:4, CStride == YStride/1 && len(Cb) == len(Cr) == len(Y)/1. For 4:2:2, CStride == YStride/2 && len(Cb) == len(Cr) == len(Y)/2. For 4:2:0, CStride == YStride/2 && len(Cb) == len(Cr) == len(Y)/4. For 4:4:0, CStride == YStride/1 && len(Cb) == len(Cr) == len(Y)/2. For 4:1:1, CStride == YStride/4 && len(Cb) == len(Cr) == len(Y)/4. For 4:1:0, CStride == YStride/4 && len(Cb) == len(Cr) == len(Y)/8. ``` ``` type YCbCr struct { Y, Cb, Cr []uint8 YStride int CStride int SubsampleRatio YCbCrSubsampleRatio Rect Rectangle } ``` ### func NewYCbCr ``` func NewYCbCr(r Rectangle, subsampleRatio YCbCrSubsampleRatio) *YCbCr ``` NewYCbCr returns a new YCbCr image with the given bounds and subsample ratio. ### func (\*YCbCr) At ``` func (p *YCbCr) At(x, y int) color.Color ``` ### func (\*YCbCr) Bounds ``` func (p *YCbCr) Bounds() Rectangle ``` ### func (\*YCbCr) COffset ``` func (p *YCbCr) COffset(x, y int) int ``` COffset returns the index of the first element of Cb or Cr that corresponds to the pixel at (x, y). ### func (\*YCbCr) ColorModel ``` func (p *YCbCr) ColorModel() color.Model ``` ### func (\*YCbCr) Opaque ``` func (p *YCbCr) Opaque() bool ``` ### func (\*YCbCr) RGBA64At 1.17 ``` func (p *YCbCr) RGBA64At(x, y int) color.RGBA64 ``` ### func (\*YCbCr) SubImage ``` func (p *YCbCr) SubImage(r Rectangle) Image ``` SubImage returns an image representing the portion of the image p visible through r. The returned value shares pixels with the original image. ### func (\*YCbCr) YCbCrAt 1.4 ``` func (p *YCbCr) YCbCrAt(x, y int) color.YCbCr ``` ### func (\*YCbCr) YOffset ``` func (p *YCbCr) YOffset(x, y int) int ``` YOffset returns the index of the first element of Y that corresponds to the pixel at (x, y). type YCbCrSubsampleRatio ------------------------ YCbCrSubsampleRatio is the chroma subsample ratio used in a YCbCr image. ``` type YCbCrSubsampleRatio int ``` ``` const ( YCbCrSubsampleRatio444 YCbCrSubsampleRatio = iota YCbCrSubsampleRatio422 YCbCrSubsampleRatio420 YCbCrSubsampleRatio440 YCbCrSubsampleRatio411 YCbCrSubsampleRatio410 ) ``` ### func (YCbCrSubsampleRatio) String ``` func (s YCbCrSubsampleRatio) String() string ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [color](color/index) | Package color implements a basic color library. | | [palette](color/palette/index) | Package palette provides standard color palettes. | | [draw](draw/index) | Package draw provides image composition functions. | | [gif](gif/index) | Package gif implements a GIF image decoder and encoder. | | [jpeg](jpeg/index) | Package jpeg implements a JPEG image decoder and encoder. | | [png](png/index) | Package png implements a PNG image decoder and encoder. |
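The codec packages listed in the subdirectories register themselves with RegisterFormat from their init functions, so Decode can sniff any format whose package has been imported, typically via blank imports. A minimal sketch (the file name is hypothetical):

```
package main

import (
	"fmt"
	"image"
	"log"
	"os"

	// Imported for side effects only: each codec package's init function
	// calls image.RegisterFormat.
	_ "image/jpeg"
	_ "image/png"
)

func main() {
	f, err := os.Open("testdata/picture.png") // hypothetical input file
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	img, format, err := image.Decode(f)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(format, img.Bounds())
}
```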
go Package color Package color ============== * `import "image/color"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Subdirectories](#pkg-subdirectories) Overview -------- Package color implements a basic color library. Index ----- * [Variables](#pkg-variables) * [func CMYKToRGB(c, m, y, k uint8) (uint8, uint8, uint8)](#CMYKToRGB) * [func RGBToCMYK(r, g, b uint8) (uint8, uint8, uint8, uint8)](#RGBToCMYK) * [func RGBToYCbCr(r, g, b uint8) (uint8, uint8, uint8)](#RGBToYCbCr) * [func YCbCrToRGB(y, cb, cr uint8) (uint8, uint8, uint8)](#YCbCrToRGB) * [type Alpha](#Alpha) * [func (c Alpha) RGBA() (r, g, b, a uint32)](#Alpha.RGBA) * [type Alpha16](#Alpha16) * [func (c Alpha16) RGBA() (r, g, b, a uint32)](#Alpha16.RGBA) * [type CMYK](#CMYK) * [func (c CMYK) RGBA() (uint32, uint32, uint32, uint32)](#CMYK.RGBA) * [type Color](#Color) * [type Gray](#Gray) * [func (c Gray) RGBA() (r, g, b, a uint32)](#Gray.RGBA) * [type Gray16](#Gray16) * [func (c Gray16) RGBA() (r, g, b, a uint32)](#Gray16.RGBA) * [type Model](#Model) * [func ModelFunc(f func(Color) Color) Model](#ModelFunc) * [type NRGBA](#NRGBA) * [func (c NRGBA) RGBA() (r, g, b, a uint32)](#NRGBA.RGBA) * [type NRGBA64](#NRGBA64) * [func (c NRGBA64) RGBA() (r, g, b, a uint32)](#NRGBA64.RGBA) * [type NYCbCrA](#NYCbCrA) * [func (c NYCbCrA) RGBA() (uint32, uint32, uint32, uint32)](#NYCbCrA.RGBA) * [type Palette](#Palette) * [func (p Palette) Convert(c Color) Color](#Palette.Convert) * [func (p Palette) Index(c Color) int](#Palette.Index) * [type RGBA](#RGBA) * [func (c RGBA) RGBA() (r, g, b, a uint32)](#RGBA.RGBA) * [type RGBA64](#RGBA64) * [func (c RGBA64) RGBA() (r, g, b, a uint32)](#RGBA64.RGBA) * [type YCbCr](#YCbCr) * [func (c YCbCr) RGBA() (uint32, uint32, uint32, uint32)](#YCbCr.RGBA) ### Package files color.go ycbcr.go Variables --------- Standard colors. ``` var ( Black = Gray16{0} White = Gray16{0xffff} Transparent = Alpha16{0} Opaque = Alpha16{0xffff} ) ``` func CMYKToRGB 1.5 ------------------ ``` func CMYKToRGB(c, m, y, k uint8) (uint8, uint8, uint8) ``` CMYKToRGB converts a CMYK quadruple to an RGB triple. func RGBToCMYK 1.5 ------------------ ``` func RGBToCMYK(r, g, b uint8) (uint8, uint8, uint8, uint8) ``` RGBToCMYK converts an RGB triple to a CMYK quadruple. func RGBToYCbCr --------------- ``` func RGBToYCbCr(r, g, b uint8) (uint8, uint8, uint8) ``` RGBToYCbCr converts an RGB triple to a Y'CbCr triple. func YCbCrToRGB --------------- ``` func YCbCrToRGB(y, cb, cr uint8) (uint8, uint8, uint8) ``` YCbCrToRGB converts a Y'CbCr triple to an RGB triple. type Alpha ---------- Alpha represents an 8-bit alpha color. ``` type Alpha struct { A uint8 } ``` ### func (Alpha) RGBA ``` func (c Alpha) RGBA() (r, g, b, a uint32) ``` type Alpha16 ------------ Alpha16 represents a 16-bit alpha color. ``` type Alpha16 struct { A uint16 } ``` ### func (Alpha16) RGBA ``` func (c Alpha16) RGBA() (r, g, b, a uint32) ``` type CMYK 1.5 ------------- CMYK represents a fully opaque CMYK color, having 8 bits for each of cyan, magenta, yellow and black. It is not associated with any particular color profile. ``` type CMYK struct { C, M, Y, K uint8 } ``` ### func (CMYK) RGBA 1.5 ``` func (c CMYK) RGBA() (uint32, uint32, uint32, uint32) ``` type Color ---------- Color can convert itself to alpha-premultiplied 16-bits per channel RGBA. The conversion may be lossy. ``` type Color interface { // RGBA returns the alpha-premultiplied red, green, blue and alpha values // for the color. 
Each value ranges within [0, 0xffff], but is represented // by a uint32 so that multiplying by a blend factor up to 0xffff will not // overflow. // // An alpha-premultiplied color component c has been scaled by alpha (a), // so has valid values 0 <= c <= a. RGBA() (r, g, b, a uint32) } ``` type Gray --------- Gray represents an 8-bit grayscale color. ``` type Gray struct { Y uint8 } ``` ### func (Gray) RGBA ``` func (c Gray) RGBA() (r, g, b, a uint32) ``` type Gray16 ----------- Gray16 represents a 16-bit grayscale color. ``` type Gray16 struct { Y uint16 } ``` ### func (Gray16) RGBA ``` func (c Gray16) RGBA() (r, g, b, a uint32) ``` type Model ---------- Model can convert any Color to one from its own color model. The conversion may be lossy. ``` type Model interface { Convert(c Color) Color } ``` Models for the standard color types. ``` var ( RGBAModel Model = ModelFunc(rgbaModel) RGBA64Model Model = ModelFunc(rgba64Model) NRGBAModel Model = ModelFunc(nrgbaModel) NRGBA64Model Model = ModelFunc(nrgba64Model) AlphaModel Model = ModelFunc(alphaModel) Alpha16Model Model = ModelFunc(alpha16Model) GrayModel Model = ModelFunc(grayModel) Gray16Model Model = ModelFunc(gray16Model) ) ``` CMYKModel is the Model for CMYK colors. ``` var CMYKModel Model = ModelFunc(cmykModel) ``` NYCbCrAModel is the Model for non-alpha-premultiplied Y'CbCr-with-alpha colors. ``` var NYCbCrAModel Model = ModelFunc(nYCbCrAModel) ``` YCbCrModel is the Model for Y'CbCr colors. ``` var YCbCrModel Model = ModelFunc(yCbCrModel) ``` ### func ModelFunc ``` func ModelFunc(f func(Color) Color) Model ``` ModelFunc returns a Model that invokes f to implement the conversion. type NRGBA ---------- NRGBA represents a non-alpha-premultiplied 32-bit color. ``` type NRGBA struct { R, G, B, A uint8 } ``` ### func (NRGBA) RGBA ``` func (c NRGBA) RGBA() (r, g, b, a uint32) ``` type NRGBA64 ------------ NRGBA64 represents a non-alpha-premultiplied 64-bit color, having 16 bits for each of red, green, blue and alpha. ``` type NRGBA64 struct { R, G, B, A uint16 } ``` ### func (NRGBA64) RGBA ``` func (c NRGBA64) RGBA() (r, g, b, a uint32) ``` type NYCbCrA 1.6 ---------------- NYCbCrA represents a non-alpha-premultiplied Y'CbCr-with-alpha color, having 8 bits each for one luma, two chroma and one alpha component. ``` type NYCbCrA struct { YCbCr A uint8 } ``` ### func (NYCbCrA) RGBA 1.6 ``` func (c NYCbCrA) RGBA() (uint32, uint32, uint32, uint32) ``` type Palette ------------ Palette is a palette of colors. ``` type Palette []Color ``` ### func (Palette) Convert ``` func (p Palette) Convert(c Color) Color ``` Convert returns the palette color closest to c in Euclidean R,G,B space. ### func (Palette) Index ``` func (p Palette) Index(c Color) int ``` Index returns the index of the palette color closest to c in Euclidean R,G,B,A space. type RGBA --------- RGBA represents a traditional 32-bit alpha-premultiplied color, having 8 bits for each of red, green, blue and alpha. An alpha-premultiplied color component C has been scaled by alpha (A), so has valid values 0 <= C <= A. ``` type RGBA struct { R, G, B, A uint8 } ``` ### func (RGBA) RGBA ``` func (c RGBA) RGBA() (r, g, b, a uint32) ``` type RGBA64 ----------- RGBA64 represents a 64-bit alpha-premultiplied color, having 16 bits for each of red, green, blue and alpha. An alpha-premultiplied color component C has been scaled by alpha (A), so has valid values 0 <= C <= A. 
``` type RGBA64 struct { R, G, B, A uint16 } ``` ### func (RGBA64) RGBA ``` func (c RGBA64) RGBA() (r, g, b, a uint32) ``` type YCbCr ---------- YCbCr represents a fully opaque 24-bit Y'CbCr color, having 8 bits each for one luma and two chroma components. JPEG, VP8, the MPEG family and other codecs use this color model. Such codecs often use the terms YUV and Y'CbCr interchangeably, but strictly speaking, the term YUV applies only to analog video signals, and Y' (luma) is Y (luminance) after applying gamma correction. Conversion between RGB and Y'CbCr is lossy and there are multiple, slightly different formulae for converting between the two. This package follows the JFIF specification at <https://www.w3.org/Graphics/JPEG/jfif3.pdf>. ``` type YCbCr struct { Y, Cb, Cr uint8 } ``` ### func (YCbCr) RGBA ``` func (c YCbCr) RGBA() (uint32, uint32, uint32, uint32) ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [palette](palette/index) | Package palette provides standard color palettes. | go Package palette Package palette ================ * `import "image/color/palette"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package palette provides standard color palettes. Index ----- * [Variables](#pkg-variables) ### Package files generate.go palette.go Variables --------- Plan9 is a 256-color palette that partitions the 24-bit RGB space into 4×4×4 subdivision, with 4 shades in each subcube. Compared to the WebSafe, the idea is to reduce the color resolution by dicing the color cube into fewer cells, and to use the extra space to increase the intensity resolution. This results in 16 gray shades (4 gray subcubes with 4 samples in each), 13 shades of each primary and secondary color (3 subcubes with 4 samples plus black) and a reasonable selection of colors covering the rest of the color cube. The advantage is better representation of continuous tones. 
This palette was used in the Plan 9 Operating System, described at <https://9p.io/magic/man2html/6/color> ``` var Plan9 = []color.Color{ color.RGBA{0x00, 0x00, 0x00, 0xff}, color.RGBA{0x00, 0x00, 0x44, 0xff}, color.RGBA{0x00, 0x00, 0x88, 0xff}, color.RGBA{0x00, 0x00, 0xcc, 0xff}, color.RGBA{0x00, 0x44, 0x00, 0xff}, color.RGBA{0x00, 0x44, 0x44, 0xff}, color.RGBA{0x00, 0x44, 0x88, 0xff}, color.RGBA{0x00, 0x44, 0xcc, 0xff}, color.RGBA{0x00, 0x88, 0x00, 0xff}, color.RGBA{0x00, 0x88, 0x44, 0xff}, color.RGBA{0x00, 0x88, 0x88, 0xff}, color.RGBA{0x00, 0x88, 0xcc, 0xff}, color.RGBA{0x00, 0xcc, 0x00, 0xff}, color.RGBA{0x00, 0xcc, 0x44, 0xff}, color.RGBA{0x00, 0xcc, 0x88, 0xff}, color.RGBA{0x00, 0xcc, 0xcc, 0xff}, color.RGBA{0x00, 0xdd, 0xdd, 0xff}, color.RGBA{0x11, 0x11, 0x11, 0xff}, color.RGBA{0x00, 0x00, 0x55, 0xff}, color.RGBA{0x00, 0x00, 0x99, 0xff}, color.RGBA{0x00, 0x00, 0xdd, 0xff}, color.RGBA{0x00, 0x55, 0x00, 0xff}, color.RGBA{0x00, 0x55, 0x55, 0xff}, color.RGBA{0x00, 0x4c, 0x99, 0xff}, color.RGBA{0x00, 0x49, 0xdd, 0xff}, color.RGBA{0x00, 0x99, 0x00, 0xff}, color.RGBA{0x00, 0x99, 0x4c, 0xff}, color.RGBA{0x00, 0x99, 0x99, 0xff}, color.RGBA{0x00, 0x93, 0xdd, 0xff}, color.RGBA{0x00, 0xdd, 0x00, 0xff}, color.RGBA{0x00, 0xdd, 0x49, 0xff}, color.RGBA{0x00, 0xdd, 0x93, 0xff}, color.RGBA{0x00, 0xee, 0x9e, 0xff}, color.RGBA{0x00, 0xee, 0xee, 0xff}, color.RGBA{0x22, 0x22, 0x22, 0xff}, color.RGBA{0x00, 0x00, 0x66, 0xff}, color.RGBA{0x00, 0x00, 0xaa, 0xff}, color.RGBA{0x00, 0x00, 0xee, 0xff}, color.RGBA{0x00, 0x66, 0x00, 0xff}, color.RGBA{0x00, 0x66, 0x66, 0xff}, color.RGBA{0x00, 0x55, 0xaa, 0xff}, color.RGBA{0x00, 0x4f, 0xee, 0xff}, color.RGBA{0x00, 0xaa, 0x00, 0xff}, color.RGBA{0x00, 0xaa, 0x55, 0xff}, color.RGBA{0x00, 0xaa, 0xaa, 0xff}, color.RGBA{0x00, 0x9e, 0xee, 0xff}, color.RGBA{0x00, 0xee, 0x00, 0xff}, color.RGBA{0x00, 0xee, 0x4f, 0xff}, color.RGBA{0x00, 0xff, 0x55, 0xff}, color.RGBA{0x00, 0xff, 0xaa, 0xff}, color.RGBA{0x00, 0xff, 0xff, 0xff}, color.RGBA{0x33, 0x33, 0x33, 0xff}, color.RGBA{0x00, 0x00, 0x77, 0xff}, color.RGBA{0x00, 0x00, 0xbb, 0xff}, color.RGBA{0x00, 0x00, 0xff, 0xff}, color.RGBA{0x00, 0x77, 0x00, 0xff}, color.RGBA{0x00, 0x77, 0x77, 0xff}, color.RGBA{0x00, 0x5d, 0xbb, 0xff}, color.RGBA{0x00, 0x55, 0xff, 0xff}, color.RGBA{0x00, 0xbb, 0x00, 0xff}, color.RGBA{0x00, 0xbb, 0x5d, 0xff}, color.RGBA{0x00, 0xbb, 0xbb, 0xff}, color.RGBA{0x00, 0xaa, 0xff, 0xff}, color.RGBA{0x00, 0xff, 0x00, 0xff}, color.RGBA{0x44, 0x00, 0x44, 0xff}, color.RGBA{0x44, 0x00, 0x88, 0xff}, color.RGBA{0x44, 0x00, 0xcc, 0xff}, color.RGBA{0x44, 0x44, 0x00, 0xff}, color.RGBA{0x44, 0x44, 0x44, 0xff}, color.RGBA{0x44, 0x44, 0x88, 0xff}, color.RGBA{0x44, 0x44, 0xcc, 0xff}, color.RGBA{0x44, 0x88, 0x00, 0xff}, color.RGBA{0x44, 0x88, 0x44, 0xff}, color.RGBA{0x44, 0x88, 0x88, 0xff}, color.RGBA{0x44, 0x88, 0xcc, 0xff}, color.RGBA{0x44, 0xcc, 0x00, 0xff}, color.RGBA{0x44, 0xcc, 0x44, 0xff}, color.RGBA{0x44, 0xcc, 0x88, 0xff}, color.RGBA{0x44, 0xcc, 0xcc, 0xff}, color.RGBA{0x44, 0x00, 0x00, 0xff}, color.RGBA{0x55, 0x00, 0x00, 0xff}, color.RGBA{0x55, 0x00, 0x55, 0xff}, color.RGBA{0x4c, 0x00, 0x99, 0xff}, color.RGBA{0x49, 0x00, 0xdd, 0xff}, color.RGBA{0x55, 0x55, 0x00, 0xff}, color.RGBA{0x55, 0x55, 0x55, 0xff}, color.RGBA{0x4c, 0x4c, 0x99, 0xff}, color.RGBA{0x49, 0x49, 0xdd, 0xff}, color.RGBA{0x4c, 0x99, 0x00, 0xff}, color.RGBA{0x4c, 0x99, 0x4c, 0xff}, color.RGBA{0x4c, 0x99, 0x99, 0xff}, color.RGBA{0x49, 0x93, 0xdd, 0xff}, color.RGBA{0x49, 0xdd, 0x00, 0xff}, color.RGBA{0x49, 0xdd, 0x49, 0xff}, color.RGBA{0x49, 0xdd, 0x93, 
0xff}, color.RGBA{0x49, 0xdd, 0xdd, 0xff}, color.RGBA{0x4f, 0xee, 0xee, 0xff}, color.RGBA{0x66, 0x00, 0x00, 0xff}, color.RGBA{0x66, 0x00, 0x66, 0xff}, color.RGBA{0x55, 0x00, 0xaa, 0xff}, color.RGBA{0x4f, 0x00, 0xee, 0xff}, color.RGBA{0x66, 0x66, 0x00, 0xff}, color.RGBA{0x66, 0x66, 0x66, 0xff}, color.RGBA{0x55, 0x55, 0xaa, 0xff}, color.RGBA{0x4f, 0x4f, 0xee, 0xff}, color.RGBA{0x55, 0xaa, 0x00, 0xff}, color.RGBA{0x55, 0xaa, 0x55, 0xff}, color.RGBA{0x55, 0xaa, 0xaa, 0xff}, color.RGBA{0x4f, 0x9e, 0xee, 0xff}, color.RGBA{0x4f, 0xee, 0x00, 0xff}, color.RGBA{0x4f, 0xee, 0x4f, 0xff}, color.RGBA{0x4f, 0xee, 0x9e, 0xff}, color.RGBA{0x55, 0xff, 0xaa, 0xff}, color.RGBA{0x55, 0xff, 0xff, 0xff}, color.RGBA{0x77, 0x00, 0x00, 0xff}, color.RGBA{0x77, 0x00, 0x77, 0xff}, color.RGBA{0x5d, 0x00, 0xbb, 0xff}, color.RGBA{0x55, 0x00, 0xff, 0xff}, color.RGBA{0x77, 0x77, 0x00, 0xff}, color.RGBA{0x77, 0x77, 0x77, 0xff}, color.RGBA{0x5d, 0x5d, 0xbb, 0xff}, color.RGBA{0x55, 0x55, 0xff, 0xff}, color.RGBA{0x5d, 0xbb, 0x00, 0xff}, color.RGBA{0x5d, 0xbb, 0x5d, 0xff}, color.RGBA{0x5d, 0xbb, 0xbb, 0xff}, color.RGBA{0x55, 0xaa, 0xff, 0xff}, color.RGBA{0x55, 0xff, 0x00, 0xff}, color.RGBA{0x55, 0xff, 0x55, 0xff}, color.RGBA{0x88, 0x00, 0x88, 0xff}, color.RGBA{0x88, 0x00, 0xcc, 0xff}, color.RGBA{0x88, 0x44, 0x00, 0xff}, color.RGBA{0x88, 0x44, 0x44, 0xff}, color.RGBA{0x88, 0x44, 0x88, 0xff}, color.RGBA{0x88, 0x44, 0xcc, 0xff}, color.RGBA{0x88, 0x88, 0x00, 0xff}, color.RGBA{0x88, 0x88, 0x44, 0xff}, color.RGBA{0x88, 0x88, 0x88, 0xff}, color.RGBA{0x88, 0x88, 0xcc, 0xff}, color.RGBA{0x88, 0xcc, 0x00, 0xff}, color.RGBA{0x88, 0xcc, 0x44, 0xff}, color.RGBA{0x88, 0xcc, 0x88, 0xff}, color.RGBA{0x88, 0xcc, 0xcc, 0xff}, color.RGBA{0x88, 0x00, 0x00, 0xff}, color.RGBA{0x88, 0x00, 0x44, 0xff}, color.RGBA{0x99, 0x00, 0x4c, 0xff}, color.RGBA{0x99, 0x00, 0x99, 0xff}, color.RGBA{0x93, 0x00, 0xdd, 0xff}, color.RGBA{0x99, 0x4c, 0x00, 0xff}, color.RGBA{0x99, 0x4c, 0x4c, 0xff}, color.RGBA{0x99, 0x4c, 0x99, 0xff}, color.RGBA{0x93, 0x49, 0xdd, 0xff}, color.RGBA{0x99, 0x99, 0x00, 0xff}, color.RGBA{0x99, 0x99, 0x4c, 0xff}, color.RGBA{0x99, 0x99, 0x99, 0xff}, color.RGBA{0x93, 0x93, 0xdd, 0xff}, color.RGBA{0x93, 0xdd, 0x00, 0xff}, color.RGBA{0x93, 0xdd, 0x49, 0xff}, color.RGBA{0x93, 0xdd, 0x93, 0xff}, color.RGBA{0x93, 0xdd, 0xdd, 0xff}, color.RGBA{0x99, 0x00, 0x00, 0xff}, color.RGBA{0xaa, 0x00, 0x00, 0xff}, color.RGBA{0xaa, 0x00, 0x55, 0xff}, color.RGBA{0xaa, 0x00, 0xaa, 0xff}, color.RGBA{0x9e, 0x00, 0xee, 0xff}, color.RGBA{0xaa, 0x55, 0x00, 0xff}, color.RGBA{0xaa, 0x55, 0x55, 0xff}, color.RGBA{0xaa, 0x55, 0xaa, 0xff}, color.RGBA{0x9e, 0x4f, 0xee, 0xff}, color.RGBA{0xaa, 0xaa, 0x00, 0xff}, color.RGBA{0xaa, 0xaa, 0x55, 0xff}, color.RGBA{0xaa, 0xaa, 0xaa, 0xff}, color.RGBA{0x9e, 0x9e, 0xee, 0xff}, color.RGBA{0x9e, 0xee, 0x00, 0xff}, color.RGBA{0x9e, 0xee, 0x4f, 0xff}, color.RGBA{0x9e, 0xee, 0x9e, 0xff}, color.RGBA{0x9e, 0xee, 0xee, 0xff}, color.RGBA{0xaa, 0xff, 0xff, 0xff}, color.RGBA{0xbb, 0x00, 0x00, 0xff}, color.RGBA{0xbb, 0x00, 0x5d, 0xff}, color.RGBA{0xbb, 0x00, 0xbb, 0xff}, color.RGBA{0xaa, 0x00, 0xff, 0xff}, color.RGBA{0xbb, 0x5d, 0x00, 0xff}, color.RGBA{0xbb, 0x5d, 0x5d, 0xff}, color.RGBA{0xbb, 0x5d, 0xbb, 0xff}, color.RGBA{0xaa, 0x55, 0xff, 0xff}, color.RGBA{0xbb, 0xbb, 0x00, 0xff}, color.RGBA{0xbb, 0xbb, 0x5d, 0xff}, color.RGBA{0xbb, 0xbb, 0xbb, 0xff}, color.RGBA{0xaa, 0xaa, 0xff, 0xff}, color.RGBA{0xaa, 0xff, 0x00, 0xff}, color.RGBA{0xaa, 0xff, 0x55, 0xff}, color.RGBA{0xaa, 0xff, 0xaa, 0xff}, color.RGBA{0xcc, 0x00, 0xcc, 0xff}, color.RGBA{0xcc, 
0x44, 0x00, 0xff}, color.RGBA{0xcc, 0x44, 0x44, 0xff}, color.RGBA{0xcc, 0x44, 0x88, 0xff}, color.RGBA{0xcc, 0x44, 0xcc, 0xff}, color.RGBA{0xcc, 0x88, 0x00, 0xff}, color.RGBA{0xcc, 0x88, 0x44, 0xff}, color.RGBA{0xcc, 0x88, 0x88, 0xff}, color.RGBA{0xcc, 0x88, 0xcc, 0xff}, color.RGBA{0xcc, 0xcc, 0x00, 0xff}, color.RGBA{0xcc, 0xcc, 0x44, 0xff}, color.RGBA{0xcc, 0xcc, 0x88, 0xff}, color.RGBA{0xcc, 0xcc, 0xcc, 0xff}, color.RGBA{0xcc, 0x00, 0x00, 0xff}, color.RGBA{0xcc, 0x00, 0x44, 0xff}, color.RGBA{0xcc, 0x00, 0x88, 0xff}, color.RGBA{0xdd, 0x00, 0x93, 0xff}, color.RGBA{0xdd, 0x00, 0xdd, 0xff}, color.RGBA{0xdd, 0x49, 0x00, 0xff}, color.RGBA{0xdd, 0x49, 0x49, 0xff}, color.RGBA{0xdd, 0x49, 0x93, 0xff}, color.RGBA{0xdd, 0x49, 0xdd, 0xff}, color.RGBA{0xdd, 0x93, 0x00, 0xff}, color.RGBA{0xdd, 0x93, 0x49, 0xff}, color.RGBA{0xdd, 0x93, 0x93, 0xff}, color.RGBA{0xdd, 0x93, 0xdd, 0xff}, color.RGBA{0xdd, 0xdd, 0x00, 0xff}, color.RGBA{0xdd, 0xdd, 0x49, 0xff}, color.RGBA{0xdd, 0xdd, 0x93, 0xff}, color.RGBA{0xdd, 0xdd, 0xdd, 0xff}, color.RGBA{0xdd, 0x00, 0x00, 0xff}, color.RGBA{0xdd, 0x00, 0x49, 0xff}, color.RGBA{0xee, 0x00, 0x4f, 0xff}, color.RGBA{0xee, 0x00, 0x9e, 0xff}, color.RGBA{0xee, 0x00, 0xee, 0xff}, color.RGBA{0xee, 0x4f, 0x00, 0xff}, color.RGBA{0xee, 0x4f, 0x4f, 0xff}, color.RGBA{0xee, 0x4f, 0x9e, 0xff}, color.RGBA{0xee, 0x4f, 0xee, 0xff}, color.RGBA{0xee, 0x9e, 0x00, 0xff}, color.RGBA{0xee, 0x9e, 0x4f, 0xff}, color.RGBA{0xee, 0x9e, 0x9e, 0xff}, color.RGBA{0xee, 0x9e, 0xee, 0xff}, color.RGBA{0xee, 0xee, 0x00, 0xff}, color.RGBA{0xee, 0xee, 0x4f, 0xff}, color.RGBA{0xee, 0xee, 0x9e, 0xff}, color.RGBA{0xee, 0xee, 0xee, 0xff}, color.RGBA{0xee, 0x00, 0x00, 0xff}, color.RGBA{0xff, 0x00, 0x00, 0xff}, color.RGBA{0xff, 0x00, 0x55, 0xff}, color.RGBA{0xff, 0x00, 0xaa, 0xff}, color.RGBA{0xff, 0x00, 0xff, 0xff}, color.RGBA{0xff, 0x55, 0x00, 0xff}, color.RGBA{0xff, 0x55, 0x55, 0xff}, color.RGBA{0xff, 0x55, 0xaa, 0xff}, color.RGBA{0xff, 0x55, 0xff, 0xff}, color.RGBA{0xff, 0xaa, 0x00, 0xff}, color.RGBA{0xff, 0xaa, 0x55, 0xff}, color.RGBA{0xff, 0xaa, 0xaa, 0xff}, color.RGBA{0xff, 0xaa, 0xff, 0xff}, color.RGBA{0xff, 0xff, 0x00, 0xff}, color.RGBA{0xff, 0xff, 0x55, 0xff}, color.RGBA{0xff, 0xff, 0xaa, 0xff}, color.RGBA{0xff, 0xff, 0xff, 0xff}, } ``` WebSafe is a 216-color palette that was popularized by early versions of Netscape Navigator. It is also known as the Netscape Color Cube. See <https://en.wikipedia.org/wiki/Web_colors#Web-safe_colors> for details. 
``` var WebSafe = []color.Color{ color.RGBA{0x00, 0x00, 0x00, 0xff}, color.RGBA{0x00, 0x00, 0x33, 0xff}, color.RGBA{0x00, 0x00, 0x66, 0xff}, color.RGBA{0x00, 0x00, 0x99, 0xff}, color.RGBA{0x00, 0x00, 0xcc, 0xff}, color.RGBA{0x00, 0x00, 0xff, 0xff}, color.RGBA{0x00, 0x33, 0x00, 0xff}, color.RGBA{0x00, 0x33, 0x33, 0xff}, color.RGBA{0x00, 0x33, 0x66, 0xff}, color.RGBA{0x00, 0x33, 0x99, 0xff}, color.RGBA{0x00, 0x33, 0xcc, 0xff}, color.RGBA{0x00, 0x33, 0xff, 0xff}, color.RGBA{0x00, 0x66, 0x00, 0xff}, color.RGBA{0x00, 0x66, 0x33, 0xff}, color.RGBA{0x00, 0x66, 0x66, 0xff}, color.RGBA{0x00, 0x66, 0x99, 0xff}, color.RGBA{0x00, 0x66, 0xcc, 0xff}, color.RGBA{0x00, 0x66, 0xff, 0xff}, color.RGBA{0x00, 0x99, 0x00, 0xff}, color.RGBA{0x00, 0x99, 0x33, 0xff}, color.RGBA{0x00, 0x99, 0x66, 0xff}, color.RGBA{0x00, 0x99, 0x99, 0xff}, color.RGBA{0x00, 0x99, 0xcc, 0xff}, color.RGBA{0x00, 0x99, 0xff, 0xff}, color.RGBA{0x00, 0xcc, 0x00, 0xff}, color.RGBA{0x00, 0xcc, 0x33, 0xff}, color.RGBA{0x00, 0xcc, 0x66, 0xff}, color.RGBA{0x00, 0xcc, 0x99, 0xff}, color.RGBA{0x00, 0xcc, 0xcc, 0xff}, color.RGBA{0x00, 0xcc, 0xff, 0xff}, color.RGBA{0x00, 0xff, 0x00, 0xff}, color.RGBA{0x00, 0xff, 0x33, 0xff}, color.RGBA{0x00, 0xff, 0x66, 0xff}, color.RGBA{0x00, 0xff, 0x99, 0xff}, color.RGBA{0x00, 0xff, 0xcc, 0xff}, color.RGBA{0x00, 0xff, 0xff, 0xff}, color.RGBA{0x33, 0x00, 0x00, 0xff}, color.RGBA{0x33, 0x00, 0x33, 0xff}, color.RGBA{0x33, 0x00, 0x66, 0xff}, color.RGBA{0x33, 0x00, 0x99, 0xff}, color.RGBA{0x33, 0x00, 0xcc, 0xff}, color.RGBA{0x33, 0x00, 0xff, 0xff}, color.RGBA{0x33, 0x33, 0x00, 0xff}, color.RGBA{0x33, 0x33, 0x33, 0xff}, color.RGBA{0x33, 0x33, 0x66, 0xff}, color.RGBA{0x33, 0x33, 0x99, 0xff}, color.RGBA{0x33, 0x33, 0xcc, 0xff}, color.RGBA{0x33, 0x33, 0xff, 0xff}, color.RGBA{0x33, 0x66, 0x00, 0xff}, color.RGBA{0x33, 0x66, 0x33, 0xff}, color.RGBA{0x33, 0x66, 0x66, 0xff}, color.RGBA{0x33, 0x66, 0x99, 0xff}, color.RGBA{0x33, 0x66, 0xcc, 0xff}, color.RGBA{0x33, 0x66, 0xff, 0xff}, color.RGBA{0x33, 0x99, 0x00, 0xff}, color.RGBA{0x33, 0x99, 0x33, 0xff}, color.RGBA{0x33, 0x99, 0x66, 0xff}, color.RGBA{0x33, 0x99, 0x99, 0xff}, color.RGBA{0x33, 0x99, 0xcc, 0xff}, color.RGBA{0x33, 0x99, 0xff, 0xff}, color.RGBA{0x33, 0xcc, 0x00, 0xff}, color.RGBA{0x33, 0xcc, 0x33, 0xff}, color.RGBA{0x33, 0xcc, 0x66, 0xff}, color.RGBA{0x33, 0xcc, 0x99, 0xff}, color.RGBA{0x33, 0xcc, 0xcc, 0xff}, color.RGBA{0x33, 0xcc, 0xff, 0xff}, color.RGBA{0x33, 0xff, 0x00, 0xff}, color.RGBA{0x33, 0xff, 0x33, 0xff}, color.RGBA{0x33, 0xff, 0x66, 0xff}, color.RGBA{0x33, 0xff, 0x99, 0xff}, color.RGBA{0x33, 0xff, 0xcc, 0xff}, color.RGBA{0x33, 0xff, 0xff, 0xff}, color.RGBA{0x66, 0x00, 0x00, 0xff}, color.RGBA{0x66, 0x00, 0x33, 0xff}, color.RGBA{0x66, 0x00, 0x66, 0xff}, color.RGBA{0x66, 0x00, 0x99, 0xff}, color.RGBA{0x66, 0x00, 0xcc, 0xff}, color.RGBA{0x66, 0x00, 0xff, 0xff}, color.RGBA{0x66, 0x33, 0x00, 0xff}, color.RGBA{0x66, 0x33, 0x33, 0xff}, color.RGBA{0x66, 0x33, 0x66, 0xff}, color.RGBA{0x66, 0x33, 0x99, 0xff}, color.RGBA{0x66, 0x33, 0xcc, 0xff}, color.RGBA{0x66, 0x33, 0xff, 0xff}, color.RGBA{0x66, 0x66, 0x00, 0xff}, color.RGBA{0x66, 0x66, 0x33, 0xff}, color.RGBA{0x66, 0x66, 0x66, 0xff}, color.RGBA{0x66, 0x66, 0x99, 0xff}, color.RGBA{0x66, 0x66, 0xcc, 0xff}, color.RGBA{0x66, 0x66, 0xff, 0xff}, color.RGBA{0x66, 0x99, 0x00, 0xff}, color.RGBA{0x66, 0x99, 0x33, 0xff}, color.RGBA{0x66, 0x99, 0x66, 0xff}, color.RGBA{0x66, 0x99, 0x99, 0xff}, color.RGBA{0x66, 0x99, 0xcc, 0xff}, color.RGBA{0x66, 0x99, 0xff, 0xff}, color.RGBA{0x66, 0xcc, 0x00, 0xff}, color.RGBA{0x66, 0xcc, 0x33, 
0xff}, color.RGBA{0x66, 0xcc, 0x66, 0xff}, color.RGBA{0x66, 0xcc, 0x99, 0xff}, color.RGBA{0x66, 0xcc, 0xcc, 0xff}, color.RGBA{0x66, 0xcc, 0xff, 0xff}, color.RGBA{0x66, 0xff, 0x00, 0xff}, color.RGBA{0x66, 0xff, 0x33, 0xff}, color.RGBA{0x66, 0xff, 0x66, 0xff}, color.RGBA{0x66, 0xff, 0x99, 0xff}, color.RGBA{0x66, 0xff, 0xcc, 0xff}, color.RGBA{0x66, 0xff, 0xff, 0xff}, color.RGBA{0x99, 0x00, 0x00, 0xff}, color.RGBA{0x99, 0x00, 0x33, 0xff}, color.RGBA{0x99, 0x00, 0x66, 0xff}, color.RGBA{0x99, 0x00, 0x99, 0xff}, color.RGBA{0x99, 0x00, 0xcc, 0xff}, color.RGBA{0x99, 0x00, 0xff, 0xff}, color.RGBA{0x99, 0x33, 0x00, 0xff}, color.RGBA{0x99, 0x33, 0x33, 0xff}, color.RGBA{0x99, 0x33, 0x66, 0xff}, color.RGBA{0x99, 0x33, 0x99, 0xff}, color.RGBA{0x99, 0x33, 0xcc, 0xff}, color.RGBA{0x99, 0x33, 0xff, 0xff}, color.RGBA{0x99, 0x66, 0x00, 0xff}, color.RGBA{0x99, 0x66, 0x33, 0xff}, color.RGBA{0x99, 0x66, 0x66, 0xff}, color.RGBA{0x99, 0x66, 0x99, 0xff}, color.RGBA{0x99, 0x66, 0xcc, 0xff}, color.RGBA{0x99, 0x66, 0xff, 0xff}, color.RGBA{0x99, 0x99, 0x00, 0xff}, color.RGBA{0x99, 0x99, 0x33, 0xff}, color.RGBA{0x99, 0x99, 0x66, 0xff}, color.RGBA{0x99, 0x99, 0x99, 0xff}, color.RGBA{0x99, 0x99, 0xcc, 0xff}, color.RGBA{0x99, 0x99, 0xff, 0xff}, color.RGBA{0x99, 0xcc, 0x00, 0xff}, color.RGBA{0x99, 0xcc, 0x33, 0xff}, color.RGBA{0x99, 0xcc, 0x66, 0xff}, color.RGBA{0x99, 0xcc, 0x99, 0xff}, color.RGBA{0x99, 0xcc, 0xcc, 0xff}, color.RGBA{0x99, 0xcc, 0xff, 0xff}, color.RGBA{0x99, 0xff, 0x00, 0xff}, color.RGBA{0x99, 0xff, 0x33, 0xff}, color.RGBA{0x99, 0xff, 0x66, 0xff}, color.RGBA{0x99, 0xff, 0x99, 0xff}, color.RGBA{0x99, 0xff, 0xcc, 0xff}, color.RGBA{0x99, 0xff, 0xff, 0xff}, color.RGBA{0xcc, 0x00, 0x00, 0xff}, color.RGBA{0xcc, 0x00, 0x33, 0xff}, color.RGBA{0xcc, 0x00, 0x66, 0xff}, color.RGBA{0xcc, 0x00, 0x99, 0xff}, color.RGBA{0xcc, 0x00, 0xcc, 0xff}, color.RGBA{0xcc, 0x00, 0xff, 0xff}, color.RGBA{0xcc, 0x33, 0x00, 0xff}, color.RGBA{0xcc, 0x33, 0x33, 0xff}, color.RGBA{0xcc, 0x33, 0x66, 0xff}, color.RGBA{0xcc, 0x33, 0x99, 0xff}, color.RGBA{0xcc, 0x33, 0xcc, 0xff}, color.RGBA{0xcc, 0x33, 0xff, 0xff}, color.RGBA{0xcc, 0x66, 0x00, 0xff}, color.RGBA{0xcc, 0x66, 0x33, 0xff}, color.RGBA{0xcc, 0x66, 0x66, 0xff}, color.RGBA{0xcc, 0x66, 0x99, 0xff}, color.RGBA{0xcc, 0x66, 0xcc, 0xff}, color.RGBA{0xcc, 0x66, 0xff, 0xff}, color.RGBA{0xcc, 0x99, 0x00, 0xff}, color.RGBA{0xcc, 0x99, 0x33, 0xff}, color.RGBA{0xcc, 0x99, 0x66, 0xff}, color.RGBA{0xcc, 0x99, 0x99, 0xff}, color.RGBA{0xcc, 0x99, 0xcc, 0xff}, color.RGBA{0xcc, 0x99, 0xff, 0xff}, color.RGBA{0xcc, 0xcc, 0x00, 0xff}, color.RGBA{0xcc, 0xcc, 0x33, 0xff}, color.RGBA{0xcc, 0xcc, 0x66, 0xff}, color.RGBA{0xcc, 0xcc, 0x99, 0xff}, color.RGBA{0xcc, 0xcc, 0xcc, 0xff}, color.RGBA{0xcc, 0xcc, 0xff, 0xff}, color.RGBA{0xcc, 0xff, 0x00, 0xff}, color.RGBA{0xcc, 0xff, 0x33, 0xff}, color.RGBA{0xcc, 0xff, 0x66, 0xff}, color.RGBA{0xcc, 0xff, 0x99, 0xff}, color.RGBA{0xcc, 0xff, 0xcc, 0xff}, color.RGBA{0xcc, 0xff, 0xff, 0xff}, color.RGBA{0xff, 0x00, 0x00, 0xff}, color.RGBA{0xff, 0x00, 0x33, 0xff}, color.RGBA{0xff, 0x00, 0x66, 0xff}, color.RGBA{0xff, 0x00, 0x99, 0xff}, color.RGBA{0xff, 0x00, 0xcc, 0xff}, color.RGBA{0xff, 0x00, 0xff, 0xff}, color.RGBA{0xff, 0x33, 0x00, 0xff}, color.RGBA{0xff, 0x33, 0x33, 0xff}, color.RGBA{0xff, 0x33, 0x66, 0xff}, color.RGBA{0xff, 0x33, 0x99, 0xff}, color.RGBA{0xff, 0x33, 0xcc, 0xff}, color.RGBA{0xff, 0x33, 0xff, 0xff}, color.RGBA{0xff, 0x66, 0x00, 0xff}, color.RGBA{0xff, 0x66, 0x33, 0xff}, color.RGBA{0xff, 0x66, 0x66, 0xff}, color.RGBA{0xff, 0x66, 0x99, 0xff}, color.RGBA{0xff, 
0x66, 0xcc, 0xff}, color.RGBA{0xff, 0x66, 0xff, 0xff}, color.RGBA{0xff, 0x99, 0x00, 0xff}, color.RGBA{0xff, 0x99, 0x33, 0xff}, color.RGBA{0xff, 0x99, 0x66, 0xff}, color.RGBA{0xff, 0x99, 0x99, 0xff}, color.RGBA{0xff, 0x99, 0xcc, 0xff}, color.RGBA{0xff, 0x99, 0xff, 0xff}, color.RGBA{0xff, 0xcc, 0x00, 0xff}, color.RGBA{0xff, 0xcc, 0x33, 0xff}, color.RGBA{0xff, 0xcc, 0x66, 0xff}, color.RGBA{0xff, 0xcc, 0x99, 0xff}, color.RGBA{0xff, 0xcc, 0xcc, 0xff}, color.RGBA{0xff, 0xcc, 0xff, 0xff}, color.RGBA{0xff, 0xff, 0x00, 0xff}, color.RGBA{0xff, 0xff, 0x33, 0xff}, color.RGBA{0xff, 0xff, 0x66, 0xff}, color.RGBA{0xff, 0xff, 0x99, 0xff}, color.RGBA{0xff, 0xff, 0xcc, 0xff}, color.RGBA{0xff, 0xff, 0xff, 0xff}, } ```
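Plan9 and WebSafe are plain []color.Color slices; converting one to a color.Palette gives access to the Convert and Index methods from image/color. A minimal sketch (the sample color and the value shown in the comment are only illustrative):

```
package main

import (
	"fmt"
	"image/color"
	"image/color/palette"
)

func main() {
	p := color.Palette(palette.WebSafe)

	c := color.RGBA{0x12, 0x34, 0xd6, 0xff}
	fmt.Println(p.Convert(c)) // nearest web-safe color: {0 51 204 255}
	fmt.Println(p.Index(c))   // its index within the palette
}
```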
go Package png Package png ============ * `import "image/png"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package png implements a PNG image decoder and encoder. The PNG specification is at <https://www.w3.org/TR/PNG/>. Index ----- * [func Decode(r io.Reader) (image.Image, error)](#Decode) * [func DecodeConfig(r io.Reader) (image.Config, error)](#DecodeConfig) * [func Encode(w io.Writer, m image.Image) error](#Encode) * [type CompressionLevel](#CompressionLevel) * [type Encoder](#Encoder) * [func (enc \*Encoder) Encode(w io.Writer, m image.Image) error](#Encoder.Encode) * [type EncoderBuffer](#EncoderBuffer) * [type EncoderBufferPool](#EncoderBufferPool) * [type FormatError](#FormatError) * [func (e FormatError) Error() string](#FormatError.Error) * [type UnsupportedError](#UnsupportedError) * [func (e UnsupportedError) Error() string](#UnsupportedError.Error) ### Examples [Decode](#example_Decode) [Encode](#example_Encode) ### Package files paeth.go reader.go writer.go func Decode ----------- ``` func Decode(r io.Reader) (image.Image, error) ``` Decode reads a PNG image from r and returns it as an image.Image. The type of Image returned depends on the PNG contents. #### Example Code: ``` // This example uses png.Decode which can only decode PNG images. // Consider using the general image.Decode as it can sniff and decode any registered image format. img, err := png.Decode(gopherPNG()) if err != nil { log.Fatal(err) } levels := []string{" ", "░", "▒", "▓", "█"} for y := img.Bounds().Min.Y; y < img.Bounds().Max.Y; y++ { for x := img.Bounds().Min.X; x < img.Bounds().Max.X; x++ { c := color.GrayModel.Convert(img.At(x, y)).(color.Gray) level := c.Y / 51 // 51 * 5 = 255 if level == 5 { level-- } fmt.Print(levels[level]) } fmt.Print("\n") } ``` func DecodeConfig ----------------- ``` func DecodeConfig(r io.Reader) (image.Config, error) ``` DecodeConfig returns the color model and dimensions of a PNG image without decoding the entire image. func Encode ----------- ``` func Encode(w io.Writer, m image.Image) error ``` Encode writes the Image m to w in PNG format. Any Image may be encoded, but images that are not image.NRGBA might be encoded lossily. #### Example Code: ``` const width, height = 256, 256 // Create a colored image of the given width and height. img := image.NewNRGBA(image.Rect(0, 0, width, height)) for y := 0; y < height; y++ { for x := 0; x < width; x++ { img.Set(x, y, color.NRGBA{ R: uint8((x + y) & 255), G: uint8((x + y) << 1 & 255), B: uint8((x + y) << 2 & 255), A: 255, }) } } f, err := os.Create("image.png") if err != nil { log.Fatal(err) } if err := png.Encode(f, img); err != nil { f.Close() log.Fatal(err) } if err := f.Close(); err != nil { log.Fatal(err) } ``` type CompressionLevel 1.4 ------------------------- CompressionLevel indicates the compression level. ``` type CompressionLevel int ``` ``` const ( DefaultCompression CompressionLevel = 0 NoCompression CompressionLevel = -1 BestSpeed CompressionLevel = -2 BestCompression CompressionLevel = -3 ) ``` type Encoder 1.4 ---------------- Encoder configures encoding PNG images. ``` type Encoder struct { CompressionLevel CompressionLevel // BufferPool optionally specifies a buffer pool to get temporary // EncoderBuffers when encoding an image. BufferPool EncoderBufferPool // Go 1.9 } ``` ### func (\*Encoder) Encode 1.4 ``` func (enc *Encoder) Encode(w io.Writer, m image.Image) error ``` Encode writes the Image m to w in PNG format. 
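A minimal sketch of reusing an Encoder with a non-default CompressionLevel (standard library only; the all-black Gray image is just a placeholder payload):

```
package main

import (
	"bytes"
	"fmt"
	"image"
	"image/png"
	"log"
)

func main() {
	img := image.NewGray(image.Rect(0, 0, 64, 64))

	// An Encoder can be reused across images and carries the chosen
	// CompressionLevel (and, optionally, a BufferPool).
	enc := &png.Encoder{CompressionLevel: png.BestCompression}

	var buf bytes.Buffer
	if err := enc.Encode(&buf, img); err != nil {
		log.Fatal(err)
	}
	fmt.Println(buf.Len(), "bytes of PNG output")
}
```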
type EncoderBuffer 1.9 ---------------------- EncoderBuffer holds the buffers used for encoding PNG images. ``` type EncoderBuffer encoder ``` type EncoderBufferPool 1.9 -------------------------- EncoderBufferPool is an interface for getting and returning temporary instances of the EncoderBuffer struct. This can be used to reuse buffers when encoding multiple images. ``` type EncoderBufferPool interface { Get() *EncoderBuffer Put(*EncoderBuffer) } ``` type FormatError ---------------- A FormatError reports that the input is not a valid PNG. ``` type FormatError string ``` ### func (FormatError) Error ``` func (e FormatError) Error() string ``` type UnsupportedError --------------------- An UnsupportedError reports that the input uses a valid but unimplemented PNG feature. ``` type UnsupportedError string ``` ### func (UnsupportedError) Error ``` func (e UnsupportedError) Error() string ``` go Package jpeg Package jpeg ============= * `import "image/jpeg"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package jpeg implements a JPEG image decoder and encoder. JPEG is defined in ITU-T T.81: <https://www.w3.org/Graphics/JPEG/itu-t81.pdf>. Index ----- * [Constants](#pkg-constants) * [func Decode(r io.Reader) (image.Image, error)](#Decode) * [func DecodeConfig(r io.Reader) (image.Config, error)](#DecodeConfig) * [func Encode(w io.Writer, m image.Image, o \*Options) error](#Encode) * [type FormatError](#FormatError) * [func (e FormatError) Error() string](#FormatError.Error) * [type Options](#Options) * [type Reader](#Reader) * [type UnsupportedError](#UnsupportedError) * [func (e UnsupportedError) Error() string](#UnsupportedError.Error) ### Package files fdct.go huffman.go idct.go reader.go scan.go writer.go Constants --------- DefaultQuality is the default quality encoding parameter. ``` const DefaultQuality = 75 ``` func Decode ----------- ``` func Decode(r io.Reader) (image.Image, error) ``` Decode reads a JPEG image from r and returns it as an image.Image. func DecodeConfig ----------------- ``` func DecodeConfig(r io.Reader) (image.Config, error) ``` DecodeConfig returns the color model and dimensions of a JPEG image without decoding the entire image. func Encode ----------- ``` func Encode(w io.Writer, m image.Image, o *Options) error ``` Encode writes the Image m to w in JPEG 4:2:0 baseline format with the given options. Default parameters are used if a nil \*Options is passed. type FormatError ---------------- A FormatError reports that the input is not a valid JPEG. ``` type FormatError string ``` ### func (FormatError) Error ``` func (e FormatError) Error() string ``` type Options ------------ Options are the encoding parameters. Quality ranges from 1 to 100 inclusive, higher is better. ``` type Options struct { Quality int } ``` type Reader ----------- Deprecated: Reader is not used by the image/jpeg package and should not be used by others. It is kept for compatibility. ``` type Reader interface { io.ByteReader io.Reader } ``` type UnsupportedError --------------------- An UnsupportedError reports that the input uses a valid but unimplemented JPEG feature. ``` type UnsupportedError string ``` ### func (UnsupportedError) Error ``` func (e UnsupportedError) Error() string ``` go Package draw Package draw ============= * `import "image/draw"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package draw provides image composition functions. 
See "The Go image/draw package" for an introduction to this package: <https://golang.org/doc/articles/image_draw.html> Index ----- * [func Draw(dst Image, r image.Rectangle, src image.Image, sp image.Point, op Op)](#Draw) * [func DrawMask(dst Image, r image.Rectangle, src image.Image, sp image.Point, mask image.Image, mp image.Point, op Op)](#DrawMask) * [type Drawer](#Drawer) * [type Image](#Image) * [type Op](#Op) * [func (op Op) Draw(dst Image, r image.Rectangle, src image.Image, sp image.Point)](#Op.Draw) * [type Quantizer](#Quantizer) * [type RGBA64Image](#RGBA64Image) ### Examples [Drawer (FloydSteinberg)](#example_Drawer_floydSteinberg) ### Package files draw.go func Draw --------- ``` func Draw(dst Image, r image.Rectangle, src image.Image, sp image.Point, op Op) ``` Draw calls DrawMask with a nil mask. func DrawMask ------------- ``` func DrawMask(dst Image, r image.Rectangle, src image.Image, sp image.Point, mask image.Image, mp image.Point, op Op) ``` DrawMask aligns r.Min in dst with sp in src and mp in mask and then replaces the rectangle r in dst with the result of a Porter-Duff composition. A nil mask is treated as opaque. type Drawer 1.2 --------------- Drawer contains the Draw method. ``` type Drawer interface { // Draw aligns r.Min in dst with sp in src and then replaces the // rectangle r in dst with the result of drawing src on dst. Draw(dst Image, r image.Rectangle, src image.Image, sp image.Point) } ``` FloydSteinberg is a Drawer that is the Src Op with Floyd-Steinberg error diffusion. ``` var FloydSteinberg Drawer = floydSteinberg{} ``` #### Example (FloydSteinberg) Code: ``` const width = 130 const height = 50 im := image.NewGray(image.Rectangle{Max: image.Point{X: width, Y: height}}) for x := 0; x < width; x++ { for y := 0; y < height; y++ { dist := math.Sqrt(math.Pow(float64(x-width/2), 2)/3+math.Pow(float64(y-height/2), 2)) / (height / 1.5) * 255 var gray uint8 if dist > 255 { gray = 255 } else { gray = uint8(dist) } im.SetGray(x, y, color.Gray{Y: 255 - gray}) } } pi := image.NewPaletted(im.Bounds(), []color.Color{ color.Gray{Y: 255}, color.Gray{Y: 160}, color.Gray{Y: 70}, color.Gray{Y: 35}, color.Gray{Y: 0}, }) draw.FloydSteinberg.Draw(pi, im.Bounds(), im, image.ZP) shade := []string{" ", "░", "▒", "▓", "█"} for i, p := range pi.Pix { fmt.Print(shade[p]) if (i+1)%width == 0 { fmt.Print("\n") } } ``` type Image ---------- Image is an image.Image with a Set method to change a single pixel. ``` type Image interface { image.Image Set(x, y int, c color.Color) } ``` type Op ------- Op is a Porter-Duff compositing operator. ``` type Op int ``` ``` const ( // Over specifies ``(src in mask) over dst''. Over Op = iota // Src specifies ``src in mask''. Src ) ``` ### func (Op) Draw 1.2 ``` func (op Op) Draw(dst Image, r image.Rectangle, src image.Image, sp image.Point) ``` Draw implements the Drawer interface by calling the Draw function with this Op. type Quantizer 1.2 ------------------ Quantizer produces a palette for an image. ``` type Quantizer interface { // Quantize appends up to cap(p) - len(p) colors to p and returns the // updated palette suitable for converting m to a paletted image. Quantize(p color.Palette, m image.Image) color.Palette } ``` type RGBA64Image 1.17 --------------------- RGBA64Image extends both the Image and image.RGBA64Image interfaces with a SetRGBA64 method to change a single pixel. SetRGBA64 is equivalent to calling Set, but it can avoid allocations from converting concrete color types to the color.Color interface type. 
``` type RGBA64Image interface { image.RGBA64Image Set(x, y int, c color.Color) SetRGBA64(x, y int, c color.RGBA64) } ``` go Package gif Package gif ============ * `import "image/gif"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package gif implements a GIF image decoder and encoder. The GIF specification is at <https://www.w3.org/Graphics/GIF/spec-gif89a.txt>. Index ----- * [Constants](#pkg-constants) * [func Decode(r io.Reader) (image.Image, error)](#Decode) * [func DecodeConfig(r io.Reader) (image.Config, error)](#DecodeConfig) * [func Encode(w io.Writer, m image.Image, o \*Options) error](#Encode) * [func EncodeAll(w io.Writer, g \*GIF) error](#EncodeAll) * [type GIF](#GIF) * [func DecodeAll(r io.Reader) (\*GIF, error)](#DecodeAll) * [type Options](#Options) ### Package files reader.go writer.go Constants --------- Disposal Methods. ``` const ( DisposalNone = 0x01 DisposalBackground = 0x02 DisposalPrevious = 0x03 ) ``` func Decode ----------- ``` func Decode(r io.Reader) (image.Image, error) ``` Decode reads a GIF image from r and returns the first embedded image as an image.Image. func DecodeConfig ----------------- ``` func DecodeConfig(r io.Reader) (image.Config, error) ``` DecodeConfig returns the global color model and dimensions of a GIF image without decoding the entire image. func Encode 1.2 --------------- ``` func Encode(w io.Writer, m image.Image, o *Options) error ``` Encode writes the Image m to w in GIF format. func EncodeAll 1.2 ------------------ ``` func EncodeAll(w io.Writer, g *GIF) error ``` EncodeAll writes the images in g to w in GIF format with the given loop count and delay between frames. type GIF -------- GIF represents the possibly multiple images stored in a GIF file. ``` type GIF struct { Image []*image.Paletted // The successive images. Delay []int // The successive delay times, one per frame, in 100ths of a second. // LoopCount controls the number of times an animation will be // restarted during display. // A LoopCount of 0 means to loop forever. // A LoopCount of -1 means to show each frame only once. // Otherwise, the animation is looped LoopCount+1 times. LoopCount int // Disposal is the successive disposal methods, one per frame. For // backwards compatibility, a nil Disposal is valid to pass to EncodeAll, // and implies that each frame's disposal method is 0 (no disposal // specified). Disposal []byte // Go 1.5 // Config is the global color table (palette), width and height. A nil or // empty-color.Palette Config.ColorModel means that each frame has its own // color table and there is no global color table. Each frame's bounds must // be within the rectangle defined by the two points (0, 0) and // (Config.Width, Config.Height). // // For backwards compatibility, a zero-valued Config is valid to pass to // EncodeAll, and implies that the overall GIF's width and height equals // the first frame's bounds' Rectangle.Max point. Config image.Config // Go 1.5 // BackgroundIndex is the background index in the global color table, for // use with the DisposalBackground disposal method. BackgroundIndex byte // Go 1.5 } ``` ### func DecodeAll ``` func DecodeAll(r io.Reader) (*GIF, error) ``` DecodeAll reads a GIF image from r and returns the sequential frames and timing information. type Options 1.2 ---------------- Options are the encoding parameters. ``` type Options struct { // NumColors is the maximum number of colors used in the image. // It ranges from 1 to 256. 
NumColors int // Quantizer is used to produce a palette with size NumColors. // palette.Plan9 is used in place of a nil Quantizer. Quantizer draw.Quantizer // Drawer is used to convert the source image to the desired palette. // draw.FloydSteinberg is used in place of a nil Drawer. Drawer draw.Drawer } ``` go Package time Package time ============= * `import "time"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package time provides functionality for measuring and displaying time. The calendrical calculations always assume a Gregorian calendar, with no leap seconds. ### Monotonic Clocks Operating systems provide both a “wall clock,” which is subject to changes for clock synchronization, and a “monotonic clock,” which is not. The general rule is that the wall clock is for telling time and the monotonic clock is for measuring time. Rather than split the API, in this package the Time returned by time.Now contains both a wall clock reading and a monotonic clock reading; later time-telling operations use the wall clock reading, but later time-measuring operations, specifically comparisons and subtractions, use the monotonic clock reading. For example, this code always computes a positive elapsed time of approximately 20 milliseconds, even if the wall clock is changed during the operation being timed: ``` start := time.Now() ... operation that takes 20 milliseconds ... t := time.Now() elapsed := t.Sub(start) ``` Other idioms, such as time.Since(start), time.Until(deadline), and time.Now().Before(deadline), are similarly robust against wall clock resets. The rest of this section gives the precise details of how operations use monotonic clocks, but understanding those details is not required to use this package. The Time returned by time.Now contains a monotonic clock reading. If Time t has a monotonic clock reading, t.Add adds the same duration to both the wall clock and monotonic clock readings to compute the result. Because t.AddDate(y, m, d), t.Round(d), and t.Truncate(d) are wall time computations, they always strip any monotonic clock reading from their results. Because t.In, t.Local, and t.UTC are used for their effect on the interpretation of the wall time, they also strip any monotonic clock reading from their results. The canonical way to strip a monotonic clock reading is to use t = t.Round(0). If Times t and u both contain monotonic clock readings, the operations t.After(u), t.Before(u), t.Equal(u), t.Compare(u), and t.Sub(u) are carried out using the monotonic clock readings alone, ignoring the wall clock readings. If either t or u contains no monotonic clock reading, these operations fall back to using the wall clock readings. On some systems the monotonic clock will stop if the computer goes to sleep. On such a system, t.Sub(u) may not accurately reflect the actual time that passed between t and u. Because the monotonic clock reading has no meaning outside the current process, the serialized forms generated by t.GobEncode, t.MarshalBinary, t.MarshalJSON, and t.MarshalText omit the monotonic clock reading, and t.Format provides no format for it. Similarly, the constructors time.Date, time.Parse, time.ParseInLocation, and time.Unix, as well as the unmarshalers t.GobDecode, t.UnmarshalBinary, t.UnmarshalJSON, and t.UnmarshalText always create times with no monotonic clock reading. The monotonic clock reading exists only in Time values. 
It is not a part of Duration values or the Unix times returned by t.Unix and friends. Note that the Go == operator compares not just the time instant but also the Location and the monotonic clock reading. See the documentation for the Time type for a discussion of equality testing for Time values. For debugging, the result of t.String does include the monotonic clock reading if present. If t != u because of different monotonic clock readings, that difference will be visible when printing t.String() and u.String(). Index ----- * [Constants](#pkg-constants) * [func After(d Duration) <-chan Time](#After) * [func Sleep(d Duration)](#Sleep) * [func Tick(d Duration) <-chan Time](#Tick) * [type Duration](#Duration) * [func ParseDuration(s string) (Duration, error)](#ParseDuration) * [func Since(t Time) Duration](#Since) * [func Until(t Time) Duration](#Until) * [func (d Duration) Abs() Duration](#Duration.Abs) * [func (d Duration) Hours() float64](#Duration.Hours) * [func (d Duration) Microseconds() int64](#Duration.Microseconds) * [func (d Duration) Milliseconds() int64](#Duration.Milliseconds) * [func (d Duration) Minutes() float64](#Duration.Minutes) * [func (d Duration) Nanoseconds() int64](#Duration.Nanoseconds) * [func (d Duration) Round(m Duration) Duration](#Duration.Round) * [func (d Duration) Seconds() float64](#Duration.Seconds) * [func (d Duration) String() string](#Duration.String) * [func (d Duration) Truncate(m Duration) Duration](#Duration.Truncate) * [type Location](#Location) * [func FixedZone(name string, offset int) \*Location](#FixedZone) * [func LoadLocation(name string) (\*Location, error)](#LoadLocation) * [func LoadLocationFromTZData(name string, data []byte) (\*Location, error)](#LoadLocationFromTZData) * [func (l \*Location) String() string](#Location.String) * [type Month](#Month) * [func (m Month) String() string](#Month.String) * [type ParseError](#ParseError) * [func (e \*ParseError) Error() string](#ParseError.Error) * [type Ticker](#Ticker) * [func NewTicker(d Duration) \*Ticker](#NewTicker) * [func (t \*Ticker) Reset(d Duration)](#Ticker.Reset) * [func (t \*Ticker) Stop()](#Ticker.Stop) * [type Time](#Time) * [func Date(year int, month Month, day, hour, min, sec, nsec int, loc \*Location) Time](#Date) * [func Now() Time](#Now) * [func Parse(layout, value string) (Time, error)](#Parse) * [func ParseInLocation(layout, value string, loc \*Location) (Time, error)](#ParseInLocation) * [func Unix(sec int64, nsec int64) Time](#Unix) * [func UnixMicro(usec int64) Time](#UnixMicro) * [func UnixMilli(msec int64) Time](#UnixMilli) * [func (t Time) Add(d Duration) Time](#Time.Add) * [func (t Time) AddDate(years int, months int, days int) Time](#Time.AddDate) * [func (t Time) After(u Time) bool](#Time.After) * [func (t Time) AppendFormat(b []byte, layout string) []byte](#Time.AppendFormat) * [func (t Time) Before(u Time) bool](#Time.Before) * [func (t Time) Clock() (hour, min, sec int)](#Time.Clock) * [func (t Time) Compare(u Time) int](#Time.Compare) * [func (t Time) Date() (year int, month Month, day int)](#Time.Date) * [func (t Time) Day() int](#Time.Day) * [func (t Time) Equal(u Time) bool](#Time.Equal) * [func (t Time) Format(layout string) string](#Time.Format) * [func (t Time) GoString() string](#Time.GoString) * [func (t \*Time) GobDecode(data []byte) error](#Time.GobDecode) * [func (t Time) GobEncode() ([]byte, error)](#Time.GobEncode) * [func (t Time) Hour() int](#Time.Hour) * [func (t Time) ISOWeek() (year, week int)](#Time.ISOWeek) * [func (t Time) In(loc 
\*Location) Time](#Time.In) * [func (t Time) IsDST() bool](#Time.IsDST) * [func (t Time) IsZero() bool](#Time.IsZero) * [func (t Time) Local() Time](#Time.Local) * [func (t Time) Location() \*Location](#Time.Location) * [func (t Time) MarshalBinary() ([]byte, error)](#Time.MarshalBinary) * [func (t Time) MarshalJSON() ([]byte, error)](#Time.MarshalJSON) * [func (t Time) MarshalText() ([]byte, error)](#Time.MarshalText) * [func (t Time) Minute() int](#Time.Minute) * [func (t Time) Month() Month](#Time.Month) * [func (t Time) Nanosecond() int](#Time.Nanosecond) * [func (t Time) Round(d Duration) Time](#Time.Round) * [func (t Time) Second() int](#Time.Second) * [func (t Time) String() string](#Time.String) * [func (t Time) Sub(u Time) Duration](#Time.Sub) * [func (t Time) Truncate(d Duration) Time](#Time.Truncate) * [func (t Time) UTC() Time](#Time.UTC) * [func (t Time) Unix() int64](#Time.Unix) * [func (t Time) UnixMicro() int64](#Time.UnixMicro) * [func (t Time) UnixMilli() int64](#Time.UnixMilli) * [func (t Time) UnixNano() int64](#Time.UnixNano) * [func (t \*Time) UnmarshalBinary(data []byte) error](#Time.UnmarshalBinary) * [func (t \*Time) UnmarshalJSON(data []byte) error](#Time.UnmarshalJSON) * [func (t \*Time) UnmarshalText(data []byte) error](#Time.UnmarshalText) * [func (t Time) Weekday() Weekday](#Time.Weekday) * [func (t Time) Year() int](#Time.Year) * [func (t Time) YearDay() int](#Time.YearDay) * [func (t Time) Zone() (name string, offset int)](#Time.Zone) * [func (t Time) ZoneBounds() (start, end Time)](#Time.ZoneBounds) * [type Timer](#Timer) * [func AfterFunc(d Duration, f func()) \*Timer](#AfterFunc) * [func NewTimer(d Duration) \*Timer](#NewTimer) * [func (t \*Timer) Reset(d Duration) bool](#Timer.Reset) * [func (t \*Timer) Stop() bool](#Timer.Stop) * [type Weekday](#Weekday) * [func (d Weekday) String() string](#Weekday.String) ### Examples [After](#example_After) [Date](#example_Date) [Duration](#example_Duration) [Duration.Hours](#example_Duration_Hours) [Duration.Microseconds](#example_Duration_Microseconds) [Duration.Milliseconds](#example_Duration_Milliseconds) [Duration.Minutes](#example_Duration_Minutes) [Duration.Nanoseconds](#example_Duration_Nanoseconds) [Duration.Round](#example_Duration_Round) [Duration.Seconds](#example_Duration_Seconds) [Duration.String](#example_Duration_String) [Duration.Truncate](#example_Duration_Truncate) [FixedZone](#example_FixedZone) [LoadLocation](#example_LoadLocation) [Location](#example_Location) [Month](#example_Month) [NewTicker](#example_NewTicker) [Parse](#example_Parse) [ParseDuration](#example_ParseDuration) [ParseInLocation](#example_ParseInLocation) [Sleep](#example_Sleep) [Tick](#example_Tick) [Time.Add](#example_Time_Add) [Time.AddDate](#example_Time_AddDate) [Time.After](#example_Time_After) [Time.AppendFormat](#example_Time_AppendFormat) [Time.Before](#example_Time_Before) [Time.Date](#example_Time_Date) [Time.Day](#example_Time_Day) [Time.Equal](#example_Time_Equal) [Time.Format](#example_Time_Format) [Time.Format (Pad)](#example_Time_Format_pad) [Time.GoString](#example_Time_GoString) [Time.Round](#example_Time_Round) [Time.String](#example_Time_String) [Time.Sub](#example_Time_Sub) [Time.Truncate](#example_Time_Truncate) [Time.Unix](#example_Time_Unix) [Unix](#example_Unix) [UnixMicro](#example_UnixMicro) [UnixMilli](#example_UnixMilli) ### Package files format.go format\_rfc3339.go sleep.go sys\_unix.go tick.go time.go zoneinfo.go zoneinfo\_goroot.go zoneinfo\_read.go zoneinfo\_unix.go Constants --------- These are 
predefined layouts for use in Time.Format and time.Parse. The reference time used in these layouts is the specific time stamp: ``` 01/02 03:04:05PM '06 -0700 ``` (January 2, 15:04:05, 2006, in time zone seven hours west of GMT). That value is recorded as the constant named Layout, listed below. As a Unix time, this is 1136239445. Since MST is GMT-0700, the reference would be printed by the Unix date command as: ``` Mon Jan 2 15:04:05 MST 2006 ``` It is a regrettable historic error that the date uses the American convention of putting the numerical month before the day. The example for Time.Format demonstrates the working of the layout string in detail and is a good reference. Note that the RFC822, RFC850, and RFC1123 formats should be applied only to local times. Applying them to UTC times will use "UTC" as the time zone abbreviation, while strictly speaking those RFCs require the use of "GMT" in that case. In general RFC1123Z should be used instead of RFC1123 for servers that insist on that format, and RFC3339 should be preferred for new protocols. RFC3339, RFC822, RFC822Z, RFC1123, and RFC1123Z are useful for formatting; when used with time.Parse they do not accept all the time formats permitted by the RFCs and they do accept time formats not formally defined. The RFC3339Nano format removes trailing zeros from the seconds field and thus may not sort correctly once formatted. Most programs can use one of the defined constants as the layout passed to Format or Parse. The rest of this comment can be ignored unless you are creating a custom layout string. To define your own format, write down what the reference time would look like formatted your way; see the values of constants like ANSIC, StampMicro or Kitchen for examples. The model is to demonstrate what the reference time looks like so that the Format and Parse methods can apply the same transformation to a general time value. Here is a summary of the components of a layout string. Each element shows by example the formatting of an element of the reference time. Only these values are recognized. Text in the layout string that is not recognized as part of the reference time is echoed verbatim during Format and expected to appear verbatim in the input to Parse. ``` Year: "2006" "06" Month: "Jan" "January" "01" "1" Day of the week: "Mon" "Monday" Day of the month: "2" "_2" "02" Day of the year: "__2" "002" Hour: "15" "3" "03" (PM or AM) Minute: "4" "04" Second: "5" "05" AM/PM mark: "PM" ``` Numeric time zone offsets format as follows: ``` "-0700" ±hhmm "-07:00" ±hh:mm "-07" ±hh "-070000" ±hhmmss "-07:00:00" ±hh:mm:ss ``` Replacing the sign in the format with a Z triggers the ISO 8601 behavior of printing Z instead of an offset for the UTC zone. Thus: ``` "Z0700" Z or ±hhmm "Z07:00" Z or ±hh:mm "Z07" Z or ±hh "Z070000" Z or ±hhmmss "Z07:00:00" Z or ±hh:mm:ss ``` Within the format string, the underscores in "\_2" and "\_\_2" represent spaces that may be replaced by digits if the following number has multiple digits, for compatibility with fixed-width Unix time formats. A leading zero represents a zero-padded value. The formats \_\_2 and 002 are space-padded and zero-padded three-character day of year; there is no unpadded day of year format. A comma or decimal point followed by one or more zeros represents a fractional second, printed to the given number of decimal places. A comma or decimal point followed by one or more nines represents a fractional second, printed to the given number of decimal places, with trailing zeros removed. 
For example "15:04:05,000" or "15:04:05.000" formats or parses with millisecond precision. Some valid layouts are invalid time values for time.Parse, due to formats such as \_ for space padding and Z for zone information. ``` const ( Layout = "01/02 03:04:05PM '06 -0700" // The reference time, in numerical order. ANSIC = "Mon Jan _2 15:04:05 2006" UnixDate = "Mon Jan _2 15:04:05 MST 2006" RubyDate = "Mon Jan 02 15:04:05 -0700 2006" RFC822 = "02 Jan 06 15:04 MST" RFC822Z = "02 Jan 06 15:04 -0700" // RFC822 with numeric zone RFC850 = "Monday, 02-Jan-06 15:04:05 MST" RFC1123 = "Mon, 02 Jan 2006 15:04:05 MST" RFC1123Z = "Mon, 02 Jan 2006 15:04:05 -0700" // RFC1123 with numeric zone RFC3339 = "2006-01-02T15:04:05Z07:00" RFC3339Nano = "2006-01-02T15:04:05.999999999Z07:00" Kitchen = "3:04PM" // Handy time stamps. Stamp = "Jan _2 15:04:05" StampMilli = "Jan _2 15:04:05.000" StampMicro = "Jan _2 15:04:05.000000" StampNano = "Jan _2 15:04:05.000000000" DateTime = "2006-01-02 15:04:05" DateOnly = "2006-01-02" TimeOnly = "15:04:05" ) ``` Common durations. There is no definition for units of Day or larger to avoid confusion across daylight savings time zone transitions. To count the number of units in a Duration, divide: ``` second := time.Second fmt.Print(int64(second/time.Millisecond)) // prints 1000 ``` To convert an integer number of units to a Duration, multiply: ``` seconds := 10 fmt.Print(time.Duration(seconds)*time.Second) // prints 10s ``` ``` const ( Nanosecond Duration = 1 Microsecond = 1000 * Nanosecond Millisecond = 1000 * Microsecond Second = 1000 * Millisecond Minute = 60 * Second Hour = 60 * Minute ) ``` func After ---------- ``` func After(d Duration) <-chan Time ``` After waits for the duration to elapse and then sends the current time on the returned channel. It is equivalent to NewTimer(d).C. The underlying Timer is not recovered by the garbage collector until the timer fires. If efficiency is a concern, use NewTimer instead and call Timer.Stop if the timer is no longer needed. #### Example Code: ``` select { case m := <-c: handle(m) case <-time.After(10 * time.Second): fmt.Println("timed out") } ``` func Sleep ---------- ``` func Sleep(d Duration) ``` Sleep pauses the current goroutine for at least the duration d. A negative or zero duration causes Sleep to return immediately. #### Example Code: ``` time.Sleep(100 * time.Millisecond) ``` func Tick --------- ``` func Tick(d Duration) <-chan Time ``` Tick is a convenience wrapper for NewTicker providing access to the ticking channel only. While Tick is useful for clients that have no need to shut down the Ticker, be aware that without a way to shut it down the underlying Ticker cannot be recovered by the garbage collector; it "leaks". Unlike NewTicker, Tick will return nil if d <= 0. #### Example Code: ``` c := time.Tick(5 * time.Second) for next := range c { fmt.Printf("%v %s\n", next, statusUpdate()) } ``` type Duration ------------- A Duration represents the elapsed time between two instants as an int64 nanosecond count. The representation limits the largest representable duration to approximately 290 years. ``` type Duration int64 ``` #### Example Code: ``` t0 := time.Now() expensiveCall() t1 := time.Now() fmt.Printf("The call took %v to run.\n", t1.Sub(t0)) ``` ### func ParseDuration ``` func ParseDuration(s string) (Duration, error) ``` ParseDuration parses a duration string. A duration string is a possibly signed sequence of decimal numbers, each with optional fraction and a unit suffix, such as "300ms", "-1.5h" or "2h45m". 
Valid time units are "ns", "us" (or "µs"), "ms", "s", "m", "h". #### Example Code: ``` hours, _ := time.ParseDuration("10h") complex, _ := time.ParseDuration("1h10m10s") micro, _ := time.ParseDuration("1µs") // The package also accepts the incorrect but common prefix u for micro. micro2, _ := time.ParseDuration("1us") fmt.Println(hours) fmt.Println(complex) fmt.Printf("There are %.0f seconds in %v.\n", complex.Seconds(), complex) fmt.Printf("There are %d nanoseconds in %v.\n", micro.Nanoseconds(), micro) fmt.Printf("There are %6.2e seconds in %v.\n", micro2.Seconds(), micro) ``` Output: ``` 10h0m0s 1h10m10s There are 4210 seconds in 1h10m10s. There are 1000 nanoseconds in 1µs. There are 1.00e-06 seconds in 1µs. ``` ### func Since ``` func Since(t Time) Duration ``` Since returns the time elapsed since t. It is shorthand for time.Now().Sub(t). ### func Until 1.8 ``` func Until(t Time) Duration ``` Until returns the duration until t. It is shorthand for t.Sub(time.Now()). ### func (Duration) Abs 1.19 ``` func (d Duration) Abs() Duration ``` Abs returns the absolute value of d. As a special case, math.MinInt64 is converted to math.MaxInt64. ### func (Duration) Hours ``` func (d Duration) Hours() float64 ``` Hours returns the duration as a floating point number of hours. #### Example Code: ``` h, _ := time.ParseDuration("4h30m") fmt.Printf("I've got %.1f hours of work left.", h.Hours()) ``` Output: ``` I've got 4.5 hours of work left. ``` ### func (Duration) Microseconds 1.13 ``` func (d Duration) Microseconds() int64 ``` Microseconds returns the duration as an integer microsecond count. #### Example Code: ``` u, _ := time.ParseDuration("1s") fmt.Printf("One second is %d microseconds.\n", u.Microseconds()) ``` Output: ``` One second is 1000000 microseconds. ``` ### func (Duration) Milliseconds 1.13 ``` func (d Duration) Milliseconds() int64 ``` Milliseconds returns the duration as an integer millisecond count. #### Example Code: ``` u, _ := time.ParseDuration("1s") fmt.Printf("One second is %d milliseconds.\n", u.Milliseconds()) ``` Output: ``` One second is 1000 milliseconds. ``` ### func (Duration) Minutes ``` func (d Duration) Minutes() float64 ``` Minutes returns the duration as a floating point number of minutes. #### Example Code: ``` m, _ := time.ParseDuration("1h30m") fmt.Printf("The movie is %.0f minutes long.", m.Minutes()) ``` Output: ``` The movie is 90 minutes long. ``` ### func (Duration) Nanoseconds ``` func (d Duration) Nanoseconds() int64 ``` Nanoseconds returns the duration as an integer nanosecond count. #### Example Code: ``` u, _ := time.ParseDuration("1µs") fmt.Printf("One microsecond is %d nanoseconds.\n", u.Nanoseconds()) ``` Output: ``` One microsecond is 1000 nanoseconds. ``` ### func (Duration) Round 1.9 ``` func (d Duration) Round(m Duration) Duration ``` Round returns the result of rounding d to the nearest multiple of m. The rounding behavior for halfway values is to round away from zero. If the result exceeds the maximum (or minimum) value that can be stored in a Duration, Round returns the maximum (or minimum) duration. If m <= 0, Round returns d unchanged. 
#### Example Code: ``` d, err := time.ParseDuration("1h15m30.918273645s") if err != nil { panic(err) } round := []time.Duration{ time.Nanosecond, time.Microsecond, time.Millisecond, time.Second, 2 * time.Second, time.Minute, 10 * time.Minute, time.Hour, } for _, r := range round { fmt.Printf("d.Round(%6s) = %s\n", r, d.Round(r).String()) } ``` Output: ``` d.Round( 1ns) = 1h15m30.918273645s d.Round( 1µs) = 1h15m30.918274s d.Round( 1ms) = 1h15m30.918s d.Round( 1s) = 1h15m31s d.Round( 2s) = 1h15m30s d.Round( 1m0s) = 1h16m0s d.Round( 10m0s) = 1h20m0s d.Round(1h0m0s) = 1h0m0s ``` ### func (Duration) Seconds ``` func (d Duration) Seconds() float64 ``` Seconds returns the duration as a floating point number of seconds. #### Example Code: ``` m, _ := time.ParseDuration("1m30s") fmt.Printf("Take off in t-%.0f seconds.", m.Seconds()) ``` Output: ``` Take off in t-90 seconds. ``` ### func (Duration) String ``` func (d Duration) String() string ``` String returns a string representing the duration in the form "72h3m0.5s". Leading zero units are omitted. As a special case, durations less than one second format use a smaller unit (milli-, micro-, or nanoseconds) to ensure that the leading digit is non-zero. The zero duration formats as 0s. #### Example Code: ``` fmt.Println(1*time.Hour + 2*time.Minute + 300*time.Millisecond) fmt.Println(300 * time.Millisecond) ``` Output: ``` 1h2m0.3s 300ms ``` ### func (Duration) Truncate 1.9 ``` func (d Duration) Truncate(m Duration) Duration ``` Truncate returns the result of rounding d toward zero to a multiple of m. If m <= 0, Truncate returns d unchanged. #### Example Code: ``` d, err := time.ParseDuration("1h15m30.918273645s") if err != nil { panic(err) } trunc := []time.Duration{ time.Nanosecond, time.Microsecond, time.Millisecond, time.Second, 2 * time.Second, time.Minute, 10 * time.Minute, time.Hour, } for _, t := range trunc { fmt.Printf("d.Truncate(%6s) = %s\n", t, d.Truncate(t).String()) } ``` Output: ``` d.Truncate( 1ns) = 1h15m30.918273645s d.Truncate( 1µs) = 1h15m30.918273s d.Truncate( 1ms) = 1h15m30.918s d.Truncate( 1s) = 1h15m30s d.Truncate( 2s) = 1h15m30s d.Truncate( 1m0s) = 1h15m0s d.Truncate( 10m0s) = 1h10m0s d.Truncate(1h0m0s) = 1h0m0s ``` type Location ------------- A Location maps time instants to the zone in use at that time. Typically, the Location represents the collection of time offsets in use in a geographical area. For many Locations the time offset varies depending on whether daylight savings time is in use at the time instant. ``` type Location struct { // contains filtered or unexported fields } ``` Local represents the system's local time zone. On Unix systems, Local consults the TZ environment variable to find the time zone to use. No TZ means use the system default /etc/localtime. TZ="" means use UTC. TZ="foo" means use file foo in the system timezone directory. ``` var Local *Location = &localLoc ``` UTC represents Universal Coordinated Time (UTC). ``` var UTC *Location = &utcLoc ``` #### Example Code: ``` // China doesn't have daylight saving. It uses a fixed 8 hour offset from UTC. secondsEastOfUTC := int((8 * time.Hour).Seconds()) beijing := time.FixedZone("Beijing Time", secondsEastOfUTC) // If the system has a timezone database present, it's possible to load a location // from that, e.g.: // newYork, err := time.LoadLocation("America/New_York") // Creating a time requires a location. Common locations are time.Local and time.UTC. 
timeInUTC := time.Date(2009, 1, 1, 12, 0, 0, 0, time.UTC) sameTimeInBeijing := time.Date(2009, 1, 1, 20, 0, 0, 0, beijing) // Although the UTC clock time is 1200 and the Beijing clock time is 2000, Beijing is // 8 hours ahead so the two dates actually represent the same instant. timesAreEqual := timeInUTC.Equal(sameTimeInBeijing) fmt.Println(timesAreEqual) ``` Output: ``` true ``` ### func FixedZone ``` func FixedZone(name string, offset int) *Location ``` FixedZone returns a Location that always uses the given zone name and offset (seconds east of UTC). #### Example Code: ``` loc := time.FixedZone("UTC-8", -8*60*60) t := time.Date(2009, time.November, 10, 23, 0, 0, 0, loc) fmt.Println("The time is:", t.Format(time.RFC822)) ``` Output: ``` The time is: 10 Nov 09 23:00 UTC-8 ``` ### func LoadLocation ``` func LoadLocation(name string) (*Location, error) ``` LoadLocation returns the Location with the given name. If the name is "" or "UTC", LoadLocation returns UTC. If the name is "Local", LoadLocation returns Local. Otherwise, the name is taken to be a location name corresponding to a file in the IANA Time Zone database, such as "America/New\_York". LoadLocation looks for the IANA Time Zone database in the following locations in order: * the directory or uncompressed zip file named by the ZONEINFO environment variable * on a Unix system, the system standard installation location * $GOROOT/lib/time/zoneinfo.zip * the time/tzdata package, if it was imported #### Example Code: ``` location, err := time.LoadLocation("America/Los_Angeles") if err != nil { panic(err) } timeInUTC := time.Date(2018, 8, 30, 12, 0, 0, 0, time.UTC) fmt.Println(timeInUTC.In(location)) ``` Output: ``` 2018-08-30 05:00:00 -0700 PDT ``` ### func LoadLocationFromTZData 1.10 ``` func LoadLocationFromTZData(name string, data []byte) (*Location, error) ``` LoadLocationFromTZData returns a Location with the given name initialized from the IANA Time Zone database-formatted data. The data should be in the format of a standard IANA time zone file (for example, the content of /etc/localtime on Unix systems). ### func (\*Location) String ``` func (l *Location) String() string ``` String returns a descriptive name for the time zone information, corresponding to the name argument to LoadLocation or FixedZone. type Month ---------- A Month specifies a month of the year (January = 1, ...). ``` type Month int ``` ``` const ( January Month = 1 + iota February March April May June July August September October November December ) ``` #### Example Code: ``` _, month, day := time.Now().Date() if month == time.November && day == 10 { fmt.Println("Happy Go day!") } ``` ### func (Month) String ``` func (m Month) String() string ``` String returns the English name of the month ("January", "February", ...). type ParseError --------------- ParseError describes a problem parsing a time string. ``` type ParseError struct { Layout string Value string LayoutElem string ValueElem string Message string } ``` ### func (\*ParseError) Error ``` func (e *ParseError) Error() string ``` Error returns the string representation of a ParseError. type Ticker ----------- A Ticker holds a channel that delivers “ticks” of a clock at intervals. ``` type Ticker struct { C <-chan Time // The channel on which the ticks are delivered. // contains filtered or unexported fields } ``` ### func NewTicker ``` func NewTicker(d Duration) *Ticker ``` NewTicker returns a new Ticker containing a channel that will send the current time on the channel after each tick. 
The period of the ticks is specified by the duration argument. The ticker will adjust the time interval or drop ticks to make up for slow receivers. The duration d must be greater than zero; if not, NewTicker will panic. Stop the ticker to release associated resources. #### Example Code: ``` ticker := time.NewTicker(time.Second) defer ticker.Stop() done := make(chan bool) go func() { time.Sleep(10 * time.Second) done <- true }() for { select { case <-done: fmt.Println("Done!") return case t := <-ticker.C: fmt.Println("Current time: ", t) } } ``` ### func (\*Ticker) Reset 1.15 ``` func (t *Ticker) Reset(d Duration) ``` Reset stops a ticker and resets its period to the specified duration. The next tick will arrive after the new period elapses. The duration d must be greater than zero; if not, Reset will panic. ### func (\*Ticker) Stop ``` func (t *Ticker) Stop() ``` Stop turns off a ticker. After Stop, no more ticks will be sent. Stop does not close the channel, to prevent a concurrent goroutine reading from the channel from seeing an erroneous "tick". type Time --------- A Time represents an instant in time with nanosecond precision. Programs using times should typically store and pass them as values, not pointers. That is, time variables and struct fields should be of type time.Time, not \*time.Time. A Time value can be used by multiple goroutines simultaneously except that the methods GobDecode, UnmarshalBinary, UnmarshalJSON and UnmarshalText are not concurrency-safe. Time instants can be compared using the Before, After, and Equal methods. The Sub method subtracts two instants, producing a Duration. The Add method adds a Time and a Duration, producing a Time. The zero value of type Time is January 1, year 1, 00:00:00.000000000 UTC. As this time is unlikely to come up in practice, the IsZero method gives a simple way of detecting a time that has not been initialized explicitly. Each Time has associated with it a Location, consulted when computing the presentation form of the time, such as in the Format, Hour, and Year methods. The methods Local, UTC, and In return a Time with a specific location. Changing the location in this way changes only the presentation; it does not change the instant in time being denoted and therefore does not affect the computations described in earlier paragraphs. Representations of a Time value saved by the GobEncode, MarshalBinary, MarshalJSON, and MarshalText methods store the Time.Location's offset, but not the location name. They therefore lose information about Daylight Saving Time. In addition to the required “wall clock” reading, a Time may contain an optional reading of the current process's monotonic clock, to provide additional precision for comparison or subtraction. See the “Monotonic Clocks” section in the package documentation for details. Note that the Go == operator compares not just the time instant but also the Location and the monotonic clock reading. Therefore, Time values should not be used as map or database keys without first guaranteeing that the identical Location has been set for all values, which can be achieved through use of the UTC or Local method, and that the monotonic clock reading has been stripped by setting t = t.Round(0). In general, prefer t.Equal(u) to t == u, since t.Equal uses the most accurate comparison available and correctly handles the case when only one of its arguments has a monotonic clock reading. 
``` type Time struct { // contains filtered or unexported fields } ``` ### func Date ``` func Date(year int, month Month, day, hour, min, sec, nsec int, loc *Location) Time ``` Date returns the Time corresponding to ``` yyyy-mm-dd hh:mm:ss + nsec nanoseconds ``` in the appropriate zone for that time in the given location. The month, day, hour, min, sec, and nsec values may be outside their usual ranges and will be normalized during the conversion. For example, October 32 converts to November 1. A daylight savings time transition skips or repeats times. For example, in the United States, March 13, 2011 2:15am never occurred, while November 6, 2011 1:15am occurred twice. In such cases, the choice of time zone, and therefore the time, is not well-defined. Date returns a time that is correct in one of the two zones involved in the transition, but it does not guarantee which. Date panics if loc is nil. #### Example Code: ``` t := time.Date(2009, time.November, 10, 23, 0, 0, 0, time.UTC) fmt.Printf("Go launched at %s\n", t.Local()) ``` Output: ``` Go launched at 2009-11-10 15:00:00 -0800 PST ``` ### func Now ``` func Now() Time ``` Now returns the current local time. ### func Parse ``` func Parse(layout, value string) (Time, error) ``` Parse parses a formatted string and returns the time value it represents. See the documentation for the constant called Layout to see how to represent the format. The second argument must be parseable using the format string (layout) provided as the first argument. The example for Time.Format demonstrates the working of the layout string in detail and is a good reference. When parsing (only), the input may contain a fractional second field immediately after the seconds field, even if the layout does not signify its presence. In that case either a comma or a decimal point followed by a maximal series of digits is parsed as a fractional second. Fractional seconds are truncated to nanosecond precision. Elements omitted from the layout are assumed to be zero or, when zero is impossible, one, so parsing "3:04pm" returns the time corresponding to Jan 1, year 0, 15:04:00 UTC (note that because the year is 0, this time is before the zero Time). Years must be in the range 0000..9999. The day of the week is checked for syntax but it is otherwise ignored. For layouts specifying the two-digit year 06, a value NN >= 69 will be treated as 19NN and a value NN < 69 will be treated as 20NN. The remainder of this comment describes the handling of time zones. In the absence of a time zone indicator, Parse returns a time in UTC. When parsing a time with a zone offset like -0700, if the offset corresponds to a time zone used by the current location (Local), then Parse uses that location and zone in the returned time. Otherwise it records the time as being in a fabricated location with time fixed at the given zone offset. When parsing a time with a zone abbreviation like MST, if the zone abbreviation has a defined offset in the current location, then that offset is used. The zone abbreviation "UTC" is recognized as UTC regardless of location. If the zone abbreviation is unknown, Parse records the time as being in a fabricated location with the given zone abbreviation and a zero offset. This choice means that such a time can be parsed and reformatted with the same layout losslessly, but the exact instant used in the representation will differ by the actual zone offset. To avoid such problems, prefer time layouts that use a numeric zone offset, or use ParseInLocation. 
#### Example Code: ``` // See the example for Time.Format for a thorough description of how // to define the layout string to parse a time.Time value; Parse and // Format use the same model to describe their input and output. // longForm shows by example how the reference time would be represented in // the desired layout. const longForm = "Jan 2, 2006 at 3:04pm (MST)" t, _ := time.Parse(longForm, "Feb 3, 2013 at 7:54pm (PST)") fmt.Println(t) // shortForm is another way the reference time would be represented // in the desired layout; it has no time zone present. // Note: without explicit zone, returns time in UTC. const shortForm = "2006-Jan-02" t, _ = time.Parse(shortForm, "2013-Feb-03") fmt.Println(t) // Some valid layouts are invalid time values, due to format specifiers // such as _ for space padding and Z for zone information. // For example the RFC3339 layout 2006-01-02T15:04:05Z07:00 // contains both Z and a time zone offset in order to handle both valid options: // 2006-01-02T15:04:05Z // 2006-01-02T15:04:05+07:00 t, _ = time.Parse(time.RFC3339, "2006-01-02T15:04:05Z") fmt.Println(t) t, _ = time.Parse(time.RFC3339, "2006-01-02T15:04:05+07:00") fmt.Println(t) _, err := time.Parse(time.RFC3339, time.RFC3339) fmt.Println("error", err) // Returns an error as the layout is not a valid time value ``` Output: ``` 2013-02-03 19:54:00 -0800 PST 2013-02-03 00:00:00 +0000 UTC 2006-01-02 15:04:05 +0000 UTC 2006-01-02 15:04:05 +0700 +0700 error parsing time "2006-01-02T15:04:05Z07:00": extra text: "07:00" ``` ### func ParseInLocation 1.1 ``` func ParseInLocation(layout, value string, loc *Location) (Time, error) ``` ParseInLocation is like Parse but differs in two important ways. First, in the absence of time zone information, Parse interprets a time as UTC; ParseInLocation interprets the time as in the given location. Second, when given a zone offset or abbreviation, Parse tries to match it against the Local location; ParseInLocation uses the given location. #### Example Code: ``` loc, _ := time.LoadLocation("Europe/Berlin") // This will look for the name CEST in the Europe/Berlin time zone. const longForm = "Jan 2, 2006 at 3:04pm (MST)" t, _ := time.ParseInLocation(longForm, "Jul 9, 2012 at 5:02am (CEST)", loc) fmt.Println(t) // Note: without explicit zone, returns time in given location. const shortForm = "2006-Jan-02" t, _ = time.ParseInLocation(shortForm, "2012-Jul-09", loc) fmt.Println(t) ``` Output: ``` 2012-07-09 05:02:00 +0200 CEST 2012-07-09 00:00:00 +0200 CEST ``` ### func Unix ``` func Unix(sec int64, nsec int64) Time ``` Unix returns the local Time corresponding to the given Unix time, sec seconds and nsec nanoseconds since January 1, 1970 UTC. It is valid to pass nsec outside the range [0, 999999999]. Not all sec values have a corresponding time value. One such value is 1<<63-1 (the largest int64 value). #### Example Code: ``` unixTime := time.Date(2009, time.November, 10, 23, 0, 0, 0, time.UTC) fmt.Println(unixTime.Unix()) t := time.Unix(unixTime.Unix(), 0).UTC() fmt.Println(t) ``` Output: ``` 1257894000 2009-11-10 23:00:00 +0000 UTC ``` ### func UnixMicro 1.17 ``` func UnixMicro(usec int64) Time ``` UnixMicro returns the local Time corresponding to the given Unix time, usec microseconds since January 1, 1970 UTC. 
#### Example Code: ``` umt := time.Date(2009, time.November, 10, 23, 0, 0, 0, time.UTC) fmt.Println(umt.UnixMicro()) t := time.UnixMicro(umt.UnixMicro()).UTC() fmt.Println(t) ``` Output: ``` 1257894000000000 2009-11-10 23:00:00 +0000 UTC ``` ### func UnixMilli 1.17 ``` func UnixMilli(msec int64) Time ``` UnixMilli returns the local Time corresponding to the given Unix time, msec milliseconds since January 1, 1970 UTC. #### Example Code: ``` umt := time.Date(2009, time.November, 10, 23, 0, 0, 0, time.UTC) fmt.Println(umt.UnixMilli()) t := time.UnixMilli(umt.UnixMilli()).UTC() fmt.Println(t) ``` Output: ``` 1257894000000 2009-11-10 23:00:00 +0000 UTC ``` ### func (Time) Add ``` func (t Time) Add(d Duration) Time ``` Add returns the time t+d. #### Example Code: ``` start := time.Date(2009, 1, 1, 12, 0, 0, 0, time.UTC) afterTenSeconds := start.Add(time.Second * 10) afterTenMinutes := start.Add(time.Minute * 10) afterTenHours := start.Add(time.Hour * 10) afterTenDays := start.Add(time.Hour * 24 * 10) fmt.Printf("start = %v\n", start) fmt.Printf("start.Add(time.Second * 10) = %v\n", afterTenSeconds) fmt.Printf("start.Add(time.Minute * 10) = %v\n", afterTenMinutes) fmt.Printf("start.Add(time.Hour * 10) = %v\n", afterTenHours) fmt.Printf("start.Add(time.Hour * 24 * 10) = %v\n", afterTenDays) ``` Output: ``` start = 2009-01-01 12:00:00 +0000 UTC start.Add(time.Second * 10) = 2009-01-01 12:00:10 +0000 UTC start.Add(time.Minute * 10) = 2009-01-01 12:10:00 +0000 UTC start.Add(time.Hour * 10) = 2009-01-01 22:00:00 +0000 UTC start.Add(time.Hour * 24 * 10) = 2009-01-11 12:00:00 +0000 UTC ``` ### func (Time) AddDate ``` func (t Time) AddDate(years int, months int, days int) Time ``` AddDate returns the time corresponding to adding the given number of years, months, and days to t. For example, AddDate(-1, 2, 3) applied to January 1, 2011 returns March 4, 2010. AddDate normalizes its result in the same way that Date does, so, for example, adding one month to October 31 yields December 1, the normalized form for November 31. #### Example Code: ``` start := time.Date(2009, 1, 1, 0, 0, 0, 0, time.UTC) oneDayLater := start.AddDate(0, 0, 1) oneMonthLater := start.AddDate(0, 1, 0) oneYearLater := start.AddDate(1, 0, 0) fmt.Printf("oneDayLater: start.AddDate(0, 0, 1) = %v\n", oneDayLater) fmt.Printf("oneMonthLater: start.AddDate(0, 1, 0) = %v\n", oneMonthLater) fmt.Printf("oneYearLater: start.AddDate(1, 0, 0) = %v\n", oneYearLater) ``` Output: ``` oneDayLater: start.AddDate(0, 0, 1) = 2009-01-02 00:00:00 +0000 UTC oneMonthLater: start.AddDate(0, 1, 0) = 2009-02-01 00:00:00 +0000 UTC oneYearLater: start.AddDate(1, 0, 0) = 2010-01-01 00:00:00 +0000 UTC ``` ### func (Time) After ``` func (t Time) After(u Time) bool ``` After reports whether the time instant t is after u. #### Example Code: ``` year2000 := time.Date(2000, 1, 1, 0, 0, 0, 0, time.UTC) year3000 := time.Date(3000, 1, 1, 0, 0, 0, 0, time.UTC) isYear3000AfterYear2000 := year3000.After(year2000) // True isYear2000AfterYear3000 := year2000.After(year3000) // False fmt.Printf("year3000.After(year2000) = %v\n", isYear3000AfterYear2000) fmt.Printf("year2000.After(year3000) = %v\n", isYear2000AfterYear3000) ``` Output: ``` year3000.After(year2000) = true year2000.After(year3000) = false ``` ### func (Time) AppendFormat 1.5 ``` func (t Time) AppendFormat(b []byte, layout string) []byte ``` AppendFormat is like Format but appends the textual representation to b and returns the extended buffer. 
#### Example Code: ``` t := time.Date(2017, time.November, 4, 11, 0, 0, 0, time.UTC) text := []byte("Time: ") text = t.AppendFormat(text, time.Kitchen) fmt.Println(string(text)) ``` Output: ``` Time: 11:00AM ``` ### func (Time) Before ``` func (t Time) Before(u Time) bool ``` Before reports whether the time instant t is before u. #### Example Code: ``` year2000 := time.Date(2000, 1, 1, 0, 0, 0, 0, time.UTC) year3000 := time.Date(3000, 1, 1, 0, 0, 0, 0, time.UTC) isYear2000BeforeYear3000 := year2000.Before(year3000) // True isYear3000BeforeYear2000 := year3000.Before(year2000) // False fmt.Printf("year2000.Before(year3000) = %v\n", isYear2000BeforeYear3000) fmt.Printf("year3000.Before(year2000) = %v\n", isYear3000BeforeYear2000) ``` Output: ``` year2000.Before(year3000) = true year3000.Before(year2000) = false ``` ### func (Time) Clock ``` func (t Time) Clock() (hour, min, sec int) ``` Clock returns the hour, minute, and second within the day specified by t. ### func (Time) Compare 1.20 ``` func (t Time) Compare(u Time) int ``` Compare compares the time instant t with u. If t is before u, it returns -1; if t is after u, it returns +1; if they're the same, it returns 0. ### func (Time) Date ``` func (t Time) Date() (year int, month Month, day int) ``` Date returns the year, month, and day in which t occurs. #### Example Code: ``` d := time.Date(2000, 2, 1, 12, 30, 0, 0, time.UTC) year, month, day := d.Date() fmt.Printf("year = %v\n", year) fmt.Printf("month = %v\n", month) fmt.Printf("day = %v\n", day) ``` Output: ``` year = 2000 month = February day = 1 ``` ### func (Time) Day ``` func (t Time) Day() int ``` Day returns the day of the month specified by t. #### Example Code: ``` d := time.Date(2000, 2, 1, 12, 30, 0, 0, time.UTC) day := d.Day() fmt.Printf("day = %v\n", day) ``` Output: ``` day = 1 ``` ### func (Time) Equal ``` func (t Time) Equal(u Time) bool ``` Equal reports whether t and u represent the same time instant. Two times can be equal even if they are in different locations. For example, 6:00 +0200 and 4:00 UTC are Equal. See the documentation on the Time type for the pitfalls of using == with Time values; most code should use Equal instead. #### Example Code: ``` secondsEastOfUTC := int((8 * time.Hour).Seconds()) beijing := time.FixedZone("Beijing Time", secondsEastOfUTC) // Unlike the equal operator, Equal is aware that d1 and d2 are the // same instant but in different time zones. d1 := time.Date(2000, 2, 1, 12, 30, 0, 0, time.UTC) d2 := time.Date(2000, 2, 1, 20, 30, 0, 0, beijing) datesEqualUsingEqualOperator := d1 == d2 datesEqualUsingFunction := d1.Equal(d2) fmt.Printf("datesEqualUsingEqualOperator = %v\n", datesEqualUsingEqualOperator) fmt.Printf("datesEqualUsingFunction = %v\n", datesEqualUsingFunction) ``` Output: ``` datesEqualUsingEqualOperator = false datesEqualUsingFunction = true ``` ### func (Time) Format ``` func (t Time) Format(layout string) string ``` Format returns a textual representation of the time value formatted according to the layout defined by the argument. See the documentation for the constant called Layout to see how to represent the layout format. The executable example for Time.Format demonstrates the working of the layout string in detail and is a good reference. #### Example Code: ``` // Parse a time value from a string in the standard Unix format. t, err := time.Parse(time.UnixDate, "Wed Feb 25 11:06:39 PST 2015") if err != nil { // Always check errors even if they should not happen. 
panic(err) } tz, err := time.LoadLocation("Asia/Shanghai") if err != nil { // Always check errors even if they should not happen. panic(err) } // time.Time's Stringer method is useful without any format. fmt.Println("default format:", t) // Predefined constants in the package implement common layouts. fmt.Println("Unix format:", t.Format(time.UnixDate)) // The time zone attached to the time value affects its output. fmt.Println("Same, in UTC:", t.UTC().Format(time.UnixDate)) fmt.Println("in Shanghai with seconds:", t.In(tz).Format("2006-01-02T15:04:05 -070000")) fmt.Println("in Shanghai with colon seconds:", t.In(tz).Format("2006-01-02T15:04:05 -07:00:00")) // The rest of this function demonstrates the properties of the // layout string used in the format. // The layout string used by the Parse function and Format method // shows by example how the reference time should be represented. // We stress that one must show how the reference time is formatted, // not a time of the user's choosing. Thus each layout string is a // representation of the time stamp, // Jan 2 15:04:05 2006 MST // An easy way to remember this value is that it holds, when presented // in this order, the values (lined up with the elements above): // 1 2 3 4 5 6 -7 // There are some wrinkles illustrated below. // Most uses of Format and Parse use constant layout strings such as // the ones defined in this package, but the interface is flexible, // as these examples show. // Define a helper function to make the examples' output look nice. do := func(name, layout, want string) { got := t.Format(layout) if want != got { fmt.Printf("error: for %q got %q; expected %q\n", layout, got, want) return } fmt.Printf("%-16s %q gives %q\n", name, layout, got) } // Print a header in our output. fmt.Printf("\nFormats:\n\n") // Simple starter examples. do("Basic full date", "Mon Jan 2 15:04:05 MST 2006", "Wed Feb 25 11:06:39 PST 2015") do("Basic short date", "2006/01/02", "2015/02/25") // The hour of the reference time is 15, or 3PM. The layout can express // it either way, and since our value is the morning we should see it as // an AM time. We show both in one format string. Lower case too. do("AM/PM", "3PM==3pm==15h", "11AM==11am==11h") // When parsing, if the seconds value is followed by a decimal point // and some digits, that is taken as a fraction of a second even if // the layout string does not represent the fractional second. // Here we add a fractional second to our time value used above. t, err = time.Parse(time.UnixDate, "Wed Feb 25 11:06:39.1234 PST 2015") if err != nil { panic(err) } // It does not appear in the output if the layout string does not contain // a representation of the fractional second. do("No fraction", time.UnixDate, "Wed Feb 25 11:06:39 PST 2015") // Fractional seconds can be printed by adding a run of 0s or 9s after // a decimal point in the seconds value in the layout string. // If the layout digits are 0s, the fractional second is of the specified // width. Note that the output has a trailing zero. do("0s for fraction", "15:04:05.00000", "11:06:39.12340") // If the fraction in the layout is 9s, trailing zeros are dropped. 
do("9s for fraction", "15:04:05.99999999", "11:06:39.1234") ``` Output: ``` default format: 2015-02-25 11:06:39 -0800 PST Unix format: Wed Feb 25 11:06:39 PST 2015 Same, in UTC: Wed Feb 25 19:06:39 UTC 2015 in Shanghai with seconds: 2015-02-26T03:06:39 +080000 in Shanghai with colon seconds: 2015-02-26T03:06:39 +08:00:00 Formats: Basic full date "Mon Jan 2 15:04:05 MST 2006" gives "Wed Feb 25 11:06:39 PST 2015" Basic short date "2006/01/02" gives "2015/02/25" AM/PM "3PM==3pm==15h" gives "11AM==11am==11h" No fraction "Mon Jan _2 15:04:05 MST 2006" gives "Wed Feb 25 11:06:39 PST 2015" 0s for fraction "15:04:05.00000" gives "11:06:39.12340" 9s for fraction "15:04:05.99999999" gives "11:06:39.1234" ``` #### Example (Pad) Code: ``` // Parse a time value from a string in the standard Unix format. t, err := time.Parse(time.UnixDate, "Sat Mar 7 11:06:39 PST 2015") if err != nil { // Always check errors even if they should not happen. panic(err) } // Define a helper function to make the examples' output look nice. do := func(name, layout, want string) { got := t.Format(layout) if want != got { fmt.Printf("error: for %q got %q; expected %q\n", layout, got, want) return } fmt.Printf("%-16s %q gives %q\n", name, layout, got) } // The predefined constant Unix uses an underscore to pad the day. do("Unix", time.UnixDate, "Sat Mar 7 11:06:39 PST 2015") // For fixed-width printing of values, such as the date, that may be one or // two characters (7 vs. 07), use an _ instead of a space in the layout string. // Here we print just the day, which is 2 in our layout string and 7 in our // value. do("No pad", "<2>", "<7>") // An underscore represents a space pad, if the date only has one digit. do("Spaces", "<_2>", "< 7>") // A "0" indicates zero padding for single-digit values. do("Zeros", "<02>", "<07>") // If the value is already the right width, padding is not used. // For instance, the second (05 in the reference time) in our value is 39, // so it doesn't need padding, but the minutes (04, 06) does. do("Suppressed pad", "04:05", "06:39") ``` Output: ``` Unix "Mon Jan _2 15:04:05 MST 2006" gives "Sat Mar 7 11:06:39 PST 2015" No pad "<2>" gives "<7>" Spaces "<_2>" gives "< 7>" Zeros "<02>" gives "<07>" Suppressed pad "04:05" gives "06:39" ``` ### func (Time) GoString 1.17 ``` func (t Time) GoString() string ``` GoString implements fmt.GoStringer and formats t to be printed in Go source code. #### Example Code: ``` t := time.Date(2009, time.November, 10, 23, 0, 0, 0, time.UTC) fmt.Println(t.GoString()) t = t.Add(1 * time.Minute) fmt.Println(t.GoString()) t = t.AddDate(0, 1, 0) fmt.Println(t.GoString()) t, _ = time.Parse("Jan 2, 2006 at 3:04pm (MST)", "Feb 3, 2013 at 7:54pm (UTC)") fmt.Println(t.GoString()) ``` Output: ``` time.Date(2009, time.November, 10, 23, 0, 0, 0, time.UTC) time.Date(2009, time.November, 10, 23, 1, 0, 0, time.UTC) time.Date(2009, time.December, 10, 23, 1, 0, 0, time.UTC) time.Date(2013, time.February, 3, 19, 54, 0, 0, time.UTC) ``` ### func (\*Time) GobDecode ``` func (t *Time) GobDecode(data []byte) error ``` GobDecode implements the gob.GobDecoder interface. ### func (Time) GobEncode ``` func (t Time) GobEncode() ([]byte, error) ``` GobEncode implements the gob.GobEncoder interface. ### func (Time) Hour ``` func (t Time) Hour() int ``` Hour returns the hour within the day specified by t, in the range [0, 23]. ### func (Time) ISOWeek ``` func (t Time) ISOWeek() (year, week int) ``` ISOWeek returns the ISO 8601 year and week number in which t occurs. Week ranges from 1 to 53. 
Jan 01 to Jan 03 of year n might belong to week 52 or 53 of year n-1, and Dec 29 to Dec 31 might belong to week 1 of year n+1. ### func (Time) In ``` func (t Time) In(loc *Location) Time ``` In returns a copy of t representing the same time instant, but with the copy's location information set to loc for display purposes. In panics if loc is nil. ### func (Time) IsDST 1.17 ``` func (t Time) IsDST() bool ``` IsDST reports whether the time in the configured location is in Daylight Savings Time. ### func (Time) IsZero ``` func (t Time) IsZero() bool ``` IsZero reports whether t represents the zero time instant, January 1, year 1, 00:00:00 UTC. ### func (Time) Local ``` func (t Time) Local() Time ``` Local returns t with the location set to local time. ### func (Time) Location ``` func (t Time) Location() *Location ``` Location returns the time zone information associated with t. ### func (Time) MarshalBinary 1.2 ``` func (t Time) MarshalBinary() ([]byte, error) ``` MarshalBinary implements the encoding.BinaryMarshaler interface. ### func (Time) MarshalJSON ``` func (t Time) MarshalJSON() ([]byte, error) ``` MarshalJSON implements the json.Marshaler interface. The time is a quoted string in the RFC 3339 format with sub-second precision. If the timestamp cannot be represented as valid RFC 3339 (e.g., the year is out of range), then an error is reported. ### func (Time) MarshalText 1.2 ``` func (t Time) MarshalText() ([]byte, error) ``` MarshalText implements the encoding.TextMarshaler interface. The time is formatted in RFC 3339 format with sub-second precision. If the timestamp cannot be represented as valid RFC 3339 (e.g., the year is out of range), then an error is reported. ### func (Time) Minute ``` func (t Time) Minute() int ``` Minute returns the minute offset within the hour specified by t, in the range [0, 59]. ### func (Time) Month ``` func (t Time) Month() Month ``` Month returns the month of the year specified by t. ### func (Time) Nanosecond ``` func (t Time) Nanosecond() int ``` Nanosecond returns the nanosecond offset within the second specified by t, in the range [0, 999999999]. ### func (Time) Round 1.1 ``` func (t Time) Round(d Duration) Time ``` Round returns the result of rounding t to the nearest multiple of d (since the zero time). The rounding behavior for halfway values is to round up. If d <= 0, Round returns t stripped of any monotonic clock reading but otherwise unchanged. Round operates on the time as an absolute duration since the zero time; it does not operate on the presentation form of the time. Thus, Round(Hour) may return a time with a non-zero minute, depending on the time's Location. #### Example Code: ``` t := time.Date(0, 0, 0, 12, 15, 30, 918273645, time.UTC) round := []time.Duration{ time.Nanosecond, time.Microsecond, time.Millisecond, time.Second, 2 * time.Second, time.Minute, 10 * time.Minute, time.Hour, } for _, d := range round { fmt.Printf("t.Round(%6s) = %s\n", d, t.Round(d).Format("15:04:05.999999999")) } ``` Output: ``` t.Round( 1ns) = 12:15:30.918273645 t.Round( 1µs) = 12:15:30.918274 t.Round( 1ms) = 12:15:30.918 t.Round( 1s) = 12:15:31 t.Round( 2s) = 12:15:30 t.Round( 1m0s) = 12:16:00 t.Round( 10m0s) = 12:20:00 t.Round(1h0m0s) = 12:00:00 ``` ### func (Time) Second ``` func (t Time) Second() int ``` Second returns the second offset within the minute specified by t, in the range [0, 59]. 
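The calendar and clock accessors above (Month, Hour, Minute, Second, Nanosecond, and so on) all read components of the same instant. A minimal sketch, using an arbitrary date chosen purely for illustration and assuming the usual fmt and time imports:

```
t := time.Date(2015, time.February, 25, 11, 6, 39, 123456789, time.UTC)
fmt.Println(t.Month())      // February
fmt.Println(t.Hour())       // 11
fmt.Println(t.Minute())     // 6
fmt.Println(t.Second())     // 39
fmt.Println(t.Nanosecond()) // 123456789
```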
### func (Time) String ``` func (t Time) String() string ``` String returns the time formatted using the format string ``` "2006-01-02 15:04:05.999999999 -0700 MST" ``` If the time has a monotonic clock reading, the returned string includes a final field "m=±<value>", where value is the monotonic clock reading formatted as a decimal number of seconds. The returned string is meant for debugging; for a stable serialized representation, use t.MarshalText, t.MarshalBinary, or t.Format with an explicit format string. #### Example Code: ``` timeWithNanoseconds := time.Date(2000, 2, 1, 12, 13, 14, 15, time.UTC) withNanoseconds := timeWithNanoseconds.String() timeWithoutNanoseconds := time.Date(2000, 2, 1, 12, 13, 14, 0, time.UTC) withoutNanoseconds := timeWithoutNanoseconds.String() fmt.Printf("withNanoseconds = %v\n", string(withNanoseconds)) fmt.Printf("withoutNanoseconds = %v\n", string(withoutNanoseconds)) ``` Output: ``` withNanoseconds = 2000-02-01 12:13:14.000000015 +0000 UTC withoutNanoseconds = 2000-02-01 12:13:14 +0000 UTC ``` ### func (Time) Sub ``` func (t Time) Sub(u Time) Duration ``` Sub returns the duration t-u. If the result exceeds the maximum (or minimum) value that can be stored in a Duration, the maximum (or minimum) duration will be returned. To compute t-d for a duration d, use t.Add(-d). #### Example Code: ``` start := time.Date(2000, 1, 1, 0, 0, 0, 0, time.UTC) end := time.Date(2000, 1, 1, 12, 0, 0, 0, time.UTC) difference := end.Sub(start) fmt.Printf("difference = %v\n", difference) ``` Output: ``` difference = 12h0m0s ``` ### func (Time) Truncate 1.1 ``` func (t Time) Truncate(d Duration) Time ``` Truncate returns the result of rounding t down to a multiple of d (since the zero time). If d <= 0, Truncate returns t stripped of any monotonic clock reading but otherwise unchanged. Truncate operates on the time as an absolute duration since the zero time; it does not operate on the presentation form of the time. Thus, Truncate(Hour) may return a time with a non-zero minute, depending on the time's Location. #### Example Code: ``` t, _ := time.Parse("2006 Jan 02 15:04:05", "2012 Dec 07 12:15:30.918273645") trunc := []time.Duration{ time.Nanosecond, time.Microsecond, time.Millisecond, time.Second, 2 * time.Second, time.Minute, 10 * time.Minute, } for _, d := range trunc { fmt.Printf("t.Truncate(%5s) = %s\n", d, t.Truncate(d).Format("15:04:05.999999999")) } // To round to the last midnight in the local timezone, create a new Date. midnight := time.Date(t.Year(), t.Month(), t.Day(), 0, 0, 0, 0, time.Local) _ = midnight ``` Output: ``` t.Truncate( 1ns) = 12:15:30.918273645 t.Truncate( 1µs) = 12:15:30.918273 t.Truncate( 1ms) = 12:15:30.918 t.Truncate( 1s) = 12:15:30 t.Truncate( 2s) = 12:15:30 t.Truncate( 1m0s) = 12:15:00 t.Truncate(10m0s) = 12:10:00 ``` ### func (Time) UTC ``` func (t Time) UTC() Time ``` UTC returns t with the location set to UTC. ### func (Time) Unix ``` func (t Time) Unix() int64 ``` Unix returns t as a Unix time, the number of seconds elapsed since January 1, 1970 UTC. The result does not depend on the location associated with t. Unix-like operating systems often record time as a 32-bit count of seconds, but since the method here returns a 64-bit value it is valid for billions of years into the past or future. #### Example Code: ``` // 1 billion seconds of Unix, three ways. 
fmt.Println(time.Unix(1e9, 0).UTC()) // 1e9 seconds fmt.Println(time.Unix(0, 1e18).UTC()) // 1e18 nanoseconds fmt.Println(time.Unix(2e9, -1e18).UTC()) // 2e9 seconds - 1e18 nanoseconds t := time.Date(2001, time.September, 9, 1, 46, 40, 0, time.UTC) fmt.Println(t.Unix()) // seconds since 1970 fmt.Println(t.UnixNano()) // nanoseconds since 1970 ``` Output: ``` 2001-09-09 01:46:40 +0000 UTC 2001-09-09 01:46:40 +0000 UTC 2001-09-09 01:46:40 +0000 UTC 1000000000 1000000000000000000 ``` ### func (Time) UnixMicro 1.17 ``` func (t Time) UnixMicro() int64 ``` UnixMicro returns t as a Unix time, the number of microseconds elapsed since January 1, 1970 UTC. The result is undefined if the Unix time in microseconds cannot be represented by an int64 (a date before year -290307 or after year 294246). The result does not depend on the location associated with t. ### func (Time) UnixMilli 1.17 ``` func (t Time) UnixMilli() int64 ``` UnixMilli returns t as a Unix time, the number of milliseconds elapsed since January 1, 1970 UTC. The result is undefined if the Unix time in milliseconds cannot be represented by an int64 (a date more than 292 million years before or after 1970). The result does not depend on the location associated with t. ### func (Time) UnixNano ``` func (t Time) UnixNano() int64 ``` UnixNano returns t as a Unix time, the number of nanoseconds elapsed since January 1, 1970 UTC. The result is undefined if the Unix time in nanoseconds cannot be represented by an int64 (a date before the year 1678 or after 2262). Note that this means the result of calling UnixNano on the zero Time is undefined. The result does not depend on the location associated with t. ### func (\*Time) UnmarshalBinary 1.2 ``` func (t *Time) UnmarshalBinary(data []byte) error ``` UnmarshalBinary implements the encoding.BinaryUnmarshaler interface. ### func (\*Time) UnmarshalJSON ``` func (t *Time) UnmarshalJSON(data []byte) error ``` UnmarshalJSON implements the json.Unmarshaler interface. The time must be a quoted string in the RFC 3339 format. ### func (\*Time) UnmarshalText 1.2 ``` func (t *Time) UnmarshalText(data []byte) error ``` UnmarshalText implements the encoding.TextUnmarshaler interface. The time must be in the RFC 3339 format. ### func (Time) Weekday ``` func (t Time) Weekday() Weekday ``` Weekday returns the day of the week specified by t. ### func (Time) Year ``` func (t Time) Year() int ``` Year returns the year in which t occurs. ### func (Time) YearDay 1.1 ``` func (t Time) YearDay() int ``` YearDay returns the day of the year specified by t, in the range [1,365] for non-leap years, and [1,366] in leap years. ### func (Time) Zone ``` func (t Time) Zone() (name string, offset int) ``` Zone computes the time zone in effect at time t, returning the abbreviated name of the zone (such as "CET") and its offset in seconds east of UTC. ### func (Time) ZoneBounds 1.19 ``` func (t Time) ZoneBounds() (start, end Time) ``` ZoneBounds returns the bounds of the time zone in effect at time t. The zone begins at start and the next zone begins at end. If the zone begins at the beginning of time, start will be returned as a zero Time. If the zone goes on forever, end will be returned as a zero Time. The Location of the returned times will be the same as t. type Timer ---------- The Timer type represents a single event. When the Timer expires, the current time will be sent on C, unless the Timer was created by AfterFunc. A Timer must be created with NewTimer or AfterFunc. 
``` type Timer struct { C <-chan Time // contains filtered or unexported fields } ``` ### func AfterFunc ``` func AfterFunc(d Duration, f func()) *Timer ``` AfterFunc waits for the duration to elapse and then calls f in its own goroutine. It returns a Timer that can be used to cancel the call using its Stop method. ### func NewTimer ``` func NewTimer(d Duration) *Timer ``` NewTimer creates a new Timer that will send the current time on its channel after at least duration d. ### func (\*Timer) Reset 1.1 ``` func (t *Timer) Reset(d Duration) bool ``` Reset changes the timer to expire after duration d. It returns true if the timer had been active, false if the timer had expired or been stopped. For a Timer created with NewTimer, Reset should be invoked only on stopped or expired timers with drained channels. If a program has already received a value from t.C, the timer is known to have expired and the channel drained, so t.Reset can be used directly. If a program has not yet received a value from t.C, however, the timer must be stopped and—if Stop reports that the timer expired before being stopped—the channel explicitly drained: ``` if !t.Stop() { <-t.C } t.Reset(d) ``` This should not be done concurrent to other receives from the Timer's channel. Note that it is not possible to use Reset's return value correctly, as there is a race condition between draining the channel and the new timer expiring. Reset should always be invoked on stopped or expired channels, as described above. The return value exists to preserve compatibility with existing programs. For a Timer created with AfterFunc(d, f), Reset either reschedules when f will run, in which case Reset returns true, or schedules f to run again, in which case it returns false. When Reset returns false, Reset neither waits for the prior f to complete before returning nor does it guarantee that the subsequent goroutine running f does not run concurrently with the prior one. If the caller needs to know whether the prior execution of f is completed, it must coordinate with f explicitly. ### func (\*Timer) Stop ``` func (t *Timer) Stop() bool ``` Stop prevents the Timer from firing. It returns true if the call stops the timer, false if the timer has already expired or been stopped. Stop does not close the channel, to prevent a read from the channel succeeding incorrectly. To ensure the channel is empty after a call to Stop, check the return value and drain the channel. For example, assuming the program has not received from t.C already: ``` if !t.Stop() { <-t.C } ``` This cannot be done concurrent to other receives from the Timer's channel or other calls to the Timer's Stop method. For a timer created with AfterFunc(d, f), if t.Stop returns false, then the timer has already expired and the function f has been started in its own goroutine; Stop does not wait for f to complete before returning. If the caller needs to know whether f is completed, it must coordinate with f explicitly. type Weekday ------------ A Weekday specifies a day of the week (Sunday = 0, ...). ``` type Weekday int ``` ``` const ( Sunday Weekday = iota Monday Tuesday Wednesday Thursday Friday Saturday ) ``` ### func (Weekday) String ``` func (d Weekday) String() string ``` String returns the English name of the day ("Sunday", "Monday", ...). Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [tzdata](tzdata/index) | Package tzdata provides an embedded copy of the timezone database. |
go Package tzdata Package tzdata =============== * `import "time/tzdata"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package tzdata provides an embedded copy of the timezone database. If this package is imported anywhere in the program, then if the time package cannot find tzdata files on the system, it will use this embedded information. Importing this package will increase the size of a program by about 450 KB. This package should normally be imported by a program's main package, not by a library. Libraries normally shouldn't decide whether to include the timezone database in a program. This package will be automatically imported if you build with -tags timetzdata. Index ----- ### Package files tzdata.go zipdata.go go Package testing Package testing ================ * `import "testing"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package testing provides support for automated testing of Go packages. It is intended to be used in concert with the "go test" command, which automates execution of any function of the form ``` func TestXxx(*testing.T) ``` where Xxx does not start with a lowercase letter. The function name serves to identify the test routine. Within these functions, use the Error, Fail or related methods to signal failure. To write a new test suite, create a file that contains the TestXxx functions as described here, and give that file a name ending in "\_test.go". The file will be excluded from regular package builds but will be included when the "go test" command is run. The test file can be in the same package as the one being tested, or in a corresponding package with the suffix "\_test". If the test file is in the same package, it may refer to unexported identifiers within the package, as in this example: ``` package abs import "testing" func TestAbs(t *testing.T) { got := Abs(-1) if got != 1 { t.Errorf("Abs(-1) = %d; want 1", got) } } ``` If the file is in a separate "\_test" package, the package being tested must be imported explicitly and only its exported identifiers may be used. This is known as "black box" testing. ``` package abs_test import ( "testing" "path_to_pkg/abs" ) func TestAbs(t *testing.T) { got := abs.Abs(-1) if got != 1 { t.Errorf("Abs(-1) = %d; want 1", got) } } ``` For more detail, run "go help test" and "go help testflag". ### Benchmarks Functions of the form ``` func BenchmarkXxx(*testing.B) ``` are considered benchmarks, and are executed by the "go test" command when its -bench flag is provided. Benchmarks are run sequentially. For a description of the testing flags, see <https://golang.org/cmd/go/#hdr-Testing_flags>. A sample benchmark function looks like this: ``` func BenchmarkRandInt(b *testing.B) { for i := 0; i < b.N; i++ { rand.Int() } } ``` The benchmark function must run the target code b.N times. During benchmark execution, b.N is adjusted until the benchmark function lasts long enough to be timed reliably. The output ``` BenchmarkRandInt-8 68453040 17.8 ns/op ``` means that the loop ran 68453040 times at a speed of 17.8 ns per loop. 
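(For scale, that is roughly 68,453,040 iterations × 17.8 ns/op ≈ 1.2 s of measured loop time, consistent with the default -benchtime of 1s.)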
If a benchmark needs some expensive setup before running, the timer may be reset: ``` func BenchmarkBigLen(b *testing.B) { big := NewBig() b.ResetTimer() for i := 0; i < b.N; i++ { big.Len() } } ``` If a benchmark needs to test performance in a parallel setting, it may use the RunParallel helper function; such benchmarks are intended to be used with the go test -cpu flag: ``` func BenchmarkTemplateParallel(b *testing.B) { templ := template.Must(template.New("test").Parse("Hello, {{.}}!")) b.RunParallel(func(pb *testing.PB) { var buf bytes.Buffer for pb.Next() { buf.Reset() templ.Execute(&buf, "World") } }) } ``` A detailed specification of the benchmark results format is given in <https://golang.org/design/14313-benchmark-format>. There are standard tools for working with benchmark results at <https://golang.org/x/perf/cmd>. In particular, <https://golang.org/x/perf/cmd/benchstat> performs statistically robust A/B comparisons. ### Examples The package also runs and verifies example code. Example functions may include a concluding line comment that begins with "Output:" and is compared with the standard output of the function when the tests are run. (The comparison ignores leading and trailing space.) These are examples of an example: ``` func ExampleHello() { fmt.Println("hello") // Output: hello } func ExampleSalutations() { fmt.Println("hello, and") fmt.Println("goodbye") // Output: // hello, and // goodbye } ``` The comment prefix "Unordered output:" is like "Output:", but matches any line order: ``` func ExamplePerm() { for _, value := range Perm(5) { fmt.Println(value) } // Unordered output: 4 // 2 // 1 // 3 // 0 } ``` Example functions without output comments are compiled but not executed. The naming convention to declare examples for the package, a function F, a type T and method M on type T are: ``` func Example() { ... } func ExampleF() { ... } func ExampleT() { ... } func ExampleT_M() { ... } ``` Multiple example functions for a package/type/function/method may be provided by appending a distinct suffix to the name. The suffix must start with a lower-case letter. ``` func Example_suffix() { ... } func ExampleF_suffix() { ... } func ExampleT_suffix() { ... } func ExampleT_M_suffix() { ... } ``` The entire test file is presented as the example when it contains a single example function, at least one other function, type, variable, or constant declaration, and no test or benchmark functions. ### Fuzzing 'go test' and the testing package support fuzzing, a testing technique where a function is called with randomly generated inputs to find bugs not anticipated by unit tests. Functions of the form ``` func FuzzXxx(*testing.F) ``` are considered fuzz tests. For example: ``` func FuzzHex(f *testing.F) { for _, seed := range [][]byte{{}, {0}, {9}, {0xa}, {0xf}, {1, 2, 3, 4}} { f.Add(seed) } f.Fuzz(func(t *testing.T, in []byte) { enc := hex.EncodeToString(in) out, err := hex.DecodeString(enc) if err != nil { t.Fatalf("%v: decode: %v", in, err) } if !bytes.Equal(in, out) { t.Fatalf("%v: not equal after round trip: %v", in, out) } }) } ``` A fuzz test maintains a seed corpus, or a set of inputs which are run by default, and can seed input generation. Seed inputs may be registered by calling (\*F).Add or by storing files in the directory testdata/fuzz/<Name> (where <Name> is the name of the fuzz test) within the package containing the fuzz test. 
Seed inputs are optional, but the fuzzing engine may find bugs more efficiently when provided with a set of small seed inputs with good code coverage. These seed inputs can also serve as regression tests for bugs identified through fuzzing. The function passed to (\*F).Fuzz within the fuzz test is considered the fuzz target. A fuzz target must accept a \*T parameter, followed by one or more parameters for random inputs. The types of arguments passed to (\*F).Add must be identical to the types of these parameters. The fuzz target may signal that it's found a problem the same way tests do: by calling T.Fail (or any method that calls it like T.Error or T.Fatal) or by panicking. When fuzzing is enabled (by setting the -fuzz flag to a regular expression that matches a specific fuzz test), the fuzz target is called with arguments generated by repeatedly making random changes to the seed inputs. On supported platforms, 'go test' compiles the test executable with fuzzing coverage instrumentation. The fuzzing engine uses that instrumentation to find and cache inputs that expand coverage, increasing the likelihood of finding bugs. If the fuzz target fails for a given input, the fuzzing engine writes the inputs that caused the failure to a file in the directory testdata/fuzz/<Name> within the package directory. This file later serves as a seed input. If the file can't be written at that location (for example, because the directory is read-only), the fuzzing engine writes the file to the fuzz cache directory within the build cache instead. When fuzzing is disabled, the fuzz target is called with the seed inputs registered with F.Add and seed inputs from testdata/fuzz/<Name>. In this mode, the fuzz test acts much like a regular test, with subtests started with F.Fuzz instead of T.Run. See <https://go.dev/doc/fuzz> for documentation about fuzzing. ### Skipping Tests or benchmarks may be skipped at run time with a call to the Skip method of \*T or \*B: ``` func TestTimeConsuming(t *testing.T) { if testing.Short() { t.Skip("skipping test in short mode.") } ... } ``` The Skip method of \*T can be used in a fuzz target if the input is invalid, but should not be considered a failing input. For example: ``` func FuzzJSONMarshaling(f *testing.F) { f.Fuzz(func(t *testing.T, b []byte) { var v interface{} if err := json.Unmarshal(b, &v); err != nil { t.Skip() } if _, err := json.Marshal(v); err != nil { t.Errorf("Marshal: %v", err) } }) } ``` ### Subtests and Sub-benchmarks The Run methods of T and B allow defining subtests and sub-benchmarks, without having to define separate functions for each. This enables uses like table-driven benchmarks and creating hierarchical tests. It also provides a way to share common setup and tear-down code: ``` func TestFoo(t *testing.T) { // <setup code> t.Run("A=1", func(t *testing.T) { ... }) t.Run("A=2", func(t *testing.T) { ... }) t.Run("B=1", func(t *testing.T) { ... }) // <tear-down code> } ``` Each subtest and sub-benchmark has a unique name: the combination of the name of the top-level test and the sequence of names passed to Run, separated by slashes, with an optional trailing sequence number for disambiguation. The argument to the -run, -bench, and -fuzz command-line flags is an unanchored regular expression that matches the test's name. For tests with multiple slash-separated elements, such as subtests, the argument is itself slash-separated, with expressions matching each name element in turn. Because it is unanchored, an empty expression matches any string. 
For example, using "matching" to mean "whose name contains": ``` go test -run '' # Run all tests. go test -run Foo # Run top-level tests matching "Foo", such as "TestFooBar". go test -run Foo/A= # For top-level tests matching "Foo", run subtests matching "A=". go test -run /A=1 # For all top-level tests, run subtests matching "A=1". go test -fuzz FuzzFoo # Fuzz the target matching "FuzzFoo" ``` The -run argument can also be used to run a specific value in the seed corpus, for debugging. For example: ``` go test -run=FuzzFoo/9ddb952d9814 ``` The -fuzz and -run flags can both be set, in order to fuzz a target but skip the execution of all other tests. Subtests can also be used to control parallelism. A parent test will only complete once all of its subtests complete. In this example, all tests are run in parallel with each other, and only with each other, regardless of other top-level tests that may be defined: ``` func TestGroupedParallel(t *testing.T) { for _, tc := range tests { tc := tc // capture range variable t.Run(tc.Name, func(t *testing.T) { t.Parallel() ... }) } } ``` Run does not return until parallel subtests have completed, providing a way to clean up after a group of parallel tests: ``` func TestTeardownParallel(t *testing.T) { // This Run will not return until the parallel tests finish. t.Run("group", func(t *testing.T) { t.Run("Test1", parallelTest1) t.Run("Test2", parallelTest2) t.Run("Test3", parallelTest3) }) // <tear-down code> } ``` ### Main It is sometimes necessary for a test or benchmark program to do extra setup or teardown before or after it executes. It is also sometimes necessary to control which code runs on the main thread. To support these and other cases, if a test file contains a function: ``` func TestMain(m *testing.M) ``` then the generated test will call TestMain(m) instead of running the tests or benchmarks directly. TestMain runs in the main goroutine and can do whatever setup and teardown is necessary around a call to m.Run. m.Run will return an exit code that may be passed to os.Exit. If TestMain returns, the test wrapper will pass the result of m.Run to os.Exit itself. When TestMain is called, flag.Parse has not been run. If TestMain depends on command-line flags, including those of the testing package, it should call flag.Parse explicitly. Command line flags are always parsed by the time test or benchmark functions run. A simple implementation of TestMain is: ``` func TestMain(m *testing.M) { // call flag.Parse() here if TestMain uses flags os.Exit(m.Run()) } ``` TestMain is a low-level primitive and should not be necessary for casual testing needs, where ordinary test functions suffice. 
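A minimal sketch of the setup/teardown use described above; setup and teardown here are hypothetical helpers, not part of the testing package:

```
func TestMain(m *testing.M) {
	setup()         // hypothetical process-wide setup
	code := m.Run() // run the tests and benchmarks
	teardown()      // hypothetical process-wide teardown
	os.Exit(code)
}
```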
Index ----- * [func AllocsPerRun(runs int, f func()) (avg float64)](#AllocsPerRun) * [func CoverMode() string](#CoverMode) * [func Coverage() float64](#Coverage) * [func Init()](#Init) * [func Main(matchString func(pat, str string) (bool, error), tests []InternalTest, benchmarks []InternalBenchmark, examples []InternalExample)](#Main) * [func RegisterCover(c Cover)](#RegisterCover) * [func RunBenchmarks(matchString func(pat, str string) (bool, error), benchmarks []InternalBenchmark)](#RunBenchmarks) * [func RunExamples(matchString func(pat, str string) (bool, error), examples []InternalExample) (ok bool)](#RunExamples) * [func RunTests(matchString func(pat, str string) (bool, error), tests []InternalTest) (ok bool)](#RunTests) * [func Short() bool](#Short) * [func Verbose() bool](#Verbose) * [type B](#B) * [func (c \*B) Cleanup(f func())](#B.Cleanup) * [func (b \*B) Elapsed() time.Duration](#B.Elapsed) * [func (c \*B) Error(args ...any)](#B.Error) * [func (c \*B) Errorf(format string, args ...any)](#B.Errorf) * [func (c \*B) Fail()](#B.Fail) * [func (c \*B) FailNow()](#B.FailNow) * [func (c \*B) Failed() bool](#B.Failed) * [func (c \*B) Fatal(args ...any)](#B.Fatal) * [func (c \*B) Fatalf(format string, args ...any)](#B.Fatalf) * [func (c \*B) Helper()](#B.Helper) * [func (c \*B) Log(args ...any)](#B.Log) * [func (c \*B) Logf(format string, args ...any)](#B.Logf) * [func (c \*B) Name() string](#B.Name) * [func (b \*B) ReportAllocs()](#B.ReportAllocs) * [func (b \*B) ReportMetric(n float64, unit string)](#B.ReportMetric) * [func (b \*B) ResetTimer()](#B.ResetTimer) * [func (b \*B) Run(name string, f func(b \*B)) bool](#B.Run) * [func (b \*B) RunParallel(body func(\*PB))](#B.RunParallel) * [func (b \*B) SetBytes(n int64)](#B.SetBytes) * [func (b \*B) SetParallelism(p int)](#B.SetParallelism) * [func (c \*B) Setenv(key, value string)](#B.Setenv) * [func (c \*B) Skip(args ...any)](#B.Skip) * [func (c \*B) SkipNow()](#B.SkipNow) * [func (c \*B) Skipf(format string, args ...any)](#B.Skipf) * [func (c \*B) Skipped() bool](#B.Skipped) * [func (b \*B) StartTimer()](#B.StartTimer) * [func (b \*B) StopTimer()](#B.StopTimer) * [func (c \*B) TempDir() string](#B.TempDir) * [type BenchmarkResult](#BenchmarkResult) * [func Benchmark(f func(b \*B)) BenchmarkResult](#Benchmark) * [func (r BenchmarkResult) AllocedBytesPerOp() int64](#BenchmarkResult.AllocedBytesPerOp) * [func (r BenchmarkResult) AllocsPerOp() int64](#BenchmarkResult.AllocsPerOp) * [func (r BenchmarkResult) MemString() string](#BenchmarkResult.MemString) * [func (r BenchmarkResult) NsPerOp() int64](#BenchmarkResult.NsPerOp) * [func (r BenchmarkResult) String() string](#BenchmarkResult.String) * [type Cover](#Cover) * [type CoverBlock](#CoverBlock) * [type F](#F) * [func (f \*F) Add(args ...any)](#F.Add) * [func (c \*F) Cleanup(f func())](#F.Cleanup) * [func (c \*F) Error(args ...any)](#F.Error) * [func (c \*F) Errorf(format string, args ...any)](#F.Errorf) * [func (f \*F) Fail()](#F.Fail) * [func (c \*F) FailNow()](#F.FailNow) * [func (c \*F) Failed() bool](#F.Failed) * [func (c \*F) Fatal(args ...any)](#F.Fatal) * [func (c \*F) Fatalf(format string, args ...any)](#F.Fatalf) * [func (f \*F) Fuzz(ff any)](#F.Fuzz) * [func (f \*F) Helper()](#F.Helper) * [func (c \*F) Log(args ...any)](#F.Log) * [func (c \*F) Logf(format string, args ...any)](#F.Logf) * [func (c \*F) Name() string](#F.Name) * [func (c \*F) Setenv(key, value string)](#F.Setenv) * [func (c \*F) Skip(args ...any)](#F.Skip) * [func (c \*F) SkipNow()](#F.SkipNow) * [func (c \*F) 
Skipf(format string, args ...any)](#F.Skipf) * [func (f \*F) Skipped() bool](#F.Skipped) * [func (c \*F) TempDir() string](#F.TempDir) * [type InternalBenchmark](#InternalBenchmark) * [type InternalExample](#InternalExample) * [type InternalFuzzTarget](#InternalFuzzTarget) * [type InternalTest](#InternalTest) * [type M](#M) * [func MainStart(deps testDeps, tests []InternalTest, benchmarks []InternalBenchmark, fuzzTargets []InternalFuzzTarget, examples []InternalExample) \*M](#MainStart) * [func (m \*M) Run() (code int)](#M.Run) * [type PB](#PB) * [func (pb \*PB) Next() bool](#PB.Next) * [type T](#T) * [func (c \*T) Cleanup(f func())](#T.Cleanup) * [func (t \*T) Deadline() (deadline time.Time, ok bool)](#T.Deadline) * [func (c \*T) Error(args ...any)](#T.Error) * [func (c \*T) Errorf(format string, args ...any)](#T.Errorf) * [func (c \*T) Fail()](#T.Fail) * [func (c \*T) FailNow()](#T.FailNow) * [func (c \*T) Failed() bool](#T.Failed) * [func (c \*T) Fatal(args ...any)](#T.Fatal) * [func (c \*T) Fatalf(format string, args ...any)](#T.Fatalf) * [func (c \*T) Helper()](#T.Helper) * [func (c \*T) Log(args ...any)](#T.Log) * [func (c \*T) Logf(format string, args ...any)](#T.Logf) * [func (c \*T) Name() string](#T.Name) * [func (t \*T) Parallel()](#T.Parallel) * [func (t \*T) Run(name string, f func(t \*T)) bool](#T.Run) * [func (t \*T) Setenv(key, value string)](#T.Setenv) * [func (c \*T) Skip(args ...any)](#T.Skip) * [func (c \*T) SkipNow()](#T.SkipNow) * [func (c \*T) Skipf(format string, args ...any)](#T.Skipf) * [func (c \*T) Skipped() bool](#T.Skipped) * [func (c \*T) TempDir() string](#T.TempDir) * [type TB](#TB) ### Examples [B.ReportMetric](#example_B_ReportMetric) [B.ReportMetric (Parallel)](#example_B_ReportMetric_parallel) [B.RunParallel](#example_B_RunParallel) ### Package files allocs.go benchmark.go cover.go example.go fuzz.go match.go newcover.go run\_example.go testing.go testing\_other.go func AllocsPerRun 1.1 --------------------- ``` func AllocsPerRun(runs int, f func()) (avg float64) ``` AllocsPerRun returns the average number of allocations during calls to f. Although the return value has type float64, it will always be an integral value. To compute the number of allocations, the function will first be run once as a warm-up. The average number of allocations over the specified number of runs will then be measured and returned. AllocsPerRun sets GOMAXPROCS to 1 during its measurement and will restore it before returning. func CoverMode 1.8 ------------------ ``` func CoverMode() string ``` CoverMode reports what the test coverage mode is set to. The values are "set", "count", or "atomic". The return value will be empty if test coverage is not enabled. func Coverage 1.4 ----------------- ``` func Coverage() float64 ``` Coverage reports the current code coverage as a fraction in the range [0, 1]. If coverage is not enabled, Coverage returns 0. When running a large set of sequential test cases, checking Coverage after each one can be useful for identifying which test cases exercise new code paths. It is not a replacement for the reports generated by 'go test -cover' and 'go tool cover'. func Init 1.13 -------------- ``` func Init() ``` Init registers testing flags. These flags are automatically registered by the "go test" command before running test functions, so Init is only needed when calling functions such as Benchmark without using "go test". Init has no effect if it was already called. 
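A minimal sketch of the situation Init exists for: calling Benchmark from an ordinary program rather than from "go test". The benchmarked body is arbitrary, and the fmt and testing imports are assumed:

```
func main() {
	testing.Init()
	result := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			_ = fmt.Sprint(i)
		}
	})
	fmt.Println(result)
}
```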
func Main --------- ``` func Main(matchString func(pat, str string) (bool, error), tests []InternalTest, benchmarks []InternalBenchmark, examples []InternalExample) ``` Main is an internal function, part of the implementation of the "go test" command. It was exported because it is cross-package and predates "internal" packages. It is no longer used by "go test" but preserved, as much as possible, for other systems that simulate "go test" using Main, but Main sometimes cannot be updated as new functionality is added to the testing package. Systems simulating "go test" should be updated to use MainStart. func RegisterCover 1.2 ---------------------- ``` func RegisterCover(c Cover) ``` RegisterCover records the coverage data accumulators for the tests. NOTE: This function is internal to the testing infrastructure and may change. It is not covered (yet) by the Go 1 compatibility guidelines. func RunBenchmarks ------------------ ``` func RunBenchmarks(matchString func(pat, str string) (bool, error), benchmarks []InternalBenchmark) ``` RunBenchmarks is an internal function but exported because it is cross-package; it is part of the implementation of the "go test" command. func RunExamples ---------------- ``` func RunExamples(matchString func(pat, str string) (bool, error), examples []InternalExample) (ok bool) ``` RunExamples is an internal function but exported because it is cross-package; it is part of the implementation of the "go test" command. func RunTests ------------- ``` func RunTests(matchString func(pat, str string) (bool, error), tests []InternalTest) (ok bool) ``` RunTests is an internal function but exported because it is cross-package; it is part of the implementation of the "go test" command. func Short ---------- ``` func Short() bool ``` Short reports whether the -test.short flag is set. func Verbose 1.1 ---------------- ``` func Verbose() bool ``` Verbose reports whether the -test.v flag is set. type B ------ B is a type passed to Benchmark functions to manage benchmark timing and to specify the number of iterations to run. A benchmark ends when its Benchmark function returns or calls any of the methods FailNow, Fatal, Fatalf, SkipNow, Skip, or Skipf. Those methods must be called only from the goroutine running the Benchmark function. The other reporting methods, such as the variations of Log and Error, may be called simultaneously from multiple goroutines. Like in tests, benchmark logs are accumulated during execution and dumped to standard output when done. Unlike in tests, benchmark logs are always printed, so as not to hide output whose existence may be affecting benchmark results. ``` type B struct { N int // contains filtered or unexported fields } ``` ### func (\*B) Cleanup 1.14 ``` func (c *B) Cleanup(f func()) ``` Cleanup registers a function to be called when the test (or subtest) and all its subtests complete. Cleanup functions will be called in last added, first called order. ### func (\*B) Elapsed 1.20 ``` func (b *B) Elapsed() time.Duration ``` Elapsed returns the measured elapsed time of the benchmark. The duration reported by Elapsed matches the one measured by StartTimer, StopTimer, and ResetTimer. ### func (\*B) Error ``` func (c *B) Error(args ...any) ``` Error is equivalent to Log followed by Fail. ### func (\*B) Errorf ``` func (c *B) Errorf(format string, args ...any) ``` Errorf is equivalent to Logf followed by Fail. ### func (\*B) Fail ``` func (c *B) Fail() ``` Fail marks the function as having failed but continues execution. 
### func (\*B) FailNow ``` func (c *B) FailNow() ``` FailNow marks the function as having failed and stops its execution by calling runtime.Goexit (which then runs all deferred calls in the current goroutine). Execution will continue at the next test or benchmark. FailNow must be called from the goroutine running the test or benchmark function, not from other goroutines created during the test. Calling FailNow does not stop those other goroutines. ### func (\*B) Failed ``` func (c *B) Failed() bool ``` Failed reports whether the function has failed. ### func (\*B) Fatal ``` func (c *B) Fatal(args ...any) ``` Fatal is equivalent to Log followed by FailNow. ### func (\*B) Fatalf ``` func (c *B) Fatalf(format string, args ...any) ``` Fatalf is equivalent to Logf followed by FailNow. ### func (\*B) Helper 1.9 ``` func (c *B) Helper() ``` Helper marks the calling function as a test helper function. When printing file and line information, that function will be skipped. Helper may be called simultaneously from multiple goroutines. ### func (\*B) Log ``` func (c *B) Log(args ...any) ``` Log formats its arguments using default formatting, analogous to Println, and records the text in the error log. For tests, the text will be printed only if the test fails or the -test.v flag is set. For benchmarks, the text is always printed to avoid having performance depend on the value of the -test.v flag. ### func (\*B) Logf ``` func (c *B) Logf(format string, args ...any) ``` Logf formats its arguments according to the format, analogous to Printf, and records the text in the error log. A final newline is added if not provided. For tests, the text will be printed only if the test fails or the -test.v flag is set. For benchmarks, the text is always printed to avoid having performance depend on the value of the -test.v flag. ### func (\*B) Name 1.8 ``` func (c *B) Name() string ``` Name returns the name of the running (sub-) test or benchmark. The name will include the name of the test along with the names of any nested sub-tests. If two sibling sub-tests have the same name, Name will append a suffix to guarantee the returned name is unique. ### func (\*B) ReportAllocs 1.1 ``` func (b *B) ReportAllocs() ``` ReportAllocs enables malloc statistics for this benchmark. It is equivalent to setting -test.benchmem, but it only affects the benchmark function that calls ReportAllocs. ### func (\*B) ReportMetric 1.13 ``` func (b *B) ReportMetric(n float64, unit string) ``` ReportMetric adds "n unit" to the reported benchmark results. If the metric is per-iteration, the caller should divide by b.N, and by convention units should end in "/op". ReportMetric overrides any previously reported value for the same unit. ReportMetric panics if unit is the empty string or if unit contains any whitespace. If unit is a unit normally reported by the benchmark framework itself (such as "allocs/op"), ReportMetric will override that metric. Setting "ns/op" to 0 will suppress that built-in metric. #### Example Code: ``` // This reports a custom benchmark metric relevant to a // specific algorithm (in this case, sorting). testing.Benchmark(func(b *testing.B) { var compares int64 for i := 0; i < b.N; i++ { s := []int{5, 4, 3, 2, 1} sort.Slice(s, func(i, j int) bool { compares++ return s[i] < s[j] }) } // This metric is per-operation, so divide by b.N and // report it as a "/op" unit. b.ReportMetric(float64(compares)/float64(b.N), "compares/op") // This metric is per-time, so divide by b.Elapsed and // report it as a "/ns" unit. 
b.ReportMetric(float64(compares)/float64(b.Elapsed().Nanoseconds()), "compares/ns") }) ``` #### Example (Parallel) Code: ``` // This reports a custom benchmark metric relevant to a // specific algorithm (in this case, sorting) in parallel. testing.Benchmark(func(b *testing.B) { var compares atomic.Int64 b.RunParallel(func(pb *testing.PB) { for pb.Next() { s := []int{5, 4, 3, 2, 1} sort.Slice(s, func(i, j int) bool { // Because RunParallel runs the function many // times in parallel, we must increment the // counter atomically to avoid racing writes. compares.Add(1) return s[i] < s[j] }) } }) // NOTE: Report each metric once, after all of the parallel // calls have completed. // This metric is per-operation, so divide by b.N and // report it as a "/op" unit. b.ReportMetric(float64(compares.Load())/float64(b.N), "compares/op") // This metric is per-time, so divide by b.Elapsed and // report it as a "/ns" unit. b.ReportMetric(float64(compares.Load())/float64(b.Elapsed().Nanoseconds()), "compares/ns") }) ``` ### func (\*B) ResetTimer ``` func (b *B) ResetTimer() ``` ResetTimer zeroes the elapsed benchmark time and memory allocation counters and deletes user-reported metrics. It does not affect whether the timer is running. ### func (\*B) Run 1.7 ``` func (b *B) Run(name string, f func(b *B)) bool ``` Run benchmarks f as a subbenchmark with the given name. It reports whether there were any failures. A subbenchmark is like any other benchmark. A benchmark that calls Run at least once will not be measured itself and will be called once with N=1. ### func (\*B) RunParallel 1.3 ``` func (b *B) RunParallel(body func(*PB)) ``` RunParallel runs a benchmark in parallel. It creates multiple goroutines and distributes b.N iterations among them. The number of goroutines defaults to GOMAXPROCS. To increase parallelism for non-CPU-bound benchmarks, call SetParallelism before RunParallel. RunParallel is usually used with the go test -cpu flag. The body function will be run in each goroutine. It should set up any goroutine-local state and then iterate until pb.Next returns false. It should not use the StartTimer, StopTimer, or ResetTimer functions, because they have global effect. It should also not call Run. RunParallel reports ns/op values as wall time for the benchmark as a whole, not the sum of wall time or CPU time over each parallel goroutine. #### Example Code: ``` // Parallel benchmark for text/template.Template.Execute on a single object. testing.Benchmark(func(b *testing.B) { templ := template.Must(template.New("test").Parse("Hello, {{.}}!")) // RunParallel will create GOMAXPROCS goroutines // and distribute work among them. b.RunParallel(func(pb *testing.PB) { // Each goroutine has its own bytes.Buffer. var buf bytes.Buffer for pb.Next() { // The loop body is executed b.N times total across all goroutines. buf.Reset() templ.Execute(&buf, "World") } }) }) ``` ### func (\*B) SetBytes ``` func (b *B) SetBytes(n int64) ``` SetBytes records the number of bytes processed in a single operation. If this is called, the benchmark will report ns/op and MB/s. ### func (\*B) SetParallelism 1.3 ``` func (b *B) SetParallelism(p int) ``` SetParallelism sets the number of goroutines used by RunParallel to p\*GOMAXPROCS. There is usually no need to call SetParallelism for CPU-bound benchmarks. If p is less than 1, this call will have no effect. 
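SetBytes, documented above, is what enables the MB/s column in benchmark output. A minimal sketch that measures a plain byte copy (the 64 KiB buffer size is an arbitrary choice):

```
func BenchmarkCopy(b *testing.B) {
	src := make([]byte, 64<<10) // 64 KiB processed per operation
	dst := make([]byte, len(src))
	b.SetBytes(int64(len(src)))
	for i := 0; i < b.N; i++ {
		copy(dst, src)
	}
}
```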
### func (\*B) Setenv 1.17 ``` func (c *B) Setenv(key, value string) ``` Setenv calls os.Setenv(key, value) and uses Cleanup to restore the environment variable to its original value after the test. Because Setenv affects the whole process, it cannot be used in parallel tests or tests with parallel ancestors. ### func (\*B) Skip 1.1 ``` func (c *B) Skip(args ...any) ``` Skip is equivalent to Log followed by SkipNow. ### func (\*B) SkipNow 1.1 ``` func (c *B) SkipNow() ``` SkipNow marks the test as having been skipped and stops its execution by calling runtime.Goexit. If a test fails (see Error, Errorf, Fail) and is then skipped, it is still considered to have failed. Execution will continue at the next test or benchmark. See also FailNow. SkipNow must be called from the goroutine running the test, not from other goroutines created during the test. Calling SkipNow does not stop those other goroutines. ### func (\*B) Skipf 1.1 ``` func (c *B) Skipf(format string, args ...any) ``` Skipf is equivalent to Logf followed by SkipNow. ### func (\*B) Skipped 1.1 ``` func (c *B) Skipped() bool ``` Skipped reports whether the test was skipped. ### func (\*B) StartTimer ``` func (b *B) StartTimer() ``` StartTimer starts timing a test. This function is called automatically before a benchmark starts, but it can also be used to resume timing after a call to StopTimer. ### func (\*B) StopTimer ``` func (b *B) StopTimer() ``` StopTimer stops timing a test. This can be used to pause the timer while performing complex initialization that you don't want to measure. ### func (\*B) TempDir 1.15 ``` func (c *B) TempDir() string ``` TempDir returns a temporary directory for the test to use. The directory is automatically removed by Cleanup when the test and all its subtests complete. Each subsequent call to t.TempDir returns a unique directory; if the directory creation fails, TempDir terminates the test by calling Fatal. type BenchmarkResult -------------------- BenchmarkResult contains the results of a benchmark run. ``` type BenchmarkResult struct { N int // The number of iterations. T time.Duration // The total time taken. Bytes int64 // Bytes processed in one iteration. MemAllocs uint64 // The total number of memory allocations; added in Go 1.1 MemBytes uint64 // The total number of bytes allocated; added in Go 1.1 // Extra records additional metrics reported by ReportMetric. Extra map[string]float64 // Go 1.13 } ``` ### func Benchmark ``` func Benchmark(f func(b *B)) BenchmarkResult ``` Benchmark benchmarks a single function. It is useful for creating custom benchmarks that do not use the "go test" command. If f depends on testing flags, then Init must be used to register those flags before calling Benchmark and before calling flag.Parse. If f calls Run, the result will be an estimate of running all its subbenchmarks that don't call Run in sequence in a single benchmark. ### func (BenchmarkResult) AllocedBytesPerOp 1.1 ``` func (r BenchmarkResult) AllocedBytesPerOp() int64 ``` AllocedBytesPerOp returns the "B/op" metric, which is calculated as r.MemBytes / r.N. ### func (BenchmarkResult) AllocsPerOp 1.1 ``` func (r BenchmarkResult) AllocsPerOp() int64 ``` AllocsPerOp returns the "allocs/op" metric, which is calculated as r.MemAllocs / r.N. ### func (BenchmarkResult) MemString 1.1 ``` func (r BenchmarkResult) MemString() string ``` MemString returns r.AllocedBytesPerOp and r.AllocsPerOp in the same format as 'go test'. 
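StartTimer and StopTimer, documented above, let per-iteration setup be excluded from the measurement. A minimal sketch; makeInput and process are hypothetical stand-ins for expensive setup and for the code being measured:

```
func BenchmarkProcess(b *testing.B) {
	for i := 0; i < b.N; i++ {
		b.StopTimer()
		in := makeInput() // hypothetical setup, excluded from timing
		b.StartTimer()
		process(in) // hypothetical function being measured
	}
}
```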
### func (BenchmarkResult) NsPerOp ``` func (r BenchmarkResult) NsPerOp() int64 ``` NsPerOp returns the "ns/op" metric. ### func (BenchmarkResult) String ``` func (r BenchmarkResult) String() string ``` String returns a summary of the benchmark results. It follows the benchmark result line format from <https://golang.org/design/14313-benchmark-format>, not including the benchmark name. Extra metrics override built-in metrics of the same name. String does not include allocs/op or B/op, since those are reported by MemString. type Cover 1.2 -------------- Cover records information about test coverage checking. NOTE: This struct is internal to the testing infrastructure and may change. It is not covered (yet) by the Go 1 compatibility guidelines. ``` type Cover struct { Mode string Counters map[string][]uint32 Blocks map[string][]CoverBlock CoveredPackages string } ``` type CoverBlock 1.2 ------------------- CoverBlock records the coverage data for a single basic block. The fields are 1-indexed, as in an editor: The opening line of the file is number 1, for example. Columns are measured in bytes. NOTE: This struct is internal to the testing infrastructure and may change. It is not covered (yet) by the Go 1 compatibility guidelines. ``` type CoverBlock struct { Line0 uint32 // Line number for block start. Col0 uint16 // Column number for block start. Line1 uint32 // Line number for block end. Col1 uint16 // Column number for block end. Stmts uint16 // Number of statements included in this block. } ``` type F 1.18 ----------- F is a type passed to fuzz tests. Fuzz tests run generated inputs against a provided fuzz target, which can find and report potential bugs in the code being tested. A fuzz test runs the seed corpus by default, which includes entries provided by (\*F).Add and entries in the testdata/fuzz/<FuzzTestName> directory. After any necessary setup and calls to (\*F).Add, the fuzz test must then call (\*F).Fuzz to provide the fuzz target. See the testing package documentation for an example, and see the F.Fuzz and F.Add method documentation for details. \*F methods can only be called before (\*F).Fuzz. Once the test is executing the fuzz target, only (\*T) methods can be used. The only \*F methods that are allowed in the (\*F).Fuzz function are (\*F).Failed and (\*F).Name. ``` type F struct { // contains filtered or unexported fields } ``` ### func (\*F) Add 1.18 ``` func (f *F) Add(args ...any) ``` Add will add the arguments to the seed corpus for the fuzz test. This will be a no-op if called after or within the fuzz target, and args must match the arguments for the fuzz target. ### func (\*F) Cleanup 1.18 ``` func (c *F) Cleanup(f func()) ``` Cleanup registers a function to be called when the test (or subtest) and all its subtests complete. Cleanup functions will be called in last added, first called order. ### func (\*F) Error 1.18 ``` func (c *F) Error(args ...any) ``` Error is equivalent to Log followed by Fail. ### func (\*F) Errorf 1.18 ``` func (c *F) Errorf(format string, args ...any) ``` Errorf is equivalent to Logf followed by Fail. ### func (\*F) Fail 1.18 ``` func (f *F) Fail() ``` Fail marks the function as having failed but continues execution. ### func (\*F) FailNow 1.18 ``` func (c *F) FailNow() ``` FailNow marks the function as having failed and stops its execution by calling runtime.Goexit (which then runs all deferred calls in the current goroutine). Execution will continue at the next test or benchmark. 
FailNow must be called from the goroutine running the test or benchmark function, not from other goroutines created during the test. Calling FailNow does not stop those other goroutines. ### func (\*F) Failed 1.18 ``` func (c *F) Failed() bool ``` Failed reports whether the function has failed. ### func (\*F) Fatal 1.18 ``` func (c *F) Fatal(args ...any) ``` Fatal is equivalent to Log followed by FailNow. ### func (\*F) Fatalf 1.18 ``` func (c *F) Fatalf(format string, args ...any) ``` Fatalf is equivalent to Logf followed by FailNow. ### func (\*F) Fuzz 1.18 ``` func (f *F) Fuzz(ff any) ``` Fuzz runs the fuzz function, ff, for fuzz testing. If ff fails for a set of arguments, those arguments will be added to the seed corpus. ff must be a function with no return value whose first argument is \*T and whose remaining arguments are the types to be fuzzed. For example: ``` f.Fuzz(func(t *testing.T, b []byte, i int) { ... }) ``` The following types are allowed: []byte, string, bool, byte, rune, float32, float64, int, int8, int16, int32, int64, uint, uint8, uint16, uint32, uint64. More types may be supported in the future. ff must not call any \*F methods, e.g. (\*F).Log, (\*F).Error, (\*F).Skip. Use the corresponding \*T method instead. The only \*F methods that are allowed in the (\*F).Fuzz function are (\*F).Failed and (\*F).Name. This function should be fast and deterministic, and its behavior should not depend on shared state. No mutatable input arguments, or pointers to them, should be retained between executions of the fuzz function, as the memory backing them may be mutated during a subsequent invocation. ff must not modify the underlying data of the arguments provided by the fuzzing engine. When fuzzing, F.Fuzz does not return until a problem is found, time runs out (set with -fuzztime), or the test process is interrupted by a signal. F.Fuzz should be called exactly once, unless F.Skip or F.Fail is called beforehand. ### func (\*F) Helper 1.18 ``` func (f *F) Helper() ``` Helper marks the calling function as a test helper function. When printing file and line information, that function will be skipped. Helper may be called simultaneously from multiple goroutines. ### func (\*F) Log 1.18 ``` func (c *F) Log(args ...any) ``` Log formats its arguments using default formatting, analogous to Println, and records the text in the error log. For tests, the text will be printed only if the test fails or the -test.v flag is set. For benchmarks, the text is always printed to avoid having performance depend on the value of the -test.v flag. ### func (\*F) Logf 1.18 ``` func (c *F) Logf(format string, args ...any) ``` Logf formats its arguments according to the format, analogous to Printf, and records the text in the error log. A final newline is added if not provided. For tests, the text will be printed only if the test fails or the -test.v flag is set. For benchmarks, the text is always printed to avoid having performance depend on the value of the -test.v flag. ### func (\*F) Name 1.18 ``` func (c *F) Name() string ``` Name returns the name of the running (sub-) test or benchmark. The name will include the name of the test along with the names of any nested sub-tests. If two sibling sub-tests have the same name, Name will append a suffix to guarantee the returned name is unique. ### func (\*F) Setenv 1.18 ``` func (c *F) Setenv(key, value string) ``` Setenv calls os.Setenv(key, value) and uses Cleanup to restore the environment variable to its original value after the test. 
Because Setenv affects the whole process, it cannot be used in parallel tests or tests with parallel ancestors. ### func (\*F) Skip 1.18 ``` func (c *F) Skip(args ...any) ``` Skip is equivalent to Log followed by SkipNow. ### func (\*F) SkipNow 1.18 ``` func (c *F) SkipNow() ``` SkipNow marks the test as having been skipped and stops its execution by calling runtime.Goexit. If a test fails (see Error, Errorf, Fail) and is then skipped, it is still considered to have failed. Execution will continue at the next test or benchmark. See also FailNow. SkipNow must be called from the goroutine running the test, not from other goroutines created during the test. Calling SkipNow does not stop those other goroutines. ### func (\*F) Skipf 1.18 ``` func (c *F) Skipf(format string, args ...any) ``` Skipf is equivalent to Logf followed by SkipNow. ### func (\*F) Skipped 1.18 ``` func (f *F) Skipped() bool ``` Skipped reports whether the test was skipped. ### func (\*F) TempDir 1.18 ``` func (c *F) TempDir() string ``` TempDir returns a temporary directory for the test to use. The directory is automatically removed by Cleanup when the test and all its subtests complete. Each subsequent call to t.TempDir returns a unique directory; if the directory creation fails, TempDir terminates the test by calling Fatal. type InternalBenchmark ---------------------- InternalBenchmark is an internal type but exported because it is cross-package; it is part of the implementation of the "go test" command. ``` type InternalBenchmark struct { Name string F func(b *B) } ``` type InternalExample -------------------- ``` type InternalExample struct { Name string F func() Output string Unordered bool // Go 1.7 } ``` type InternalFuzzTarget 1.18 ---------------------------- InternalFuzzTarget is an internal type but exported because it is cross-package; it is part of the implementation of the "go test" command. ``` type InternalFuzzTarget struct { Name string Fn func(f *F) } ``` type InternalTest ----------------- InternalTest is an internal type but exported because it is cross-package; it is part of the implementation of the "go test" command. ``` type InternalTest struct { Name string F func(*T) } ``` type M 1.4 ---------- M is a type passed to a TestMain function to run the actual tests. ``` type M struct { // contains filtered or unexported fields } ``` ### func MainStart 1.4 ``` func MainStart(deps testDeps, tests []InternalTest, benchmarks []InternalBenchmark, fuzzTargets []InternalFuzzTarget, examples []InternalExample) *M ``` MainStart is meant for use by tests generated by 'go test'. It is not meant to be called directly and is not subject to the Go 1 compatibility document. It may change signature from release to release. ### func (\*M) Run 1.4 ``` func (m *M) Run() (code int) ``` Run runs the tests. It returns an exit code to pass to os.Exit. type PB 1.3 ----------- A PB is used by RunParallel for running parallel benchmarks. ``` type PB struct { // contains filtered or unexported fields } ``` ### func (\*PB) Next 1.3 ``` func (pb *PB) Next() bool ``` Next reports whether there are more iterations to execute. type T ------ T is a type passed to Test functions to manage test state and support formatted test logs. A test ends when its Test function returns or calls any of the methods FailNow, Fatal, Fatalf, SkipNow, Skip, or Skipf. Those methods, as well as the Parallel method, must be called only from the goroutine running the Test function. 
The other reporting methods, such as the variations of Log and Error, may be called simultaneously from multiple goroutines. ``` type T struct { // contains filtered or unexported fields } ``` ### func (\*T) Cleanup 1.14 ``` func (c *T) Cleanup(f func()) ``` Cleanup registers a function to be called when the test (or subtest) and all its subtests complete. Cleanup functions will be called in last added, first called order. ### func (\*T) Deadline 1.15 ``` func (t *T) Deadline() (deadline time.Time, ok bool) ``` Deadline reports the time at which the test binary will have exceeded the timeout specified by the -timeout flag. The ok result is false if the -timeout flag indicates “no timeout” (0). ### func (\*T) Error ``` func (c *T) Error(args ...any) ``` Error is equivalent to Log followed by Fail. ### func (\*T) Errorf ``` func (c *T) Errorf(format string, args ...any) ``` Errorf is equivalent to Logf followed by Fail. ### func (\*T) Fail ``` func (c *T) Fail() ``` Fail marks the function as having failed but continues execution. ### func (\*T) FailNow ``` func (c *T) FailNow() ``` FailNow marks the function as having failed and stops its execution by calling runtime.Goexit (which then runs all deferred calls in the current goroutine). Execution will continue at the next test or benchmark. FailNow must be called from the goroutine running the test or benchmark function, not from other goroutines created during the test. Calling FailNow does not stop those other goroutines. ### func (\*T) Failed ``` func (c *T) Failed() bool ``` Failed reports whether the function has failed. ### func (\*T) Fatal ``` func (c *T) Fatal(args ...any) ``` Fatal is equivalent to Log followed by FailNow. ### func (\*T) Fatalf ``` func (c *T) Fatalf(format string, args ...any) ``` Fatalf is equivalent to Logf followed by FailNow. ### func (\*T) Helper 1.9 ``` func (c *T) Helper() ``` Helper marks the calling function as a test helper function. When printing file and line information, that function will be skipped. Helper may be called simultaneously from multiple goroutines. ### func (\*T) Log ``` func (c *T) Log(args ...any) ``` Log formats its arguments using default formatting, analogous to Println, and records the text in the error log. For tests, the text will be printed only if the test fails or the -test.v flag is set. For benchmarks, the text is always printed to avoid having performance depend on the value of the -test.v flag. ### func (\*T) Logf ``` func (c *T) Logf(format string, args ...any) ``` Logf formats its arguments according to the format, analogous to Printf, and records the text in the error log. A final newline is added if not provided. For tests, the text will be printed only if the test fails or the -test.v flag is set. For benchmarks, the text is always printed to avoid having performance depend on the value of the -test.v flag. ### func (\*T) Name 1.8 ``` func (c *T) Name() string ``` Name returns the name of the running (sub-) test or benchmark. The name will include the name of the test along with the names of any nested sub-tests. If two sibling sub-tests have the same name, Name will append a suffix to guarantee the returned name is unique. ### func (\*T) Parallel ``` func (t *T) Parallel() ``` Parallel signals that this test is to be run in parallel with (and only with) other parallel tests. When a test is run multiple times due to use of -test.count or -test.cpu, multiple instances of a single test never run in parallel with each other. 
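The package documentation above does not reproduce an example for Parallel, so the following is only a minimal sketch; the test name, case names, and failure message are invented for illustration. Each subtest calls t.Parallel, so the enclosing Run call returns immediately and the subtests then execute alongside one another.

```
package example_test

import "testing"

// TestParallelGroups is a hypothetical test showing how Parallel is usually
// combined with Run: every subtest marks itself parallel, so Run returns at
// once and the subtests run concurrently with each other.
func TestParallelGroups(t *testing.T) {
	cases := []string{"alpha", "beta", "gamma"}
	for _, name := range cases {
		name := name // capture the loop variable for the parallel closure
		t.Run(name, func(t *testing.T) {
			t.Parallel()
			if name == "" {
				t.Fatal("empty case name")
			}
		})
	}
}
```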
### func (\*T) Run 1.7 ``` func (t *T) Run(name string, f func(t *T)) bool ``` Run runs f as a subtest of t called name. It runs f in a separate goroutine and blocks until f returns or calls t.Parallel to become a parallel test. Run reports whether f succeeded (or at least did not fail before calling t.Parallel). Run may be called simultaneously from multiple goroutines, but all such calls must return before the outer test function for t returns. ### func (\*T) Setenv 1.17 ``` func (t *T) Setenv(key, value string) ``` Setenv calls os.Setenv(key, value) and uses Cleanup to restore the environment variable to its original value after the test. Because Setenv affects the whole process, it cannot be used in parallel tests or tests with parallel ancestors. ### func (\*T) Skip 1.1 ``` func (c *T) Skip(args ...any) ``` Skip is equivalent to Log followed by SkipNow. ### func (\*T) SkipNow 1.1 ``` func (c *T) SkipNow() ``` SkipNow marks the test as having been skipped and stops its execution by calling runtime.Goexit. If a test fails (see Error, Errorf, Fail) and is then skipped, it is still considered to have failed. Execution will continue at the next test or benchmark. See also FailNow. SkipNow must be called from the goroutine running the test, not from other goroutines created during the test. Calling SkipNow does not stop those other goroutines. ### func (\*T) Skipf 1.1 ``` func (c *T) Skipf(format string, args ...any) ``` Skipf is equivalent to Logf followed by SkipNow. ### func (\*T) Skipped 1.1 ``` func (c *T) Skipped() bool ``` Skipped reports whether the test was skipped. ### func (\*T) TempDir 1.15 ``` func (c *T) TempDir() string ``` TempDir returns a temporary directory for the test to use. The directory is automatically removed by Cleanup when the test and all its subtests complete. Each subsequent call to t.TempDir returns a unique directory; if the directory creation fails, TempDir terminates the test by calling Fatal. type TB 1.2 ----------- TB is the interface common to T, B, and F. ``` type TB interface { Cleanup(func()) Error(args ...any) Errorf(format string, args ...any) Fail() FailNow() Failed() bool Fatal(args ...any) Fatalf(format string, args ...any) Helper() Log(args ...any) Logf(format string, args ...any) Name() string Setenv(key, value string) Skip(args ...any) SkipNow() Skipf(format string, args ...any) Skipped() bool TempDir() string // contains filtered or unexported methods } ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [fstest](fstest/index) | Package fstest implements support for testing implementations and users of file systems. | | [iotest](iotest/index) | Package iotest implements Readers and Writers useful mainly for testing. | | [quick](quick/index) | Package quick implements utility functions to help with black box testing. |
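Because TB is the interface common to T, B, and F, a helper written against TB can be shared by tests, benchmarks, and fuzz targets. The sketch below is only an illustration: the helper name, file name, and contents are invented, but Helper, TempDir, and Fatalf are all part of the TB interface documented above.

```
package example_test

import (
	"os"
	"path/filepath"
	"testing"
)

// writeTempFile is a hypothetical helper. Accepting testing.TB lets the same
// helper be called with a *T, *B, or *F value.
func writeTempFile(tb testing.TB, contents string) string {
	tb.Helper()
	path := filepath.Join(tb.TempDir(), "data.txt")
	if err := os.WriteFile(path, []byte(contents), 0o600); err != nil {
		tb.Fatalf("writing temp file: %v", err)
	}
	return path
}

func TestWriteTempFile(t *testing.T) {
	path := writeTempFile(t, "hello")
	if _, err := os.Stat(path); err != nil {
		t.Errorf("stat %s: %v", path, err)
	}
}
```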
go Package fstest Package fstest =============== * `import "testing/fstest"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package fstest implements support for testing implementations and users of file systems. Index ----- * [func TestFS(fsys fs.FS, expected ...string) error](#TestFS) * [type MapFS](#MapFS) * [func (fsys MapFS) Glob(pattern string) ([]string, error)](#MapFS.Glob) * [func (fsys MapFS) Open(name string) (fs.File, error)](#MapFS.Open) * [func (fsys MapFS) ReadDir(name string) ([]fs.DirEntry, error)](#MapFS.ReadDir) * [func (fsys MapFS) ReadFile(name string) ([]byte, error)](#MapFS.ReadFile) * [func (fsys MapFS) Stat(name string) (fs.FileInfo, error)](#MapFS.Stat) * [func (fsys MapFS) Sub(dir string) (fs.FS, error)](#MapFS.Sub) * [type MapFile](#MapFile) ### Package files mapfs.go testfs.go func TestFS 1.16 ---------------- ``` func TestFS(fsys fs.FS, expected ...string) error ``` TestFS tests a file system implementation. It walks the entire tree of files in fsys, opening and checking that each file behaves correctly. It also checks that the file system contains at least the expected files. As a special case, if no expected files are listed, fsys must be empty. Otherwise, fsys must contain at least the listed files; it can also contain others. The contents of fsys must not change concurrently with TestFS. If TestFS finds any misbehaviors, it returns an error reporting all of them. The error text spans multiple lines, one per detected misbehavior. Typical usage inside a test is: ``` if err := fstest.TestFS(myFS, "file/that/should/be/present"); err != nil { t.Fatal(err) } ``` type MapFS 1.16 --------------- A MapFS is a simple in-memory file system for use in tests, represented as a map from path names (arguments to Open) to information about the files or directories they represent. The map need not include parent directories for files contained in the map; those will be synthesized if needed. But a directory can still be included by setting the MapFile.Mode's ModeDir bit; this may be necessary for detailed control over the directory's FileInfo or to create an empty directory. File system operations read directly from the map, so that the file system can be changed by editing the map as needed. An implication is that file system operations must not run concurrently with changes to the map, which would be a race. Another implication is that opening or reading a directory requires iterating over the entire map, so a MapFS should typically be used with not more than a few hundred entries or directory reads. ``` type MapFS map[string]*MapFile ``` ### func (MapFS) Glob 1.16 ``` func (fsys MapFS) Glob(pattern string) ([]string, error) ``` ### func (MapFS) Open 1.16 ``` func (fsys MapFS) Open(name string) (fs.File, error) ``` Open opens the named file. ### func (MapFS) ReadDir 1.16 ``` func (fsys MapFS) ReadDir(name string) ([]fs.DirEntry, error) ``` ### func (MapFS) ReadFile 1.16 ``` func (fsys MapFS) ReadFile(name string) ([]byte, error) ``` ### func (MapFS) Stat 1.16 ``` func (fsys MapFS) Stat(name string) (fs.FileInfo, error) ``` ### func (MapFS) Sub 1.16 ``` func (fsys MapFS) Sub(dir string) (fs.FS, error) ``` type MapFile 1.16 ----------------- A MapFile describes a single file in a MapFS. 
``` type MapFile struct { Data []byte // file content Mode fs.FileMode // FileInfo.Mode ModTime time.Time // FileInfo.ModTime Sys any // FileInfo.Sys } ``` go Package iotest Package iotest =============== * `import "testing/iotest"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package iotest implements Readers and Writers useful mainly for testing. Index ----- * [Variables](#pkg-variables) * [func DataErrReader(r io.Reader) io.Reader](#DataErrReader) * [func ErrReader(err error) io.Reader](#ErrReader) * [func HalfReader(r io.Reader) io.Reader](#HalfReader) * [func NewReadLogger(prefix string, r io.Reader) io.Reader](#NewReadLogger) * [func NewWriteLogger(prefix string, w io.Writer) io.Writer](#NewWriteLogger) * [func OneByteReader(r io.Reader) io.Reader](#OneByteReader) * [func TestReader(r io.Reader, content []byte) error](#TestReader) * [func TimeoutReader(r io.Reader) io.Reader](#TimeoutReader) * [func TruncateWriter(w io.Writer, n int64) io.Writer](#TruncateWriter) ### Examples [ErrReader](#example_ErrReader) ### Package files logger.go reader.go writer.go Variables --------- ErrTimeout is a fake timeout error. ``` var ErrTimeout = errors.New("timeout") ``` func DataErrReader ------------------ ``` func DataErrReader(r io.Reader) io.Reader ``` DataErrReader changes the way errors are handled by a Reader. Normally, a Reader returns an error (typically EOF) from the first Read call after the last piece of data is read. DataErrReader wraps a Reader and changes its behavior so the final error is returned along with the final data, instead of in the first call after the final data. func ErrReader 1.16 ------------------- ``` func ErrReader(err error) io.Reader ``` ErrReader returns an io.Reader that returns 0, err from all Read calls. #### Example Code: ``` // A reader that always returns a custom error. r := iotest.ErrReader(errors.New("custom error")) n, err := r.Read(nil) fmt.Printf("n: %d\nerr: %q\n", n, err) ``` Output: ``` n: 0 err: "custom error" ``` func HalfReader --------------- ``` func HalfReader(r io.Reader) io.Reader ``` HalfReader returns a Reader that implements Read by reading half as many requested bytes from r. func NewReadLogger ------------------ ``` func NewReadLogger(prefix string, r io.Reader) io.Reader ``` NewReadLogger returns a reader that behaves like r except that it logs (using log.Printf) each read to standard error, printing the prefix and the hexadecimal data read. func NewWriteLogger ------------------- ``` func NewWriteLogger(prefix string, w io.Writer) io.Writer ``` NewWriteLogger returns a writer that behaves like w except that it logs (using log.Printf) each write to standard error, printing the prefix and the hexadecimal data written. func OneByteReader ------------------ ``` func OneByteReader(r io.Reader) io.Reader ``` OneByteReader returns a Reader that implements each non-empty Read by reading one byte from r. func TestReader 1.16 -------------------- ``` func TestReader(r io.Reader, content []byte) error ``` TestReader tests that reading from r returns the expected file content. It does reads of different sizes, until EOF. If r implements io.ReaderAt or io.Seeker, TestReader also checks that those operations behave as they should. If TestReader finds any misbehaviors, it returns an error reporting them. The error text may span multiple lines. func TimeoutReader ------------------ ``` func TimeoutReader(r io.Reader) io.Reader ``` TimeoutReader returns ErrTimeout on the second read with no data. 
Subsequent calls to read succeed. func TruncateWriter ------------------- ``` func TruncateWriter(w io.Writer, n int64) io.Writer ``` TruncateWriter returns a Writer that writes to w but stops silently after n bytes. go Package quick Package quick ============== * `import "testing/quick"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package quick implements utility functions to help with black box testing. The testing/quick package is frozen and is not accepting new features. Index ----- * [func Check(f any, config \*Config) error](#Check) * [func CheckEqual(f, g any, config \*Config) error](#CheckEqual) * [func Value(t reflect.Type, rand \*rand.Rand) (value reflect.Value, ok bool)](#Value) * [type CheckEqualError](#CheckEqualError) * [func (s \*CheckEqualError) Error() string](#CheckEqualError.Error) * [type CheckError](#CheckError) * [func (s \*CheckError) Error() string](#CheckError.Error) * [type Config](#Config) * [type Generator](#Generator) * [type SetupError](#SetupError) * [func (s SetupError) Error() string](#SetupError.Error) ### Package files quick.go func Check ---------- ``` func Check(f any, config *Config) error ``` Check looks for an input to f, any function that returns bool, such that f returns false. It calls f repeatedly, with arbitrary values for each argument. If f returns false on a given input, Check returns that input as a \*CheckError. For example: ``` func TestOddMultipleOfThree(t *testing.T) { f := func(x int) bool { y := OddMultipleOfThree(x) return y%2 == 1 && y%3 == 0 } if err := quick.Check(f, nil); err != nil { t.Error(err) } } ``` func CheckEqual --------------- ``` func CheckEqual(f, g any, config *Config) error ``` CheckEqual looks for an input on which f and g return different results. It calls f and g repeatedly with arbitrary values for each argument. If f and g return different answers, CheckEqual returns a \*CheckEqualError describing the input and the outputs. func Value ---------- ``` func Value(t reflect.Type, rand *rand.Rand) (value reflect.Value, ok bool) ``` Value returns an arbitrary value of the given type. If the type implements the Generator interface, that will be used. Note: To create arbitrary values for structs, all the fields must be exported. type CheckEqualError -------------------- A CheckEqualError is the result of CheckEqual finding an error. ``` type CheckEqualError struct { CheckError Out1 []any Out2 []any } ``` ### func (\*CheckEqualError) Error ``` func (s *CheckEqualError) Error() string ``` type CheckError --------------- A CheckError is the result of Check finding an error. ``` type CheckError struct { Count int In []any } ``` ### func (\*CheckError) Error ``` func (s *CheckError) Error() string ``` type Config ----------- A Config structure contains options for running a test. ``` type Config struct { // MaxCount sets the maximum number of iterations. // If zero, MaxCountScale is used. MaxCount int // MaxCountScale is a non-negative scale factor applied to the // default maximum. // A count of zero implies the default, which is usually 100 // but can be set by the -quickchecks flag. MaxCountScale float64 // Rand specifies a source of random numbers. // If nil, a default pseudo-random source will be used. Rand *rand.Rand // Values specifies a function to generate a slice of // arbitrary reflect.Values that are congruent with the // arguments to the function being tested. // If nil, the top-level Value function is used to generate them. 
Values func([]reflect.Value, *rand.Rand) } ``` type Generator -------------- A Generator can generate random values of its own type. ``` type Generator interface { // Generate returns a random instance of the type on which it is a // method using the size as a size hint. Generate(rand *rand.Rand, size int) reflect.Value } ``` type SetupError --------------- A SetupError is the result of an error in the way that check is being used, independent of the functions being tested. ``` type SetupError string ``` ### func (SetupError) Error ``` func (s SetupError) Error() string ``` go Package heap Package heap ============= * `import "container/heap"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package heap provides heap operations for any type that implements heap.Interface. A heap is a tree with the property that each node is the minimum-valued node in its subtree. The minimum element in the tree is the root, at index 0. A heap is a common way to implement a priority queue. To build a priority queue, implement the Heap interface with the (negative) priority as the ordering for the Less method, so Push adds items while Pop removes the highest-priority item from the queue. The Examples include such an implementation; the file example\_pq\_test.go has the complete source. #### Example (IntHeap) This example inserts several ints into an IntHeap, checks the minimum, and removes them in order of priority. Code: ``` // This example demonstrates an integer heap built using the heap interface. package heap_test import ( "container/heap" "fmt" ) // An IntHeap is a min-heap of ints. type IntHeap []int func (h IntHeap) Len() int { return len(h) } func (h IntHeap) Less(i, j int) bool { return h[i] < h[j] } func (h IntHeap) Swap(i, j int) { h[i], h[j] = h[j], h[i] } func (h *IntHeap) Push(x any) { // Push and Pop use pointer receivers because they modify the slice's length, // not just its contents. *h = append(*h, x.(int)) } func (h *IntHeap) Pop() any { old := *h n := len(old) x := old[n-1] *h = old[0 : n-1] return x } // This example inserts several ints into an IntHeap, checks the minimum, // and removes them in order of priority. func Example_intHeap() { h := &IntHeap{2, 1, 5} heap.Init(h) heap.Push(h, 3) fmt.Printf("minimum: %d\n", (*h)[0]) for h.Len() > 0 { fmt.Printf("%d ", heap.Pop(h)) } // Output: // minimum: 1 // 1 2 3 5 } ``` #### Example (PriorityQueue) This example creates a PriorityQueue with some items, adds and manipulates an item, and then removes the items in priority order. Code: ``` // This example demonstrates a priority queue built using the heap interface. package heap_test import ( "container/heap" "fmt" ) // An Item is something we manage in a priority queue. type Item struct { value string // The value of the item; arbitrary. priority int // The priority of the item in the queue. // The index is needed by update and is maintained by the heap.Interface methods. index int // The index of the item in the heap. } // A PriorityQueue implements heap.Interface and holds Items. type PriorityQueue []*Item func (pq PriorityQueue) Len() int { return len(pq) } func (pq PriorityQueue) Less(i, j int) bool { // We want Pop to give us the highest, not lowest, priority so we use greater than here. 
return pq[i].priority > pq[j].priority } func (pq PriorityQueue) Swap(i, j int) { pq[i], pq[j] = pq[j], pq[i] pq[i].index = i pq[j].index = j } func (pq *PriorityQueue) Push(x any) { n := len(*pq) item := x.(*Item) item.index = n *pq = append(*pq, item) } func (pq *PriorityQueue) Pop() any { old := *pq n := len(old) item := old[n-1] old[n-1] = nil // avoid memory leak item.index = -1 // for safety *pq = old[0 : n-1] return item } // update modifies the priority and value of an Item in the queue. func (pq *PriorityQueue) update(item *Item, value string, priority int) { item.value = value item.priority = priority heap.Fix(pq, item.index) } // This example creates a PriorityQueue with some items, adds and manipulates an item, // and then removes the items in priority order. func Example_priorityQueue() { // Some items and their priorities. items := map[string]int{ "banana": 3, "apple": 2, "pear": 4, } // Create a priority queue, put the items in it, and // establish the priority queue (heap) invariants. pq := make(PriorityQueue, len(items)) i := 0 for value, priority := range items { pq[i] = &Item{ value: value, priority: priority, index: i, } i++ } heap.Init(&pq) // Insert a new item and then modify its priority. item := &Item{ value: "orange", priority: 1, } heap.Push(&pq, item) pq.update(item, item.value, 5) // Take the items out; they arrive in decreasing priority order. for pq.Len() > 0 { item := heap.Pop(&pq).(*Item) fmt.Printf("%.2d:%s ", item.priority, item.value) } // Output: // 05:orange 04:pear 03:banana 02:apple } ``` Index ----- * [func Fix(h Interface, i int)](#Fix) * [func Init(h Interface)](#Init) * [func Pop(h Interface) any](#Pop) * [func Push(h Interface, x any)](#Push) * [func Remove(h Interface, i int) any](#Remove) * [type Interface](#Interface) ### Examples [Package (IntHeap)](#example__intHeap) [Package (PriorityQueue)](#example__priorityQueue) ### Package files heap.go func Fix 1.2 ------------ ``` func Fix(h Interface, i int) ``` Fix re-establishes the heap ordering after the element at index i has changed its value. Changing the value of the element at index i and then calling Fix is equivalent to, but less expensive than, calling Remove(h, i) followed by a Push of the new value. The complexity is O(log n) where n = h.Len(). func Init --------- ``` func Init(h Interface) ``` Init establishes the heap invariants required by the other routines in this package. Init is idempotent with respect to the heap invariants and may be called whenever the heap invariants may have been invalidated. The complexity is O(n) where n = h.Len(). func Pop -------- ``` func Pop(h Interface) any ``` Pop removes and returns the minimum element (according to Less) from the heap. The complexity is O(log n) where n = h.Len(). Pop is equivalent to Remove(h, 0). func Push --------- ``` func Push(h Interface, x any) ``` Push pushes the element x onto the heap. The complexity is O(log n) where n = h.Len(). func Remove ----------- ``` func Remove(h Interface, i int) any ``` Remove removes and returns the element at index i from the heap. The complexity is O(log n) where n = h.Len(). type Interface -------------- The Interface type describes the requirements for a type using the routines in this package. 
Any type that implements it may be used as a min-heap with the following invariants (established after Init has been called or if the data is empty or sorted): ``` !h.Less(j, i) for 0 <= i < h.Len() and 2*i+1 <= j <= 2*i+2 and j < h.Len() ``` Note that Push and Pop in this interface are for package heap's implementation to call. To add and remove things from the heap, use heap.Push and heap.Pop. ``` type Interface interface { sort.Interface Push(x any) // add x as element Len() Pop() any // remove and return element Len() - 1. } ``` go Package list Package list ============= * `import "container/list"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package list implements a doubly linked list. To iterate over a list (where l is a \*List): ``` for e := l.Front(); e != nil; e = e.Next() { // do something with e.Value } ``` #### Example Code: ``` // Create a new list and put some numbers in it. l := list.New() e4 := l.PushBack(4) e1 := l.PushFront(1) l.InsertBefore(3, e4) l.InsertAfter(2, e1) // Iterate through list and print its contents. for e := l.Front(); e != nil; e = e.Next() { fmt.Println(e.Value) } ``` Output: ``` 1 2 3 4 ``` Index ----- * [type Element](#Element) * [func (e \*Element) Next() \*Element](#Element.Next) * [func (e \*Element) Prev() \*Element](#Element.Prev) * [type List](#List) * [func New() \*List](#New) * [func (l \*List) Back() \*Element](#List.Back) * [func (l \*List) Front() \*Element](#List.Front) * [func (l \*List) Init() \*List](#List.Init) * [func (l \*List) InsertAfter(v any, mark \*Element) \*Element](#List.InsertAfter) * [func (l \*List) InsertBefore(v any, mark \*Element) \*Element](#List.InsertBefore) * [func (l \*List) Len() int](#List.Len) * [func (l \*List) MoveAfter(e, mark \*Element)](#List.MoveAfter) * [func (l \*List) MoveBefore(e, mark \*Element)](#List.MoveBefore) * [func (l \*List) MoveToBack(e \*Element)](#List.MoveToBack) * [func (l \*List) MoveToFront(e \*Element)](#List.MoveToFront) * [func (l \*List) PushBack(v any) \*Element](#List.PushBack) * [func (l \*List) PushBackList(other \*List)](#List.PushBackList) * [func (l \*List) PushFront(v any) \*Element](#List.PushFront) * [func (l \*List) PushFrontList(other \*List)](#List.PushFrontList) * [func (l \*List) Remove(e \*Element) any](#List.Remove) ### Examples [Package](#example_) ### Package files list.go type Element ------------ Element is an element of a linked list. ``` type Element struct { // The value stored with this element. Value any // contains filtered or unexported fields } ``` ### func (\*Element) Next ``` func (e *Element) Next() *Element ``` Next returns the next list element or nil. ### func (\*Element) Prev ``` func (e *Element) Prev() *Element ``` Prev returns the previous list element or nil. type List --------- List represents a doubly linked list. The zero value for List is an empty list ready to use. ``` type List struct { // contains filtered or unexported fields } ``` ### func New ``` func New() *List ``` New returns an initialized list. ### func (\*List) Back ``` func (l *List) Back() *Element ``` Back returns the last element of list l or nil if the list is empty. ### func (\*List) Front ``` func (l *List) Front() *Element ``` Front returns the first element of list l or nil if the list is empty. ### func (\*List) Init ``` func (l *List) Init() *List ``` Init initializes or clears list l. 
### func (\*List) InsertAfter ``` func (l *List) InsertAfter(v any, mark *Element) *Element ``` InsertAfter inserts a new element e with value v immediately after mark and returns e. If mark is not an element of l, the list is not modified. The mark must not be nil. ### func (\*List) InsertBefore ``` func (l *List) InsertBefore(v any, mark *Element) *Element ``` InsertBefore inserts a new element e with value v immediately before mark and returns e. If mark is not an element of l, the list is not modified. The mark must not be nil. ### func (\*List) Len ``` func (l *List) Len() int ``` Len returns the number of elements of list l. The complexity is O(1). ### func (\*List) MoveAfter 1.2 ``` func (l *List) MoveAfter(e, mark *Element) ``` MoveAfter moves element e to its new position after mark. If e or mark is not an element of l, or e == mark, the list is not modified. The element and mark must not be nil. ### func (\*List) MoveBefore 1.2 ``` func (l *List) MoveBefore(e, mark *Element) ``` MoveBefore moves element e to its new position before mark. If e or mark is not an element of l, or e == mark, the list is not modified. The element and mark must not be nil. ### func (\*List) MoveToBack ``` func (l *List) MoveToBack(e *Element) ``` MoveToBack moves element e to the back of list l. If e is not an element of l, the list is not modified. The element must not be nil. ### func (\*List) MoveToFront ``` func (l *List) MoveToFront(e *Element) ``` MoveToFront moves element e to the front of list l. If e is not an element of l, the list is not modified. The element must not be nil. ### func (\*List) PushBack ``` func (l *List) PushBack(v any) *Element ``` PushBack inserts a new element e with value v at the back of list l and returns e. ### func (\*List) PushBackList ``` func (l *List) PushBackList(other *List) ``` PushBackList inserts a copy of another list at the back of list l. The lists l and other may be the same. They must not be nil. ### func (\*List) PushFront ``` func (l *List) PushFront(v any) *Element ``` PushFront inserts a new element e with value v at the front of list l and returns e. ### func (\*List) PushFrontList ``` func (l *List) PushFrontList(other *List) ``` PushFrontList inserts a copy of another list at the front of list l. The lists l and other may be the same. They must not be nil. ### func (\*List) Remove ``` func (l *List) Remove(e *Element) any ``` Remove removes e from l if e is an element of list l. It returns the element value e.Value. The element must not be nil.
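One detail worth illustrating: Remove unlinks e from the list, so when deleting elements during iteration the next element should be captured before calling Remove. The following is only a sketch; the values are arbitrary.

```
package main

import (
	"container/list"
	"fmt"
)

func main() {
	l := list.New()
	for i := 1; i <= 5; i++ {
		l.PushBack(i)
	}

	// Remove the even values. Save Next before calling Remove, because after
	// Remove the element e is no longer linked into the list.
	for e := l.Front(); e != nil; {
		next := e.Next()
		if e.Value.(int)%2 == 0 {
			l.Remove(e)
		}
		e = next
	}

	for e := l.Front(); e != nil; e = e.Next() {
		fmt.Println(e.Value) // prints 1, 3, 5
	}
}
```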
go Package ring Package ring ============= * `import "container/ring"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package ring implements operations on circular lists. Index ----- * [type Ring](#Ring) * [func New(n int) \*Ring](#New) * [func (r \*Ring) Do(f func(any))](#Ring.Do) * [func (r \*Ring) Len() int](#Ring.Len) * [func (r \*Ring) Link(s \*Ring) \*Ring](#Ring.Link) * [func (r \*Ring) Move(n int) \*Ring](#Ring.Move) * [func (r \*Ring) Next() \*Ring](#Ring.Next) * [func (r \*Ring) Prev() \*Ring](#Ring.Prev) * [func (r \*Ring) Unlink(n int) \*Ring](#Ring.Unlink) ### Examples [Ring.Do](#example_Ring_Do) [Ring.Len](#example_Ring_Len) [Ring.Link](#example_Ring_Link) [Ring.Move](#example_Ring_Move) [Ring.Next](#example_Ring_Next) [Ring.Prev](#example_Ring_Prev) [Ring.Unlink](#example_Ring_Unlink) ### Package files ring.go type Ring --------- A Ring is an element of a circular list, or ring. Rings do not have a beginning or end; a pointer to any ring element serves as reference to the entire ring. Empty rings are represented as nil Ring pointers. The zero value for a Ring is a one-element ring with a nil Value. ``` type Ring struct { Value any // for use by client; untouched by this library // contains filtered or unexported fields } ``` ### func New ``` func New(n int) *Ring ``` New creates a ring of n elements. ### func (\*Ring) Do ``` func (r *Ring) Do(f func(any)) ``` Do calls function f on each element of the ring, in forward order. The behavior of Do is undefined if f changes \*r. #### Example Code: ``` // Create a new ring of size 5 r := ring.New(5) // Get the length of the ring n := r.Len() // Initialize the ring with some integer values for i := 0; i < n; i++ { r.Value = i r = r.Next() } // Iterate through the ring and print its contents r.Do(func(p any) { fmt.Println(p.(int)) }) ``` Output: ``` 0 1 2 3 4 ``` ### func (\*Ring) Len ``` func (r *Ring) Len() int ``` Len computes the number of elements in ring r. It executes in time proportional to the number of elements. #### Example Code: ``` // Create a new ring of size 4 r := ring.New(4) // Print out its length fmt.Println(r.Len()) ``` Output: ``` 4 ``` ### func (\*Ring) Link ``` func (r *Ring) Link(s *Ring) *Ring ``` Link connects ring r with ring s such that r.Next() becomes s and returns the original value for r.Next(). r must not be empty. If r and s point to the same ring, linking them removes the elements between r and s from the ring. The removed elements form a subring and the result is a reference to that subring (if no elements were removed, the result is still the original value for r.Next(), and not nil). If r and s point to different rings, linking them creates a single ring with the elements of s inserted after r. The result points to the element following the last element of s after insertion. #### Example Code: ``` // Create two rings, r and s, of size 2 r := ring.New(2) s := ring.New(2) // Get the length of the ring lr := r.Len() ls := s.Len() // Initialize r with 0s for i := 0; i < lr; i++ { r.Value = 0 r = r.Next() } // Initialize s with 1s for j := 0; j < ls; j++ { s.Value = 1 s = s.Next() } // Link ring r and ring s rs := r.Link(s) // Iterate through the combined ring and print its contents rs.Do(func(p any) { fmt.Println(p.(int)) }) ``` Output: ``` 0 0 1 1 ``` ### func (\*Ring) Move ``` func (r *Ring) Move(n int) *Ring ``` Move moves n % r.Len() elements backward (n < 0) or forward (n >= 0) in the ring and returns that ring element. r must not be empty. 
#### Example Code: ``` // Create a new ring of size 5 r := ring.New(5) // Get the length of the ring n := r.Len() // Initialize the ring with some integer values for i := 0; i < n; i++ { r.Value = i r = r.Next() } // Move the pointer forward by three steps r = r.Move(3) // Iterate through the ring and print its contents r.Do(func(p any) { fmt.Println(p.(int)) }) ``` Output: ``` 3 4 0 1 2 ``` ### func (\*Ring) Next ``` func (r *Ring) Next() *Ring ``` Next returns the next ring element. r must not be empty. #### Example Code: ``` // Create a new ring of size 5 r := ring.New(5) // Get the length of the ring n := r.Len() // Initialize the ring with some integer values for i := 0; i < n; i++ { r.Value = i r = r.Next() } // Iterate through the ring and print its contents for j := 0; j < n; j++ { fmt.Println(r.Value) r = r.Next() } ``` Output: ``` 0 1 2 3 4 ``` ### func (\*Ring) Prev ``` func (r *Ring) Prev() *Ring ``` Prev returns the previous ring element. r must not be empty. #### Example Code: ``` // Create a new ring of size 5 r := ring.New(5) // Get the length of the ring n := r.Len() // Initialize the ring with some integer values for i := 0; i < n; i++ { r.Value = i r = r.Next() } // Iterate through the ring backwards and print its contents for j := 0; j < n; j++ { r = r.Prev() fmt.Println(r.Value) } ``` Output: ``` 4 3 2 1 0 ``` ### func (\*Ring) Unlink ``` func (r *Ring) Unlink(n int) *Ring ``` Unlink removes n % r.Len() elements from the ring r, starting at r.Next(). If n % r.Len() == 0, r remains unchanged. The result is the removed subring. r must not be empty. #### Example Code: ``` // Create a new ring of size 6 r := ring.New(6) // Get the length of the ring n := r.Len() // Initialize the ring with some integer values for i := 0; i < n; i++ { r.Value = i r = r.Next() } // Unlink three elements from r, starting from r.Next() r.Unlink(3) // Iterate through the remaining ring and print its contents r.Do(func(p any) { fmt.Println(p.(int)) }) ``` Output: ``` 0 4 5 ``` go Package unsafe Package unsafe =============== * `import "unsafe"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package unsafe contains operations that step around the type safety of Go programs. Packages that import unsafe may be non-portable and are not protected by the Go 1 compatibility guidelines. Index ----- * [func Alignof(x ArbitraryType) uintptr](#Alignof) * [func Offsetof(x ArbitraryType) uintptr](#Offsetof) * [func Sizeof(x ArbitraryType) uintptr](#Sizeof) * [func String(ptr \*byte, len IntegerType) string](#String) * [func StringData(str string) \*byte](#StringData) * [type ArbitraryType](#ArbitraryType) * [func Slice(ptr \*ArbitraryType, len IntegerType) []ArbitraryType](#Slice) * [func SliceData(slice []ArbitraryType) \*ArbitraryType](#SliceData) * [type IntegerType](#IntegerType) * [type Pointer](#Pointer) * [func Add(ptr Pointer, len IntegerType) Pointer](#Add) ### Package files unsafe.go func Alignof ------------ ``` func Alignof(x ArbitraryType) uintptr ``` Alignof takes an expression x of any type and returns the required alignment of a hypothetical variable v as if v was declared via var v = x. It is the largest value m such that the address of v is always zero mod m. It is the same as the value returned by reflect.TypeOf(x).Align(). As a special case, if a variable s is of struct type and f is a field within that struct, then Alignof(s.f) will return the required alignment of a field of that type within a struct. 
This case is the same as the value returned by reflect.TypeOf(s.f).FieldAlign(). The return value of Alignof is a Go constant if the type of the argument does not have variable size. (See the description of [Sizeof](#Sizeof) for a definition of variable sized types.) func Offsetof ------------- ``` func Offsetof(x ArbitraryType) uintptr ``` Offsetof returns the offset within the struct of the field represented by x, which must be of the form structValue.field. In other words, it returns the number of bytes between the start of the struct and the start of the field. The return value of Offsetof is a Go constant if the type of the argument x does not have variable size. (See the description of [Sizeof](#Sizeof) for a definition of variable sized types.) func Sizeof ----------- ``` func Sizeof(x ArbitraryType) uintptr ``` Sizeof takes an expression x of any type and returns the size in bytes of a hypothetical variable v as if v was declared via var v = x. The size does not include any memory possibly referenced by x. For instance, if x is a slice, Sizeof returns the size of the slice descriptor, not the size of the memory referenced by the slice. For a struct, the size includes any padding introduced by field alignment. The return value of Sizeof is a Go constant if the type of the argument x does not have variable size. (A type has variable size if it is a type parameter or if it is an array or struct type with elements of variable size). func String ----------- ``` func String(ptr *byte, len IntegerType) string ``` String returns a string value whose underlying bytes start at ptr and whose length is len. The len argument must be of integer type or an untyped constant. A constant len argument must be non-negative and representable by a value of type int; if it is an untyped constant it is given type int. At run time, if len is negative, or if ptr is nil and len is not zero, a run-time panic occurs. Since Go strings are immutable, the bytes passed to String must not be modified afterwards. func StringData --------------- ``` func StringData(str string) *byte ``` StringData returns a pointer to the underlying bytes of str. For an empty string the return value is unspecified, and may be nil. Since Go strings are immutable, the bytes returned by StringData must not be modified. type ArbitraryType ------------------ ArbitraryType is here for the purposes of documentation only and is not actually part of the unsafe package. It represents the type of an arbitrary Go expression. ``` type ArbitraryType int ``` ### func Slice ``` func Slice(ptr *ArbitraryType, len IntegerType) []ArbitraryType ``` The function Slice returns a slice whose underlying array starts at ptr and whose length and capacity are len. Slice(ptr, len) is equivalent to ``` (*[len]ArbitraryType)(unsafe.Pointer(ptr))[:] ``` except that, as a special case, if ptr is nil and len is zero, Slice returns nil. The len argument must be of integer type or an untyped constant. A constant len argument must be non-negative and representable by a value of type int; if it is an untyped constant it is given type int. At run time, if len is negative, or if ptr is nil and len is not zero, a run-time panic occurs. ### func SliceData ``` func SliceData(slice []ArbitraryType) *ArbitraryType ``` SliceData returns a pointer to the underlying array of the argument slice. * If cap(slice) > 0, SliceData returns &slice[:1][0]. * If slice == nil, SliceData returns nil. * Otherwise, SliceData returns a non-nil pointer to an unspecified memory address. 
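To make the relationship between String, StringData, Slice, and SliceData concrete, here is a small sketch (the byte values are arbitrary): it builds a string from a byte slice's backing array, then views the string's bytes again as a read-only slice.

```
package main

import (
	"fmt"
	"unsafe"
)

func main() {
	b := []byte{'g', 'o', 'p', 'h', 'e', 'r'}

	// String builds a string from a pointer and a length. The backing bytes
	// must not be modified while the string is in use, since strings are
	// immutable; this sketch never writes to b after this point.
	s := unsafe.String(unsafe.SliceData(b), len(b))
	fmt.Println(s) // gopher

	// Slice goes the other way: from a pointer and a length back to a slice.
	// StringData returns a pointer to the string's bytes, which must be
	// treated as read-only.
	view := unsafe.Slice(unsafe.StringData(s), len(s))
	fmt.Println(view[0] == 'g') // true
}
```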
type IntegerType ---------------- IntegerType is here for the purposes of documentation only and is not actually part of the unsafe package. It represents any arbitrary integer type. ``` type IntegerType int ``` type Pointer ------------ Pointer represents a pointer to an arbitrary type. There are four special operations available for type Pointer that are not available for other types: * A pointer value of any type can be converted to a Pointer. * A Pointer can be converted to a pointer value of any type. * A uintptr can be converted to a Pointer. * A Pointer can be converted to a uintptr. Pointer therefore allows a program to defeat the type system and read and write arbitrary memory. It should be used with extreme care. The following patterns involving Pointer are valid. Code not using these patterns is likely to be invalid today or to become invalid in the future. Even the valid patterns below come with important caveats. Running "go vet" can help find uses of Pointer that do not conform to these patterns, but silence from "go vet" is not a guarantee that the code is valid. (1) Conversion of a \*T1 to Pointer to \*T2. Provided that T2 is no larger than T1 and that the two share an equivalent memory layout, this conversion allows reinterpreting data of one type as data of another type. An example is the implementation of math.Float64bits: ``` func Float64bits(f float64) uint64 { return *(*uint64)(unsafe.Pointer(&f)) } ``` (2) Conversion of a Pointer to a uintptr (but not back to Pointer). Converting a Pointer to a uintptr produces the memory address of the value pointed at, as an integer. The usual use for such a uintptr is to print it. Conversion of a uintptr back to Pointer is not valid in general. A uintptr is an integer, not a reference. Converting a Pointer to a uintptr creates an integer value with no pointer semantics. Even if a uintptr holds the address of some object, the garbage collector will not update that uintptr's value if the object moves, nor will that uintptr keep the object from being reclaimed. The remaining patterns enumerate the only valid conversions from uintptr to Pointer. (3) Conversion of a Pointer to a uintptr and back, with arithmetic. If p points into an allocated object, it can be advanced through the object by conversion to uintptr, addition of an offset, and conversion back to Pointer. ``` p = unsafe.Pointer(uintptr(p) + offset) ``` The most common use of this pattern is to access fields in a struct or elements of an array: ``` // equivalent to f := unsafe.Pointer(&s.f) f := unsafe.Pointer(uintptr(unsafe.Pointer(&s)) + unsafe.Offsetof(s.f)) // equivalent to e := unsafe.Pointer(&x[i]) e := unsafe.Pointer(uintptr(unsafe.Pointer(&x[0])) + i*unsafe.Sizeof(x[0])) ``` It is valid both to add and to subtract offsets from a pointer in this way. It is also valid to use &^ to round pointers, usually for alignment. In all cases, the result must continue to point into the original allocated object. Unlike in C, it is not valid to advance a pointer just beyond the end of its original allocation: ``` // INVALID: end points outside allocated space. var s thing end = unsafe.Pointer(uintptr(unsafe.Pointer(&s)) + unsafe.Sizeof(s)) // INVALID: end points outside allocated space. b := make([]byte, n) end = unsafe.Pointer(uintptr(unsafe.Pointer(&b[0])) + uintptr(n)) ``` Note that both conversions must appear in the same expression, with only the intervening arithmetic between them: ``` // INVALID: uintptr cannot be stored in variable // before conversion back to Pointer. 
u := uintptr(p) p = unsafe.Pointer(u + offset) ``` Note that the pointer must point into an allocated object, so it may not be nil. ``` // INVALID: conversion of nil pointer u := unsafe.Pointer(nil) p := unsafe.Pointer(uintptr(u) + offset) ``` (4) Conversion of a Pointer to a uintptr when calling syscall.Syscall. The Syscall functions in package syscall pass their uintptr arguments directly to the operating system, which then may, depending on the details of the call, reinterpret some of them as pointers. That is, the system call implementation is implicitly converting certain arguments back from uintptr to pointer. If a pointer argument must be converted to uintptr for use as an argument, that conversion must appear in the call expression itself: ``` syscall.Syscall(SYS_READ, uintptr(fd), uintptr(unsafe.Pointer(p)), uintptr(n)) ``` The compiler handles a Pointer converted to a uintptr in the argument list of a call to a function implemented in assembly by arranging that the referenced allocated object, if any, is retained and not moved until the call completes, even though from the types alone it would appear that the object is no longer needed during the call. For the compiler to recognize this pattern, the conversion must appear in the argument list: ``` // INVALID: uintptr cannot be stored in variable // before implicit conversion back to Pointer during system call. u := uintptr(unsafe.Pointer(p)) syscall.Syscall(SYS_READ, uintptr(fd), u, uintptr(n)) ``` (5) Conversion of the result of reflect.Value.Pointer or reflect.Value.UnsafeAddr from uintptr to Pointer. Package reflect's Value methods named Pointer and UnsafeAddr return type uintptr instead of unsafe.Pointer to keep callers from changing the result to an arbitrary type without first importing "unsafe". However, this means that the result is fragile and must be converted to Pointer immediately after making the call, in the same expression: ``` p := (*int)(unsafe.Pointer(reflect.ValueOf(new(int)).Pointer())) ``` As in the cases above, it is invalid to store the result before the conversion: ``` // INVALID: uintptr cannot be stored in variable // before conversion back to Pointer. u := reflect.ValueOf(new(int)).Pointer() p := (*int)(unsafe.Pointer(u)) ``` (6) Conversion of a reflect.SliceHeader or reflect.StringHeader Data field to or from Pointer. As in the previous case, the reflect data structures SliceHeader and StringHeader declare the field Data as a uintptr to keep callers from changing the result to an arbitrary type without first importing "unsafe". However, this means that SliceHeader and StringHeader are only valid when interpreting the content of an actual slice or string value. ``` var s string hdr := (*reflect.StringHeader)(unsafe.Pointer(&s)) // case 1 hdr.Data = uintptr(unsafe.Pointer(p)) // case 6 (this case) hdr.Len = n ``` In this usage hdr.Data is really an alternate way to refer to the underlying pointer in the string header, not a uintptr variable itself. In general, reflect.SliceHeader and reflect.StringHeader should be used only as \*reflect.SliceHeader and \*reflect.StringHeader pointing at actual slices or strings, never as plain structs. A program should not declare or allocate variables of these struct types. ``` // INVALID: a directly-declared header will not hold Data as a reference. 
var hdr reflect.StringHeader hdr.Data = uintptr(unsafe.Pointer(p)) hdr.Len = n s := *(*string)(unsafe.Pointer(&hdr)) // p possibly already lost ``` ``` type Pointer *ArbitraryType ``` ### func Add ``` func Add(ptr Pointer, len IntegerType) Pointer ``` The function Add adds len to ptr and returns the updated pointer Pointer(uintptr(ptr) + uintptr(len)). The len argument must be of integer type or an untyped constant. A constant len argument must be representable by a value of type int; if it is an untyped constant it is given type int. The rules for valid uses of Pointer still apply. go Package suffixarray Package suffixarray ==================== * `import "index/suffixarray"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package suffixarray implements substring search in logarithmic time using an in-memory suffix array. Example use: ``` // create index for some data index := suffixarray.New(data) // lookup byte slice s offsets1 := index.Lookup(s, -1) // the list of all indices where s occurs in data offsets2 := index.Lookup(s, 3) // the list of at most 3 indices where s occurs in data ``` Index ----- * [type Index](#Index) * [func New(data []byte) \*Index](#New) * [func (x \*Index) Bytes() []byte](#Index.Bytes) * [func (x \*Index) FindAllIndex(r \*regexp.Regexp, n int) (result [][]int)](#Index.FindAllIndex) * [func (x \*Index) Lookup(s []byte, n int) (result []int)](#Index.Lookup) * [func (x \*Index) Read(r io.Reader) error](#Index.Read) * [func (x \*Index) Write(w io.Writer) error](#Index.Write) ### Examples [Index.Lookup](#example_Index_Lookup) ### Package files sais.go sais2.go suffixarray.go type Index ---------- Index implements a suffix array for fast substring search. ``` type Index struct { // contains filtered or unexported fields } ``` ### func New ``` func New(data []byte) *Index ``` New creates a new Index for data. Index creation time is O(N) for N = len(data). ### func (\*Index) Bytes ``` func (x *Index) Bytes() []byte ``` Bytes returns the data over which the index was created. It must not be modified. ### func (\*Index) FindAllIndex ``` func (x *Index) FindAllIndex(r *regexp.Regexp, n int) (result [][]int) ``` FindAllIndex returns a sorted list of non-overlapping matches of the regular expression r, where a match is a pair of indices specifying the matched slice of x.Bytes(). If n < 0, all matches are returned in successive order. Otherwise, at most n matches are returned and they may not be successive. The result is nil if there are no matches, or if n == 0. ### func (\*Index) Lookup ``` func (x *Index) Lookup(s []byte, n int) (result []int) ``` Lookup returns an unsorted list of at most n indices where the byte string s occurs in the indexed data. If n < 0, all occurrences are returned. The result is nil if s is empty, s is not found, or n == 0. Lookup time is O(log(N)\*len(s) + len(result)) where N is the size of the indexed data. #### Example Code: ``` index := suffixarray.New([]byte("banana")) offsets := index.Lookup([]byte("ana"), -1) for _, off := range offsets { fmt.Println(off) } ``` Output: ``` 1 3 ``` ### func (\*Index) Read ``` func (x *Index) Read(r io.Reader) error ``` Read reads the index from r into x; x must not be nil. ### func (\*Index) Write ``` func (x *Index) Write(w io.Writer) error ``` Write writes the index x to w.
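As a hedged sketch tying the suffixarray pieces together (the input text and pattern are invented for illustration), the following uses FindAllIndex for regular-expression matches and Write/Read to serialize and restore an index:

```
package main

import (
	"bytes"
	"fmt"
	"index/suffixarray"
	"log"
	"regexp"
)

func main() {
	index := suffixarray.New([]byte("abracadabra"))

	// FindAllIndex reports non-overlapping matches as [start, end) pairs
	// into index.Bytes().
	matches := index.FindAllIndex(regexp.MustCompile(`a.r`), -1)
	fmt.Println(matches) // [[0 3] [7 10]]

	// The index can be serialized with Write and restored with Read.
	var buf bytes.Buffer
	if err := index.Write(&buf); err != nil {
		log.Fatal(err)
	}
	restored := new(suffixarray.Index)
	if err := restored.Read(&buf); err != nil {
		log.Fatal(err)
	}
	fmt.Println(restored.Lookup([]byte("abra"), -1)) // offsets 0 and 7, in some order
}
```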
go Package log Package log ============ * `import "log"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package log implements a simple logging package. It defines a type, Logger, with methods for formatting output. It also has a predefined 'standard' Logger accessible through helper functions Print[f|ln], Fatal[f|ln], and Panic[f|ln], which are easier to use than creating a Logger manually. That logger writes to standard error and prints the date and time of each logged message. Every log message is output on a separate line: if the message being printed does not end in a newline, the logger will add one. The Fatal functions call os.Exit(1) after writing the log message. The Panic functions call panic after writing the log message. Index ----- * [Constants](#pkg-constants) * [func Fatal(v ...any)](#Fatal) * [func Fatalf(format string, v ...any)](#Fatalf) * [func Fatalln(v ...any)](#Fatalln) * [func Flags() int](#Flags) * [func Output(calldepth int, s string) error](#Output) * [func Panic(v ...any)](#Panic) * [func Panicf(format string, v ...any)](#Panicf) * [func Panicln(v ...any)](#Panicln) * [func Prefix() string](#Prefix) * [func Print(v ...any)](#Print) * [func Printf(format string, v ...any)](#Printf) * [func Println(v ...any)](#Println) * [func SetFlags(flag int)](#SetFlags) * [func SetOutput(w io.Writer)](#SetOutput) * [func SetPrefix(prefix string)](#SetPrefix) * [func Writer() io.Writer](#Writer) * [type Logger](#Logger) * [func Default() \*Logger](#Default) * [func New(out io.Writer, prefix string, flag int) \*Logger](#New) * [func (l \*Logger) Fatal(v ...any)](#Logger.Fatal) * [func (l \*Logger) Fatalf(format string, v ...any)](#Logger.Fatalf) * [func (l \*Logger) Fatalln(v ...any)](#Logger.Fatalln) * [func (l \*Logger) Flags() int](#Logger.Flags) * [func (l \*Logger) Output(calldepth int, s string) error](#Logger.Output) * [func (l \*Logger) Panic(v ...any)](#Logger.Panic) * [func (l \*Logger) Panicf(format string, v ...any)](#Logger.Panicf) * [func (l \*Logger) Panicln(v ...any)](#Logger.Panicln) * [func (l \*Logger) Prefix() string](#Logger.Prefix) * [func (l \*Logger) Print(v ...any)](#Logger.Print) * [func (l \*Logger) Printf(format string, v ...any)](#Logger.Printf) * [func (l \*Logger) Println(v ...any)](#Logger.Println) * [func (l \*Logger) SetFlags(flag int)](#Logger.SetFlags) * [func (l \*Logger) SetOutput(w io.Writer)](#Logger.SetOutput) * [func (l \*Logger) SetPrefix(prefix string)](#Logger.SetPrefix) * [func (l \*Logger) Writer() io.Writer](#Logger.Writer) ### Examples [Logger](#example_Logger) [Logger.Output](#example_Logger_Output) ### Package files log.go Constants --------- These flags define which text to prefix to each log entry generated by the Logger. Bits are or'ed together to control what's printed. With the exception of the Lmsgprefix flag, there is no control over the order they appear (the order listed here) or the format they present (as described in the comments). The prefix is followed by a colon only when Llongfile or Lshortfile is specified. For example, flags Ldate | Ltime (or LstdFlags) produce, ``` 2009/01/23 01:23:23 message ``` while flags Ldate | Ltime | Lmicroseconds | Llongfile produce, ``` 2009/01/23 01:23:23.123123 /a/b/c/d.go:23: message ``` ``` const ( Ldate = 1 << iota // the date in the local time zone: 2009/01/23 Ltime // the time in the local time zone: 01:23:23 Lmicroseconds // microsecond resolution: 01:23:23.123123. assumes Ltime. 
Llongfile // full file name and line number: /a/b/c/d.go:23 Lshortfile // final file name element and line number: d.go:23. overrides Llongfile LUTC // if Ldate or Ltime is set, use UTC rather than the local time zone Lmsgprefix // move the "prefix" from the beginning of the line to before the message LstdFlags = Ldate | Ltime // initial values for the standard logger ) ``` func Fatal ---------- ``` func Fatal(v ...any) ``` Fatal is equivalent to Print() followed by a call to os.Exit(1). func Fatalf ----------- ``` func Fatalf(format string, v ...any) ``` Fatalf is equivalent to Printf() followed by a call to os.Exit(1). func Fatalln ------------ ``` func Fatalln(v ...any) ``` Fatalln is equivalent to Println() followed by a call to os.Exit(1). func Flags ---------- ``` func Flags() int ``` Flags returns the output flags for the standard logger. The flag bits are Ldate, Ltime, and so on. func Output 1.5 --------------- ``` func Output(calldepth int, s string) error ``` Output writes the output for a logging event. The string s contains the text to print after the prefix specified by the flags of the Logger. A newline is appended if the last character of s is not already a newline. Calldepth is the count of the number of frames to skip when computing the file name and line number if Llongfile or Lshortfile is set; a value of 1 will print the details for the caller of Output. func Panic ---------- ``` func Panic(v ...any) ``` Panic is equivalent to Print() followed by a call to panic(). func Panicf ----------- ``` func Panicf(format string, v ...any) ``` Panicf is equivalent to Printf() followed by a call to panic(). func Panicln ------------ ``` func Panicln(v ...any) ``` Panicln is equivalent to Println() followed by a call to panic(). func Prefix ----------- ``` func Prefix() string ``` Prefix returns the output prefix for the standard logger. func Print ---------- ``` func Print(v ...any) ``` Print calls Output to print to the standard logger. Arguments are handled in the manner of fmt.Print. func Printf ----------- ``` func Printf(format string, v ...any) ``` Printf calls Output to print to the standard logger. Arguments are handled in the manner of fmt.Printf. func Println ------------ ``` func Println(v ...any) ``` Println calls Output to print to the standard logger. Arguments are handled in the manner of fmt.Println. func SetFlags ------------- ``` func SetFlags(flag int) ``` SetFlags sets the output flags for the standard logger. The flag bits are Ldate, Ltime, and so on. func SetOutput -------------- ``` func SetOutput(w io.Writer) ``` SetOutput sets the output destination for the standard logger. func SetPrefix -------------- ``` func SetPrefix(prefix string) ``` SetPrefix sets the output prefix for the standard logger. func Writer 1.13 ---------------- ``` func Writer() io.Writer ``` Writer returns the output destination for the standard logger. type Logger ----------- A Logger represents an active logging object that generates lines of output to an io.Writer. Each logging operation makes a single call to the Writer's Write method. A Logger can be used simultaneously from multiple goroutines; it guarantees to serialize access to the Writer. ``` type Logger struct { // contains filtered or unexported fields } ``` #### Example Code: ``` var ( buf bytes.Buffer logger = log.New(&buf, "logger: ", log.Lshortfile) ) logger.Print("Hello, log file!") fmt.Print(&buf) ``` Output: ``` logger: example_test.go:19: Hello, log file! 
``` ### func Default 1.16 ``` func Default() *Logger ``` Default returns the standard logger used by the package-level output functions. ### func New ``` func New(out io.Writer, prefix string, flag int) *Logger ``` New creates a new Logger. The out variable sets the destination to which log data will be written. The prefix appears at the beginning of each generated log line, or after the log header if the Lmsgprefix flag is provided. The flag argument defines the logging properties. ### func (\*Logger) Fatal ``` func (l *Logger) Fatal(v ...any) ``` Fatal is equivalent to l.Print() followed by a call to os.Exit(1). ### func (\*Logger) Fatalf ``` func (l *Logger) Fatalf(format string, v ...any) ``` Fatalf is equivalent to l.Printf() followed by a call to os.Exit(1). ### func (\*Logger) Fatalln ``` func (l *Logger) Fatalln(v ...any) ``` Fatalln is equivalent to l.Println() followed by a call to os.Exit(1). ### func (\*Logger) Flags ``` func (l *Logger) Flags() int ``` Flags returns the output flags for the logger. The flag bits are Ldate, Ltime, and so on. ### func (\*Logger) Output ``` func (l *Logger) Output(calldepth int, s string) error ``` Output writes the output for a logging event. The string s contains the text to print after the prefix specified by the flags of the Logger. A newline is appended if the last character of s is not already a newline. Calldepth is used to recover the PC and is provided for generality, although at the moment on all pre-defined paths it will be 2. #### Example Code: ``` var ( buf bytes.Buffer logger = log.New(&buf, "INFO: ", log.Lshortfile) infof = func(info string) { logger.Output(2, info) } ) infof("Hello world") fmt.Print(&buf) ``` Output: ``` INFO: example_test.go:36: Hello world ``` ### func (\*Logger) Panic ``` func (l *Logger) Panic(v ...any) ``` Panic is equivalent to l.Print() followed by a call to panic(). ### func (\*Logger) Panicf ``` func (l *Logger) Panicf(format string, v ...any) ``` Panicf is equivalent to l.Printf() followed by a call to panic(). ### func (\*Logger) Panicln ``` func (l *Logger) Panicln(v ...any) ``` Panicln is equivalent to l.Println() followed by a call to panic(). ### func (\*Logger) Prefix ``` func (l *Logger) Prefix() string ``` Prefix returns the output prefix for the logger. ### func (\*Logger) Print ``` func (l *Logger) Print(v ...any) ``` Print calls l.Output to print to the logger. Arguments are handled in the manner of fmt.Print. ### func (\*Logger) Printf ``` func (l *Logger) Printf(format string, v ...any) ``` Printf calls l.Output to print to the logger. Arguments are handled in the manner of fmt.Printf. ### func (\*Logger) Println ``` func (l *Logger) Println(v ...any) ``` Println calls l.Output to print to the logger. Arguments are handled in the manner of fmt.Println. ### func (\*Logger) SetFlags ``` func (l *Logger) SetFlags(flag int) ``` SetFlags sets the output flags for the logger. The flag bits are Ldate, Ltime, and so on. ### func (\*Logger) SetOutput 1.5 ``` func (l *Logger) SetOutput(w io.Writer) ``` SetOutput sets the output destination for the logger. ### func (\*Logger) SetPrefix ``` func (l *Logger) SetPrefix(prefix string) ``` SetPrefix sets the output prefix for the logger. ### func (\*Logger) Writer 1.12 ``` func (l *Logger) Writer() io.Writer ``` Writer returns the output destination for the logger. Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [syslog](syslog/index) | Package syslog provides a simple interface to the system log service. 
| go Package syslog Package syslog =============== * `import "log/syslog"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package syslog provides a simple interface to the system log service. It can send messages to the syslog daemon using UNIX domain sockets, UDP or TCP. Only one call to Dial is necessary. On write failures, the syslog client will attempt to reconnect to the server and write again. The syslog package is frozen and is not accepting new features. Some external packages provide more functionality. See: ``` https://godoc.org/?q=syslog ``` Index ----- * [func NewLogger(p Priority, logFlag int) (\*log.Logger, error)](#NewLogger) * [type Priority](#Priority) * [type Writer](#Writer) * [func Dial(network, raddr string, priority Priority, tag string) (\*Writer, error)](#Dial) * [func New(priority Priority, tag string) (\*Writer, error)](#New) * [func (w \*Writer) Alert(m string) error](#Writer.Alert) * [func (w \*Writer) Close() error](#Writer.Close) * [func (w \*Writer) Crit(m string) error](#Writer.Crit) * [func (w \*Writer) Debug(m string) error](#Writer.Debug) * [func (w \*Writer) Emerg(m string) error](#Writer.Emerg) * [func (w \*Writer) Err(m string) error](#Writer.Err) * [func (w \*Writer) Info(m string) error](#Writer.Info) * [func (w \*Writer) Notice(m string) error](#Writer.Notice) * [func (w \*Writer) Warning(m string) error](#Writer.Warning) * [func (w \*Writer) Write(b []byte) (int, error)](#Writer.Write) * [Bugs](#pkg-note-BUG) ### Examples [Dial](#example_Dial) ### Package files doc.go syslog.go syslog\_unix.go func NewLogger -------------- ``` func NewLogger(p Priority, logFlag int) (*log.Logger, error) ``` NewLogger creates a log.Logger whose output is written to the system log service with the specified priority, a combination of the syslog facility and severity. The logFlag argument is the flag set passed through to log.New to create the Logger. type Priority ------------- The Priority is a combination of the syslog facility and severity. For example, LOG\_ALERT | LOG\_FTP sends an alert severity message from the FTP facility. The default severity is LOG\_EMERG; the default facility is LOG\_KERN. ``` type Priority int ``` ``` const ( // From /usr/include/sys/syslog.h. // These are the same on Linux, BSD, and OS X. LOG_EMERG Priority = iota LOG_ALERT LOG_CRIT LOG_ERR LOG_WARNING LOG_NOTICE LOG_INFO LOG_DEBUG ) ``` ``` const ( // From /usr/include/sys/syslog.h. // These are the same up to LOG_FTP on Linux, BSD, and OS X. LOG_KERN Priority = iota << 3 LOG_USER LOG_MAIL LOG_DAEMON LOG_AUTH LOG_SYSLOG LOG_LPR LOG_NEWS LOG_UUCP LOG_CRON LOG_AUTHPRIV LOG_FTP LOG_LOCAL0 LOG_LOCAL1 LOG_LOCAL2 LOG_LOCAL3 LOG_LOCAL4 LOG_LOCAL5 LOG_LOCAL6 LOG_LOCAL7 ) ``` type Writer ----------- A Writer is a connection to a syslog server. ``` type Writer struct { // contains filtered or unexported fields } ``` ### func Dial ``` func Dial(network, raddr string, priority Priority, tag string) (*Writer, error) ``` Dial establishes a connection to a log daemon by connecting to address raddr on the specified network. Each write to the returned writer sends a log message with the facility and severity (from priority) and tag. If tag is empty, the os.Args[0] is used. If network is empty, Dial will connect to the local syslog server. Otherwise, see the documentation for net.Dial for valid values of network and raddr. 
#### Example Code: ``` sysLog, err := syslog.Dial("tcp", "localhost:1234", syslog.LOG_WARNING|syslog.LOG_DAEMON, "demotag") if err != nil { log.Fatal(err) } fmt.Fprintf(sysLog, "This is a daemon warning with demotag.") sysLog.Emerg("And this is a daemon emergency with demotag.") ``` ### func New ``` func New(priority Priority, tag string) (*Writer, error) ``` New establishes a new connection to the system log daemon. Each write to the returned writer sends a log message with the given priority (a combination of the syslog facility and severity) and prefix tag. If tag is empty, the os.Args[0] is used. ### func (\*Writer) Alert ``` func (w *Writer) Alert(m string) error ``` Alert logs a message with severity LOG\_ALERT, ignoring the severity passed to New. ### func (\*Writer) Close ``` func (w *Writer) Close() error ``` Close closes a connection to the syslog daemon. ### func (\*Writer) Crit ``` func (w *Writer) Crit(m string) error ``` Crit logs a message with severity LOG\_CRIT, ignoring the severity passed to New. ### func (\*Writer) Debug ``` func (w *Writer) Debug(m string) error ``` Debug logs a message with severity LOG\_DEBUG, ignoring the severity passed to New. ### func (\*Writer) Emerg ``` func (w *Writer) Emerg(m string) error ``` Emerg logs a message with severity LOG\_EMERG, ignoring the severity passed to New. ### func (\*Writer) Err ``` func (w *Writer) Err(m string) error ``` Err logs a message with severity LOG\_ERR, ignoring the severity passed to New. ### func (\*Writer) Info ``` func (w *Writer) Info(m string) error ``` Info logs a message with severity LOG\_INFO, ignoring the severity passed to New. ### func (\*Writer) Notice ``` func (w *Writer) Notice(m string) error ``` Notice logs a message with severity LOG\_NOTICE, ignoring the severity passed to New. ### func (\*Writer) Warning ``` func (w *Writer) Warning(m string) error ``` Warning logs a message with severity LOG\_WARNING, ignoring the severity passed to New. ### func (\*Writer) Write ``` func (w *Writer) Write(b []byte) (int, error) ``` Write sends a log message to the syslog daemon. Bugs ---- * ☞ This package is not implemented on Windows. As the syslog package is frozen, Windows users are encouraged to use a package outside of the standard library. For background, see <https://golang.org/issue/1108>. * ☞ This package is not implemented on Plan 9. go Package mime Package mime ============= * `import "mime"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package mime implements parts of the MIME spec. 
Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func AddExtensionType(ext, typ string) error](#AddExtensionType) * [func ExtensionsByType(typ string) ([]string, error)](#ExtensionsByType) * [func FormatMediaType(t string, param map[string]string) string](#FormatMediaType) * [func ParseMediaType(v string) (mediatype string, params map[string]string, err error)](#ParseMediaType) * [func TypeByExtension(ext string) string](#TypeByExtension) * [type WordDecoder](#WordDecoder) * [func (d \*WordDecoder) Decode(word string) (string, error)](#WordDecoder.Decode) * [func (d \*WordDecoder) DecodeHeader(header string) (string, error)](#WordDecoder.DecodeHeader) * [type WordEncoder](#WordEncoder) * [func (e WordEncoder) Encode(charset, s string) string](#WordEncoder.Encode) ### Examples [FormatMediaType](#example_FormatMediaType) [ParseMediaType](#example_ParseMediaType) [WordDecoder.Decode](#example_WordDecoder_Decode) [WordDecoder.DecodeHeader](#example_WordDecoder_DecodeHeader) [WordEncoder.Encode](#example_WordEncoder_Encode) ### Package files encodedword.go grammar.go mediatype.go type.go type\_unix.go Constants --------- ``` const ( // BEncoding represents Base64 encoding scheme as defined by RFC 2045. BEncoding = WordEncoder('b') // QEncoding represents the Q-encoding scheme as defined by RFC 2047. QEncoding = WordEncoder('q') ) ``` Variables --------- ErrInvalidMediaParameter is returned by ParseMediaType if the media type value was found but there was an error parsing the optional parameters ``` var ErrInvalidMediaParameter = errors.New("mime: invalid media parameter") ``` func AddExtensionType --------------------- ``` func AddExtensionType(ext, typ string) error ``` AddExtensionType sets the MIME type associated with the extension ext to typ. The extension should begin with a leading dot, as in ".html". func ExtensionsByType 1.5 ------------------------- ``` func ExtensionsByType(typ string) ([]string, error) ``` ExtensionsByType returns the extensions known to be associated with the MIME type typ. The returned extensions will each begin with a leading dot, as in ".html". When typ has no associated extensions, ExtensionsByType returns an nil slice. func FormatMediaType -------------------- ``` func FormatMediaType(t string, param map[string]string) string ``` FormatMediaType serializes mediatype t and the parameters param as a media type conforming to RFC 2045 and RFC 2616. The type and parameter names are written in lower-case. When any of the arguments result in a standard violation then FormatMediaType returns the empty string. #### Example Code: ``` mediatype := "text/html" params := map[string]string{ "charset": "utf-8", } result := mime.FormatMediaType(mediatype, params) fmt.Println("result:", result) ``` Output: ``` result: text/html; charset=utf-8 ``` func ParseMediaType ------------------- ``` func ParseMediaType(v string) (mediatype string, params map[string]string, err error) ``` ParseMediaType parses a media type value and any optional parameters, per RFC 1521. Media types are the values in Content-Type and Content-Disposition headers (RFC 2183). On success, ParseMediaType returns the media type converted to lowercase and trimmed of white space and a non-nil map. If there is an error parsing the optional parameter, the media type will be returned along with the error ErrInvalidMediaParameter. The returned map, params, maps from the lowercase attribute to the attribute value with its case preserved. 
#### Example Code: ``` mediatype, params, err := mime.ParseMediaType("text/html; charset=utf-8") if err != nil { panic(err) } fmt.Println("type:", mediatype) fmt.Println("charset:", params["charset"]) ``` Output: ``` type: text/html charset: utf-8 ``` func TypeByExtension -------------------- ``` func TypeByExtension(ext string) string ``` TypeByExtension returns the MIME type associated with the file extension ext. The extension ext should begin with a leading dot, as in ".html". When ext has no associated type, TypeByExtension returns "". Extensions are looked up first case-sensitively, then case-insensitively. The built-in table is small but on unix it is augmented by the local system's MIME-info database or mime.types file(s) if available under one or more of these names: ``` /usr/local/share/mime/globs2 /usr/share/mime/globs2 /etc/mime.types /etc/apache2/mime.types /etc/apache/mime.types ``` On Windows, MIME types are extracted from the registry. Text types have the charset parameter set to "utf-8" by default. type WordDecoder 1.5 -------------------- A WordDecoder decodes MIME headers containing RFC 2047 encoded-words. ``` type WordDecoder struct { // CharsetReader, if non-nil, defines a function to generate // charset-conversion readers, converting from the provided // charset into UTF-8. // Charsets are always lower-case. utf-8, iso-8859-1 and us-ascii charsets // are handled by default. // One of the CharsetReader's result values must be non-nil. CharsetReader func(charset string, input io.Reader) (io.Reader, error) } ``` ### func (\*WordDecoder) Decode 1.5 ``` func (d *WordDecoder) Decode(word string) (string, error) ``` Decode decodes an RFC 2047 encoded-word. #### Example Code: ``` dec := new(mime.WordDecoder) header, err := dec.Decode("=?utf-8?q?=C2=A1Hola,_se=C3=B1or!?=") if err != nil { panic(err) } fmt.Println(header) dec.CharsetReader = func(charset string, input io.Reader) (io.Reader, error) { switch charset { case "x-case": // Fake character set for example. // Real use would integrate with packages such // as code.google.com/p/go-charset content, err := io.ReadAll(input) if err != nil { return nil, err } return bytes.NewReader(bytes.ToUpper(content)), nil default: return nil, fmt.Errorf("unhandled charset %q", charset) } } header, err = dec.Decode("=?x-case?q?hello!?=") if err != nil { panic(err) } fmt.Println(header) ``` Output: ``` ¡Hola, señor! HELLO! ``` ### func (\*WordDecoder) DecodeHeader 1.5 ``` func (d *WordDecoder) DecodeHeader(header string) (string, error) ``` DecodeHeader decodes all encoded-words of the given string. It returns an error if and only if CharsetReader of d returns an error. #### Example Code: ``` dec := new(mime.WordDecoder) header, err := dec.DecodeHeader("=?utf-8?q?=C3=89ric?= <[email protected]>, =?utf-8?q?Ana=C3=AFs?= <[email protected]>") if err != nil { panic(err) } fmt.Println(header) header, err = dec.DecodeHeader("=?utf-8?q?=C2=A1Hola,?= =?utf-8?q?_se=C3=B1or!?=") if err != nil { panic(err) } fmt.Println(header) dec.CharsetReader = func(charset string, input io.Reader) (io.Reader, error) { switch charset { case "x-case": // Fake character set for example. 
// Real use would integrate with packages such // as code.google.com/p/go-charset content, err := io.ReadAll(input) if err != nil { return nil, err } return bytes.NewReader(bytes.ToUpper(content)), nil default: return nil, fmt.Errorf("unhandled charset %q", charset) } } header, err = dec.DecodeHeader("=?x-case?q?hello_?= =?x-case?q?world!?=") if err != nil { panic(err) } fmt.Println(header) ``` Output: ``` Éric <[email protected]>, Anaïs <[email protected]> ¡Hola, señor! HELLO WORLD! ``` type WordEncoder 1.5 -------------------- A WordEncoder is an RFC 2047 encoded-word encoder. ``` type WordEncoder byte ``` ### func (WordEncoder) Encode 1.5 ``` func (e WordEncoder) Encode(charset, s string) string ``` Encode returns the encoded-word form of s. If s is ASCII without special characters, it is returned unchanged. The provided charset is the IANA charset name of s. It is case insensitive. #### Example Code: ``` fmt.Println(mime.QEncoding.Encode("utf-8", "¡Hola, señor!")) fmt.Println(mime.QEncoding.Encode("utf-8", "Hello!")) fmt.Println(mime.BEncoding.Encode("UTF-8", "¡Hola, señor!")) fmt.Println(mime.QEncoding.Encode("ISO-8859-1", "Caf\xE9")) ``` Output: ``` =?utf-8?q?=C2=A1Hola,_se=C3=B1or!?= Hello! =?UTF-8?b?wqFIb2xhLCBzZcOxb3Ih?= =?ISO-8859-1?q?Caf=E9?= ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [multipart](multipart/index) | Package multipart implements MIME multipart parsing, as defined in RFC 2046. | | [quotedprintable](quotedprintable/index) | Package quotedprintable implements quoted-printable encoding as specified by RFC 2045. |
go Package multipart Package multipart ================== * `import "mime/multipart"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package multipart implements MIME multipart parsing, as defined in RFC 2046. The implementation is sufficient for HTTP (RFC 2388) and the multipart bodies generated by popular browsers. Index ----- * [Variables](#pkg-variables) * [type File](#File) * [type FileHeader](#FileHeader) * [func (fh \*FileHeader) Open() (File, error)](#FileHeader.Open) * [type Form](#Form) * [func (f \*Form) RemoveAll() error](#Form.RemoveAll) * [type Part](#Part) * [func (p \*Part) Close() error](#Part.Close) * [func (p \*Part) FileName() string](#Part.FileName) * [func (p \*Part) FormName() string](#Part.FormName) * [func (p \*Part) Read(d []byte) (n int, err error)](#Part.Read) * [type Reader](#Reader) * [func NewReader(r io.Reader, boundary string) \*Reader](#NewReader) * [func (r \*Reader) NextPart() (\*Part, error)](#Reader.NextPart) * [func (r \*Reader) NextRawPart() (\*Part, error)](#Reader.NextRawPart) * [func (r \*Reader) ReadForm(maxMemory int64) (\*Form, error)](#Reader.ReadForm) * [type Writer](#Writer) * [func NewWriter(w io.Writer) \*Writer](#NewWriter) * [func (w \*Writer) Boundary() string](#Writer.Boundary) * [func (w \*Writer) Close() error](#Writer.Close) * [func (w \*Writer) CreateFormField(fieldname string) (io.Writer, error)](#Writer.CreateFormField) * [func (w \*Writer) CreateFormFile(fieldname, filename string) (io.Writer, error)](#Writer.CreateFormFile) * [func (w \*Writer) CreatePart(header textproto.MIMEHeader) (io.Writer, error)](#Writer.CreatePart) * [func (w \*Writer) FormDataContentType() string](#Writer.FormDataContentType) * [func (w \*Writer) SetBoundary(boundary string) error](#Writer.SetBoundary) * [func (w \*Writer) WriteField(fieldname, value string) error](#Writer.WriteField) ### Examples [NewReader](#example_NewReader) ### Package files formdata.go multipart.go writer.go Variables --------- ErrMessageTooLarge is returned by ReadForm if the message form data is too large to be processed. ``` var ErrMessageTooLarge = errors.New("multipart: message too large") ``` type File --------- File is an interface to access the file part of a multipart message. Its contents may be either stored in memory or on disk. If stored on disk, the File's underlying concrete type will be an \*os.File. ``` type File interface { io.Reader io.ReaderAt io.Seeker io.Closer } ``` type FileHeader --------------- A FileHeader describes a file part of a multipart request. ``` type FileHeader struct { Filename string Header textproto.MIMEHeader Size int64 // Go 1.9 // contains filtered or unexported fields } ``` ### func (\*FileHeader) Open ``` func (fh *FileHeader) Open() (File, error) ``` Open opens and returns the FileHeader's associated File. type Form --------- Form is a parsed multipart form. Its File parts are stored either in memory or on disk, and are accessible via the \*FileHeader's Open method. Its Value parts are stored as strings. Both are keyed by field name. ``` type Form struct { Value map[string][]string File map[string][]*FileHeader } ``` ### func (\*Form) RemoveAll ``` func (f *Form) RemoveAll() error ``` RemoveAll removes any temporary files associated with a Form. type Part --------- A Part represents a single part in a multipart body. ``` type Part struct { // The headers of the body, if any, with the keys canonicalized // in the same fashion that the Go http.Request headers are. 
// For example, "foo-bar" changes case to "Foo-Bar" Header textproto.MIMEHeader // contains filtered or unexported fields } ``` ### func (\*Part) Close ``` func (p *Part) Close() error ``` ### func (\*Part) FileName ``` func (p *Part) FileName() string ``` FileName returns the filename parameter of the Part's Content-Disposition header. If not empty, the filename is passed through filepath.Base (which is platform dependent) before being returned. ### func (\*Part) FormName ``` func (p *Part) FormName() string ``` FormName returns the name parameter if p has a Content-Disposition of type "form-data". Otherwise it returns the empty string. ### func (\*Part) Read ``` func (p *Part) Read(d []byte) (n int, err error) ``` Read reads the body of a part, after its headers and before the next part (if any) begins. type Reader ----------- Reader is an iterator over parts in a MIME multipart body. Reader's underlying parser consumes its input as needed. Seeking isn't supported. ``` type Reader struct { // contains filtered or unexported fields } ``` ### func NewReader ``` func NewReader(r io.Reader, boundary string) *Reader ``` NewReader creates a new multipart Reader reading from r using the given MIME boundary. The boundary is usually obtained from the "boundary" parameter of the message's "Content-Type" header. Use mime.ParseMediaType to parse such headers. #### Example Code: ``` msg := &mail.Message{ Header: map[string][]string{ "Content-Type": {"multipart/mixed; boundary=foo"}, }, Body: strings.NewReader( "--foo\r\nFoo: one\r\n\r\nA section\r\n" + "--foo\r\nFoo: two\r\n\r\nAnd another\r\n" + "--foo--\r\n"), } mediaType, params, err := mime.ParseMediaType(msg.Header.Get("Content-Type")) if err != nil { log.Fatal(err) } if strings.HasPrefix(mediaType, "multipart/") { mr := multipart.NewReader(msg.Body, params["boundary"]) for { p, err := mr.NextPart() if err == io.EOF { return } if err != nil { log.Fatal(err) } slurp, err := io.ReadAll(p) if err != nil { log.Fatal(err) } fmt.Printf("Part %q: %q\n", p.Header.Get("Foo"), slurp) } } ``` Output: ``` Part "one": "A section" Part "two": "And another" ``` ### func (\*Reader) NextPart ``` func (r *Reader) NextPart() (*Part, error) ``` NextPart returns the next part in the multipart or an error. When there are no more parts, the error io.EOF is returned. As a special case, if the "Content-Transfer-Encoding" header has a value of "quoted-printable", that header is instead hidden and the body is transparently decoded during Read calls. ### func (\*Reader) NextRawPart 1.14 ``` func (r *Reader) NextRawPart() (*Part, error) ``` NextRawPart returns the next part in the multipart or an error. When there are no more parts, the error io.EOF is returned. Unlike NextPart, it does not have special handling for "Content-Transfer-Encoding: quoted-printable". ### func (\*Reader) ReadForm ``` func (r *Reader) ReadForm(maxMemory int64) (*Form, error) ``` ReadForm parses an entire multipart message whose parts have a Content-Disposition of "form-data". It stores up to maxMemory bytes + 10MB (reserved for non-file parts) in memory. File parts which can't be stored in memory will be stored on disk in temporary files. It returns ErrMessageTooLarge if all non-file parts can't be stored in memory. type Writer ----------- A Writer generates multipart messages. ``` type Writer struct { // contains filtered or unexported fields } ``` ### func NewWriter ``` func NewWriter(w io.Writer) *Writer ``` NewWriter returns a new multipart Writer with a random boundary, writing to w. 
### func (\*Writer) Boundary ``` func (w *Writer) Boundary() string ``` Boundary returns the Writer's boundary. ### func (\*Writer) Close ``` func (w *Writer) Close() error ``` Close finishes the multipart message and writes the trailing boundary end line to the output. ### func (\*Writer) CreateFormField ``` func (w *Writer) CreateFormField(fieldname string) (io.Writer, error) ``` CreateFormField calls CreatePart with a header using the given field name. ### func (\*Writer) CreateFormFile ``` func (w *Writer) CreateFormFile(fieldname, filename string) (io.Writer, error) ``` CreateFormFile is a convenience wrapper around CreatePart. It creates a new form-data header with the provided field name and file name. ### func (\*Writer) CreatePart ``` func (w *Writer) CreatePart(header textproto.MIMEHeader) (io.Writer, error) ``` CreatePart creates a new multipart section with the provided header. The body of the part should be written to the returned Writer. After calling CreatePart, any previous part may no longer be written to. ### func (\*Writer) FormDataContentType ``` func (w *Writer) FormDataContentType() string ``` FormDataContentType returns the Content-Type for an HTTP multipart/form-data with this Writer's Boundary. ### func (\*Writer) SetBoundary 1.1 ``` func (w *Writer) SetBoundary(boundary string) error ``` SetBoundary overrides the Writer's default randomly-generated boundary separator with an explicit value. SetBoundary must be called before any parts are created, may only contain certain ASCII characters, and must be non-empty and at most 70 bytes long. ### func (\*Writer) WriteField ``` func (w *Writer) WriteField(fieldname, value string) error ``` WriteField calls CreateFormField and then writes the given value. go Package quotedprintable Package quotedprintable ======================== * `import "mime/quotedprintable"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package quotedprintable implements quoted-printable encoding as specified by RFC 2045. Index ----- * [type Reader](#Reader) * [func NewReader(r io.Reader) \*Reader](#NewReader) * [func (r \*Reader) Read(p []byte) (n int, err error)](#Reader.Read) * [type Writer](#Writer) * [func NewWriter(w io.Writer) \*Writer](#NewWriter) * [func (w \*Writer) Close() error](#Writer.Close) * [func (w \*Writer) Write(p []byte) (n int, err error)](#Writer.Write) ### Examples [NewReader](#example_NewReader) [NewWriter](#example_NewWriter) ### Package files reader.go writer.go type Reader 1.5 --------------- Reader is a quoted-printable decoder. ``` type Reader struct { // contains filtered or unexported fields } ``` ### func NewReader 1.5 ``` func NewReader(r io.Reader) *Reader ``` NewReader returns a quoted-printable reader, decoding from r. #### Example Code: ``` for _, s := range []string{ `=48=65=6C=6C=6F=2C=20=47=6F=70=68=65=72=73=21`, `invalid escape: <b style="font-size: 200%">hello</b>`, "Hello, Gophers! This symbol will be unescaped: =3D and this will be written in =\r\none line.", } { b, err := io.ReadAll(quotedprintable.NewReader(strings.NewReader(s))) fmt.Printf("%s %v\n", b, err) } ``` Output: ``` Hello, Gophers! <nil> invalid escape: <b style="font-size: 200%">hello</b> <nil> Hello, Gophers! This symbol will be unescaped: = and this will be written in one line. <nil> ``` ### func (\*Reader) Read 1.5 ``` func (r *Reader) Read(p []byte) (n int, err error) ``` Read reads and decodes quoted-printable data from the underlying reader. 
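To illustrate how Reader and the Writer described below fit together, here is a small round-trip sketch (not one of the package's own examples): text is encoded into a buffer and then decoded back.

```
var buf bytes.Buffer
w := quotedprintable.NewWriter(&buf)
w.Write([]byte("¡Hola, señor!"))
w.Close() // flush the encoder

fmt.Printf("encoded: %s\n", buf.String())

decoded, err := io.ReadAll(quotedprintable.NewReader(&buf))
if err != nil {
	log.Fatal(err)
}
fmt.Printf("decoded: %s\n", decoded)
```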
type Writer 1.5 --------------- A Writer is a quoted-printable writer that implements io.WriteCloser. ``` type Writer struct { // Binary mode treats the writer's input as pure binary and processes end of // line bytes as binary data. Binary bool // contains filtered or unexported fields } ``` ### func NewWriter 1.5 ``` func NewWriter(w io.Writer) *Writer ``` NewWriter returns a new Writer that writes to w. #### Example Code: ``` w := quotedprintable.NewWriter(os.Stdout) w.Write([]byte("These symbols will be escaped: = \t")) w.Close() ``` Output: ``` These symbols will be escaped: =3D =09 ``` ### func (\*Writer) Close 1.5 ``` func (w *Writer) Close() error ``` Close closes the Writer, flushing any unwritten data to the underlying io.Writer, but does not close the underlying io.Writer. ### func (\*Writer) Write 1.5 ``` func (w *Writer) Write(p []byte) (n int, err error) ``` Write encodes p using quoted-printable encoding and writes it to the underlying io.Writer. It limits line length to 76 characters. The encoded bytes are not necessarily flushed until the Writer is closed. go Package os Package os =========== * `import "os"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package os provides a platform-independent interface to operating system functionality. The design is Unix-like, although the error handling is Go-like; failing calls return values of type error rather than error numbers. Often, more information is available within the error. For example, if a call that takes a file name fails, such as Open or Stat, the error will include the failing file name when printed and will be of type \*PathError, which may be unpacked for more information. The os interface is intended to be uniform across all operating systems. Features not generally available appear in the system-specific package syscall. Here is a simple example, opening a file and reading some of it. ``` file, err := os.Open("file.go") // For read access. if err != nil { log.Fatal(err) } ``` If the open fails, the error string will be self-explanatory, like ``` open file.go: no such file or directory ``` The file's data can then be read into a slice of bytes. Read and Write take their byte counts from the length of the argument slice. ``` data := make([]byte, 100) count, err := file.Read(data) if err != nil { log.Fatal(err) } fmt.Printf("read %d bytes: %q\n", count, data[:count]) ``` Note: The maximum number of concurrent operations on a File may be limited by the OS or the system. The number should be high, but exceeding it may degrade performance or cause other issues. 
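Putting the two fragments above together, a minimal complete sketch of the open-and-read pattern (adding the Close call the fragments omit) looks like this:

```
file, err := os.Open("file.go") // For read access.
if err != nil {
	log.Fatal(err)
}
defer file.Close()

data := make([]byte, 100)
count, err := file.Read(data)
if err != nil {
	log.Fatal(err)
}
fmt.Printf("read %d bytes: %q\n", count, data[:count])
```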
Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func Chdir(dir string) error](#Chdir) * [func Chmod(name string, mode FileMode) error](#Chmod) * [func Chown(name string, uid, gid int) error](#Chown) * [func Chtimes(name string, atime time.Time, mtime time.Time) error](#Chtimes) * [func Clearenv()](#Clearenv) * [func DirFS(dir string) fs.FS](#DirFS) * [func Environ() []string](#Environ) * [func Executable() (string, error)](#Executable) * [func Exit(code int)](#Exit) * [func Expand(s string, mapping func(string) string) string](#Expand) * [func ExpandEnv(s string) string](#ExpandEnv) * [func Getegid() int](#Getegid) * [func Getenv(key string) string](#Getenv) * [func Geteuid() int](#Geteuid) * [func Getgid() int](#Getgid) * [func Getgroups() ([]int, error)](#Getgroups) * [func Getpagesize() int](#Getpagesize) * [func Getpid() int](#Getpid) * [func Getppid() int](#Getppid) * [func Getuid() int](#Getuid) * [func Getwd() (dir string, err error)](#Getwd) * [func Hostname() (name string, err error)](#Hostname) * [func IsExist(err error) bool](#IsExist) * [func IsNotExist(err error) bool](#IsNotExist) * [func IsPathSeparator(c uint8) bool](#IsPathSeparator) * [func IsPermission(err error) bool](#IsPermission) * [func IsTimeout(err error) bool](#IsTimeout) * [func Lchown(name string, uid, gid int) error](#Lchown) * [func Link(oldname, newname string) error](#Link) * [func LookupEnv(key string) (string, bool)](#LookupEnv) * [func Mkdir(name string, perm FileMode) error](#Mkdir) * [func MkdirAll(path string, perm FileMode) error](#MkdirAll) * [func MkdirTemp(dir, pattern string) (string, error)](#MkdirTemp) * [func NewSyscallError(syscall string, err error) error](#NewSyscallError) * [func Pipe() (r \*File, w \*File, err error)](#Pipe) * [func ReadFile(name string) ([]byte, error)](#ReadFile) * [func Readlink(name string) (string, error)](#Readlink) * [func Remove(name string) error](#Remove) * [func RemoveAll(path string) error](#RemoveAll) * [func Rename(oldpath, newpath string) error](#Rename) * [func SameFile(fi1, fi2 FileInfo) bool](#SameFile) * [func Setenv(key, value string) error](#Setenv) * [func Symlink(oldname, newname string) error](#Symlink) * [func TempDir() string](#TempDir) * [func Truncate(name string, size int64) error](#Truncate) * [func Unsetenv(key string) error](#Unsetenv) * [func UserCacheDir() (string, error)](#UserCacheDir) * [func UserConfigDir() (string, error)](#UserConfigDir) * [func UserHomeDir() (string, error)](#UserHomeDir) * [func WriteFile(name string, data []byte, perm FileMode) error](#WriteFile) * [type DirEntry](#DirEntry) * [func ReadDir(name string) ([]DirEntry, error)](#ReadDir) * [type File](#File) * [func Create(name string) (\*File, error)](#Create) * [func CreateTemp(dir, pattern string) (\*File, error)](#CreateTemp) * [func NewFile(fd uintptr, name string) \*File](#NewFile) * [func Open(name string) (\*File, error)](#Open) * [func OpenFile(name string, flag int, perm FileMode) (\*File, error)](#OpenFile) * [func (f \*File) Chdir() error](#File.Chdir) * [func (f \*File) Chmod(mode FileMode) error](#File.Chmod) * [func (f \*File) Chown(uid, gid int) error](#File.Chown) * [func (f \*File) Close() error](#File.Close) * [func (f \*File) Fd() uintptr](#File.Fd) * [func (f \*File) Name() string](#File.Name) * [func (f \*File) Read(b []byte) (n int, err error)](#File.Read) * [func (f \*File) ReadAt(b []byte, off int64) (n int, err error)](#File.ReadAt) * [func (f \*File) ReadDir(n int) ([]DirEntry, error)](#File.ReadDir) * [func (f 
\*File) ReadFrom(r io.Reader) (n int64, err error)](#File.ReadFrom) * [func (f \*File) Readdir(n int) ([]FileInfo, error)](#File.Readdir) * [func (f \*File) Readdirnames(n int) (names []string, err error)](#File.Readdirnames) * [func (f \*File) Seek(offset int64, whence int) (ret int64, err error)](#File.Seek) * [func (f \*File) SetDeadline(t time.Time) error](#File.SetDeadline) * [func (f \*File) SetReadDeadline(t time.Time) error](#File.SetReadDeadline) * [func (f \*File) SetWriteDeadline(t time.Time) error](#File.SetWriteDeadline) * [func (f \*File) Stat() (FileInfo, error)](#File.Stat) * [func (f \*File) Sync() error](#File.Sync) * [func (f \*File) SyscallConn() (syscall.RawConn, error)](#File.SyscallConn) * [func (f \*File) Truncate(size int64) error](#File.Truncate) * [func (f \*File) Write(b []byte) (n int, err error)](#File.Write) * [func (f \*File) WriteAt(b []byte, off int64) (n int, err error)](#File.WriteAt) * [func (f \*File) WriteString(s string) (n int, err error)](#File.WriteString) * [type FileInfo](#FileInfo) * [func Lstat(name string) (FileInfo, error)](#Lstat) * [func Stat(name string) (FileInfo, error)](#Stat) * [type FileMode](#FileMode) * [type LinkError](#LinkError) * [func (e \*LinkError) Error() string](#LinkError.Error) * [func (e \*LinkError) Unwrap() error](#LinkError.Unwrap) * [type PathError](#PathError) * [type ProcAttr](#ProcAttr) * [type Process](#Process) * [func FindProcess(pid int) (\*Process, error)](#FindProcess) * [func StartProcess(name string, argv []string, attr \*ProcAttr) (\*Process, error)](#StartProcess) * [func (p \*Process) Kill() error](#Process.Kill) * [func (p \*Process) Release() error](#Process.Release) * [func (p \*Process) Signal(sig Signal) error](#Process.Signal) * [func (p \*Process) Wait() (\*ProcessState, error)](#Process.Wait) * [type ProcessState](#ProcessState) * [func (p \*ProcessState) ExitCode() int](#ProcessState.ExitCode) * [func (p \*ProcessState) Exited() bool](#ProcessState.Exited) * [func (p \*ProcessState) Pid() int](#ProcessState.Pid) * [func (p \*ProcessState) String() string](#ProcessState.String) * [func (p \*ProcessState) Success() bool](#ProcessState.Success) * [func (p \*ProcessState) Sys() any](#ProcessState.Sys) * [func (p \*ProcessState) SysUsage() any](#ProcessState.SysUsage) * [func (p \*ProcessState) SystemTime() time.Duration](#ProcessState.SystemTime) * [func (p \*ProcessState) UserTime() time.Duration](#ProcessState.UserTime) * [type Signal](#Signal) * [type SyscallError](#SyscallError) * [func (e \*SyscallError) Error() string](#SyscallError.Error) * [func (e \*SyscallError) Timeout() bool](#SyscallError.Timeout) * [func (e \*SyscallError) Unwrap() error](#SyscallError.Unwrap) ### Examples [Chmod](#example_Chmod) [Chtimes](#example_Chtimes) [CreateTemp](#example_CreateTemp) [CreateTemp (Suffix)](#example_CreateTemp_suffix) [ErrNotExist](#example_ErrNotExist) [Expand](#example_Expand) [ExpandEnv](#example_ExpandEnv) [FileMode](#example_FileMode) [Getenv](#example_Getenv) [LookupEnv](#example_LookupEnv) [Mkdir](#example_Mkdir) [MkdirAll](#example_MkdirAll) [MkdirTemp](#example_MkdirTemp) [MkdirTemp (Suffix)](#example_MkdirTemp_suffix) [OpenFile](#example_OpenFile) [OpenFile (Append)](#example_OpenFile_append) [ReadDir](#example_ReadDir) [ReadFile](#example_ReadFile) [Unsetenv](#example_Unsetenv) [WriteFile](#example_WriteFile) ### Package files dir.go dir\_unix.go dirent\_linux.go endian\_little.go env.go error.go error\_errno.go error\_posix.go exec.go exec\_posix.go exec\_unix.go executable.go 
executable\_procfs.go file.go file\_posix.go file\_unix.go getwd.go path.go path\_unix.go pipe2\_unix.go proc.go rawconn.go readfrom\_linux.go removeall\_at.go rlimit.go rlimit\_stub.go stat.go stat\_linux.go stat\_unix.go sticky\_notbsd.go str.go sys.go sys\_linux.go sys\_unix.go tempfile.go types.go types\_unix.go wait\_waitid.go Constants --------- Flags to OpenFile wrapping those of the underlying system. Not all flags may be implemented on a given system. ``` const ( // Exactly one of O_RDONLY, O_WRONLY, or O_RDWR must be specified. O_RDONLY int = syscall.O_RDONLY // open the file read-only. O_WRONLY int = syscall.O_WRONLY // open the file write-only. O_RDWR int = syscall.O_RDWR // open the file read-write. // The remaining values may be or'ed in to control behavior. O_APPEND int = syscall.O_APPEND // append data to the file when writing. O_CREATE int = syscall.O_CREAT // create a new file if none exists. O_EXCL int = syscall.O_EXCL // used with O_CREATE, file must not exist. O_SYNC int = syscall.O_SYNC // open for synchronous I/O. O_TRUNC int = syscall.O_TRUNC // truncate regular writable file when opened. ) ``` Seek whence values. Deprecated: Use io.SeekStart, io.SeekCurrent, and io.SeekEnd. ``` const ( SEEK_SET int = 0 // seek relative to the origin of the file SEEK_CUR int = 1 // seek relative to the current offset SEEK_END int = 2 // seek relative to the end ) ``` ``` const ( PathSeparator = '/' // OS-specific path separator PathListSeparator = ':' // OS-specific path list separator ) ``` The defined file mode bits are the most significant bits of the FileMode. The nine least-significant bits are the standard Unix rwxrwxrwx permissions. The values of these bits should be considered part of the public API and may be used in wire protocols or disk representations: they must not be changed, although new bits might be added. ``` const ( // The single letters are the abbreviations // used by the String method's formatting. ModeDir = fs.ModeDir // d: is a directory ModeAppend = fs.ModeAppend // a: append-only ModeExclusive = fs.ModeExclusive // l: exclusive use ModeTemporary = fs.ModeTemporary // T: temporary file; Plan 9 only ModeSymlink = fs.ModeSymlink // L: symbolic link ModeDevice = fs.ModeDevice // D: device file ModeNamedPipe = fs.ModeNamedPipe // p: named pipe (FIFO) ModeSocket = fs.ModeSocket // S: Unix domain socket ModeSetuid = fs.ModeSetuid // u: setuid ModeSetgid = fs.ModeSetgid // g: setgid ModeCharDevice = fs.ModeCharDevice // c: Unix character device, when ModeDevice is set ModeSticky = fs.ModeSticky // t: sticky ModeIrregular = fs.ModeIrregular // ?: non-regular file; nothing else is known about this file // Mask for the type bits. For regular files, none will be set. ModeType = fs.ModeType ModePerm = fs.ModePerm // Unix permission bits, 0o777 ) ``` DevNull is the name of the operating system's “null device.” On Unix-like systems, it is "/dev/null"; on Windows, "NUL". ``` const DevNull = "/dev/null" ``` Variables --------- Portable analogs of some common system call errors. Errors returned from this package may be tested against these errors with errors.Is. ``` var ( // ErrInvalid indicates an invalid argument. // Methods on File will return this error when the receiver is nil. 
ErrInvalid = fs.ErrInvalid // "invalid argument" ErrPermission = fs.ErrPermission // "permission denied" ErrExist = fs.ErrExist // "file already exists" ErrNotExist = fs.ErrNotExist // "file does not exist" ErrClosed = fs.ErrClosed // "file already closed" ErrNoDeadline = errNoDeadline() // "file type does not support deadline" ErrDeadlineExceeded = errDeadlineExceeded() // "i/o timeout" ) ``` Stdin, Stdout, and Stderr are open Files pointing to the standard input, standard output, and standard error file descriptors. Note that the Go runtime writes to standard error for panics and crashes; closing Stderr may cause those messages to go elsewhere, perhaps to a file opened later. ``` var ( Stdin = NewFile(uintptr(syscall.Stdin), "/dev/stdin") Stdout = NewFile(uintptr(syscall.Stdout), "/dev/stdout") Stderr = NewFile(uintptr(syscall.Stderr), "/dev/stderr") ) ``` Args hold the command-line arguments, starting with the program name. ``` var Args []string ``` ErrProcessDone indicates a Process has finished. ``` var ErrProcessDone = errors.New("os: process already finished") ``` func Chdir ---------- ``` func Chdir(dir string) error ``` Chdir changes the current working directory to the named directory. If there is an error, it will be of type \*PathError. func Chmod ---------- ``` func Chmod(name string, mode FileMode) error ``` Chmod changes the mode of the named file to mode. If the file is a symbolic link, it changes the mode of the link's target. If there is an error, it will be of type \*PathError. A different subset of the mode bits are used, depending on the operating system. On Unix, the mode's permission bits, ModeSetuid, ModeSetgid, and ModeSticky are used. On Windows, only the 0200 bit (owner writable) of mode is used; it controls whether the file's read-only attribute is set or cleared. The other bits are currently unused. For compatibility with Go 1.12 and earlier, use a non-zero mode. Use mode 0400 for a read-only file and 0600 for a readable+writable file. On Plan 9, the mode's permission bits, ModeAppend, ModeExclusive, and ModeTemporary are used. #### Example Code: ``` if err := os.Chmod("some-filename", 0644); err != nil { log.Fatal(err) } ``` func Chown ---------- ``` func Chown(name string, uid, gid int) error ``` Chown changes the numeric uid and gid of the named file. If the file is a symbolic link, it changes the uid and gid of the link's target. A uid or gid of -1 means to not change that value. If there is an error, it will be of type \*PathError. On Windows or Plan 9, Chown always returns the syscall.EWINDOWS or EPLAN9 error, wrapped in \*PathError. func Chtimes ------------ ``` func Chtimes(name string, atime time.Time, mtime time.Time) error ``` Chtimes changes the access and modification times of the named file, similar to the Unix utime() or utimes() functions. The underlying filesystem may truncate or round the values to a less precise time unit. If there is an error, it will be of type \*PathError. #### Example Code: ``` mtime := time.Date(2006, time.February, 1, 3, 4, 5, 0, time.UTC) atime := time.Date(2007, time.March, 2, 4, 5, 6, 0, time.UTC) if err := os.Chtimes("some-filename", atime, mtime); err != nil { log.Fatal(err) } ``` func Clearenv ------------- ``` func Clearenv() ``` Clearenv deletes all environment variables. func DirFS 1.16 --------------- ``` func DirFS(dir string) fs.FS ``` DirFS returns a file system (an fs.FS) for the tree of files rooted at the directory dir. 
Note that DirFS("/prefix") only guarantees that the Open calls it makes to the operating system will begin with "/prefix": DirFS("/prefix").Open("file") is the same as os.Open("/prefix/file"). So if /prefix/file is a symbolic link pointing outside the /prefix tree, then using DirFS does not stop the access any more than using os.Open does. Additionally, the root of the fs.FS returned for a relative path, DirFS("prefix"), will be affected by later calls to Chdir. DirFS is therefore not a general substitute for a chroot-style security mechanism when the directory tree contains arbitrary content. The directory dir must not be "". The result implements fs.StatFS. func Environ ------------ ``` func Environ() []string ``` Environ returns a copy of strings representing the environment, in the form "key=value". func Executable 1.8 ------------------- ``` func Executable() (string, error) ``` Executable returns the path name for the executable that started the current process. There is no guarantee that the path is still pointing to the correct executable. If a symlink was used to start the process, depending on the operating system, the result might be the symlink or the path it pointed to. If a stable result is needed, path/filepath.EvalSymlinks might help. Executable returns an absolute path unless an error occurred. The main use case is finding resources located relative to an executable. func Exit --------- ``` func Exit(code int) ``` Exit causes the current program to exit with the given status code. Conventionally, code zero indicates success, non-zero an error. The program terminates immediately; deferred functions are not run. For portability, the status code should be in the range [0, 125]. func Expand ----------- ``` func Expand(s string, mapping func(string) string) string ``` Expand replaces ${var} or $var in the string based on the mapping function. For example, os.ExpandEnv(s) is equivalent to os.Expand(s, os.Getenv). #### Example Code: ``` mapper := func(placeholderName string) string { switch placeholderName { case "DAY_PART": return "morning" case "NAME": return "Gopher" } return "" } fmt.Println(os.Expand("Good ${DAY_PART}, $NAME!", mapper)) ``` Output: ``` Good morning, Gopher! ``` func ExpandEnv -------------- ``` func ExpandEnv(s string) string ``` ExpandEnv replaces ${var} or $var in the string according to the values of the current environment variables. References to undefined variables are replaced by the empty string. #### Example Code: ``` os.Setenv("NAME", "gopher") os.Setenv("BURROW", "/usr/gopher") fmt.Println(os.ExpandEnv("$NAME lives in ${BURROW}.")) ``` Output: ``` gopher lives in /usr/gopher. ``` func Getegid ------------ ``` func Getegid() int ``` Getegid returns the numeric effective group id of the caller. On Windows, it returns -1. func Getenv ----------- ``` func Getenv(key string) string ``` Getenv retrieves the value of the environment variable named by the key. It returns the value, which will be empty if the variable is not present. To distinguish between an empty value and an unset value, use LookupEnv. #### Example Code: ``` os.Setenv("NAME", "gopher") os.Setenv("BURROW", "/usr/gopher") fmt.Printf("%s lives in %s.\n", os.Getenv("NAME"), os.Getenv("BURROW")) ``` Output: ``` gopher lives in /usr/gopher. ``` func Geteuid ------------ ``` func Geteuid() int ``` Geteuid returns the numeric effective user id of the caller. On Windows, it returns -1. func Getgid ----------- ``` func Getgid() int ``` Getgid returns the numeric group id of the caller. 
On Windows, it returns -1. func Getgroups -------------- ``` func Getgroups() ([]int, error) ``` Getgroups returns a list of the numeric ids of groups that the caller belongs to. On Windows, it returns syscall.EWINDOWS. See the os/user package for a possible alternative. func Getpagesize ---------------- ``` func Getpagesize() int ``` Getpagesize returns the underlying system's memory page size. func Getpid ----------- ``` func Getpid() int ``` Getpid returns the process id of the caller. func Getppid ------------ ``` func Getppid() int ``` Getppid returns the process id of the caller's parent. func Getuid ----------- ``` func Getuid() int ``` Getuid returns the numeric user id of the caller. On Windows, it returns -1. func Getwd ---------- ``` func Getwd() (dir string, err error) ``` Getwd returns a rooted path name corresponding to the current directory. If the current directory can be reached via multiple paths (due to symbolic links), Getwd may return any one of them. func Hostname ------------- ``` func Hostname() (name string, err error) ``` Hostname returns the host name reported by the kernel. func IsExist ------------ ``` func IsExist(err error) bool ``` IsExist returns a boolean indicating whether the error is known to report that a file or directory already exists. It is satisfied by ErrExist as well as some syscall errors. This function predates errors.Is. It only supports errors returned by the os package. New code should use errors.Is(err, fs.ErrExist). func IsNotExist --------------- ``` func IsNotExist(err error) bool ``` IsNotExist returns a boolean indicating whether the error is known to report that a file or directory does not exist. It is satisfied by ErrNotExist as well as some syscall errors. This function predates errors.Is. It only supports errors returned by the os package. New code should use errors.Is(err, fs.ErrNotExist). func IsPathSeparator -------------------- ``` func IsPathSeparator(c uint8) bool ``` IsPathSeparator reports whether c is a directory separator character. func IsPermission ----------------- ``` func IsPermission(err error) bool ``` IsPermission returns a boolean indicating whether the error is known to report that permission is denied. It is satisfied by ErrPermission as well as some syscall errors. This function predates errors.Is. It only supports errors returned by the os package. New code should use errors.Is(err, fs.ErrPermission). func IsTimeout 1.10 ------------------- ``` func IsTimeout(err error) bool ``` IsTimeout returns a boolean indicating whether the error is known to report that a timeout occurred. This function predates errors.Is, and the notion of whether an error indicates a timeout can be ambiguous. For example, the Unix error EWOULDBLOCK sometimes indicates a timeout and sometimes does not. New code should use errors.Is with a value appropriate to the call returning the error, such as os.ErrDeadlineExceeded. func Lchown ----------- ``` func Lchown(name string, uid, gid int) error ``` Lchown changes the numeric uid and gid of the named file. If the file is a symbolic link, it changes the uid and gid of the link itself. If there is an error, it will be of type \*PathError. On Windows, it always returns the syscall.EWINDOWS error, wrapped in \*PathError. func Link --------- ``` func Link(oldname, newname string) error ``` Link creates newname as a hard link to the oldname file. If there is an error, it will be of type \*LinkError. 
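Link has no example here; a minimal sketch (file names are placeholders, using the errors and io/fs packages) that also shows the errors.Is check recommended above for the already-exists case:

```
if err := os.Link("data.txt", "data-hardlink.txt"); err != nil {
	if errors.Is(err, fs.ErrExist) {
		fmt.Println("hard link already exists")
	} else {
		log.Fatal(err)
	}
}
```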
func LookupEnv 1.5 ------------------ ``` func LookupEnv(key string) (string, bool) ``` LookupEnv retrieves the value of the environment variable named by the key. If the variable is present in the environment the value (which may be empty) is returned and the boolean is true. Otherwise the returned value will be empty and the boolean will be false. #### Example Code: ``` show := func(key string) { val, ok := os.LookupEnv(key) if !ok { fmt.Printf("%s not set\n", key) } else { fmt.Printf("%s=%s\n", key, val) } } os.Setenv("SOME_KEY", "value") os.Setenv("EMPTY_KEY", "") show("SOME_KEY") show("EMPTY_KEY") show("MISSING_KEY") ``` Output: ``` SOME_KEY=value EMPTY_KEY= MISSING_KEY not set ``` func Mkdir ---------- ``` func Mkdir(name string, perm FileMode) error ``` Mkdir creates a new directory with the specified name and permission bits (before umask). If there is an error, it will be of type \*PathError. #### Example Code: ``` err := os.Mkdir("testdir", 0750) if err != nil && !os.IsExist(err) { log.Fatal(err) } err = os.WriteFile("testdir/testfile.txt", []byte("Hello, Gophers!"), 0660) if err != nil { log.Fatal(err) } ``` func MkdirAll ------------- ``` func MkdirAll(path string, perm FileMode) error ``` MkdirAll creates a directory named path, along with any necessary parents, and returns nil, or else returns an error. The permission bits perm (before umask) are used for all directories that MkdirAll creates. If path is already a directory, MkdirAll does nothing and returns nil. #### Example Code: ``` err := os.MkdirAll("test/subdir", 0750) if err != nil && !os.IsExist(err) { log.Fatal(err) } err = os.WriteFile("test/subdir/testfile.txt", []byte("Hello, Gophers!"), 0660) if err != nil { log.Fatal(err) } ``` func MkdirTemp 1.16 ------------------- ``` func MkdirTemp(dir, pattern string) (string, error) ``` MkdirTemp creates a new temporary directory in the directory dir and returns the pathname of the new directory. The new directory's name is generated by adding a random string to the end of pattern. If pattern includes a "\*", the random string replaces the last "\*" instead. If dir is the empty string, MkdirTemp uses the default directory for temporary files, as returned by TempDir. Multiple programs or goroutines calling MkdirTemp simultaneously will not choose the same directory. It is the caller's responsibility to remove the directory when it is no longer needed. #### Example Code: ``` dir, err := os.MkdirTemp("", "example") if err != nil { log.Fatal(err) } defer os.RemoveAll(dir) // clean up file := filepath.Join(dir, "tmpfile") if err := os.WriteFile(file, []byte("content"), 0666); err != nil { log.Fatal(err) } ``` #### Example (Suffix) Code: ``` logsDir, err := os.MkdirTemp("", "*-logs") if err != nil { log.Fatal(err) } defer os.RemoveAll(logsDir) // clean up // Logs can be cleaned out earlier if needed by searching // for all directories whose suffix ends in *-logs. globPattern := filepath.Join(os.TempDir(), "*-logs") matches, err := filepath.Glob(globPattern) if err != nil { log.Fatalf("Failed to match %q: %v", globPattern, err) } for _, match := range matches { if err := os.RemoveAll(match); err != nil { log.Printf("Failed to remove %q: %v", match, err) } } ``` func NewSyscallError -------------------- ``` func NewSyscallError(syscall string, err error) error ``` NewSyscallError returns, as an error, a new SyscallError with the given system call name and error details. As a convenience, if err is nil, NewSyscallError returns nil. 
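NewSyscallError is typically used to wrap an error returned by a raw call in package syscall; a minimal, Unix-only sketch (the path is a placeholder):

```
fd, err := syscall.Open("/no/such/file", syscall.O_RDONLY, 0)
// NewSyscallError returns nil when err is nil, so wrapping is safe on success too.
if err := os.NewSyscallError("open", err); err != nil {
	log.Fatal(err) // e.g. "open: no such file or directory"
}
defer syscall.Close(fd)
```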
func Pipe --------- ``` func Pipe() (r *File, w *File, err error) ``` Pipe returns a connected pair of Files; reads from r return bytes written to w. It returns the files and an error, if any. func ReadFile 1.16 ------------------ ``` func ReadFile(name string) ([]byte, error) ``` ReadFile reads the named file and returns the contents. A successful call returns err == nil, not err == EOF. Because ReadFile reads the whole file, it does not treat an EOF from Read as an error to be reported. #### Example Code: ``` data, err := os.ReadFile("testdata/hello") if err != nil { log.Fatal(err) } os.Stdout.Write(data) ``` Output: ``` Hello, Gophers! ``` func Readlink ------------- ``` func Readlink(name string) (string, error) ``` Readlink returns the destination of the named symbolic link. If there is an error, it will be of type \*PathError. func Remove ----------- ``` func Remove(name string) error ``` Remove removes the named file or (empty) directory. If there is an error, it will be of type \*PathError. func RemoveAll -------------- ``` func RemoveAll(path string) error ``` RemoveAll removes path and any children it contains. It removes everything it can but returns the first error it encounters. If the path does not exist, RemoveAll returns nil (no error). If there is an error, it will be of type \*PathError. func Rename ----------- ``` func Rename(oldpath, newpath string) error ``` Rename renames (moves) oldpath to newpath. If newpath already exists and is not a directory, Rename replaces it. OS-specific restrictions may apply when oldpath and newpath are in different directories. Even within the same directory, on non-Unix platforms Rename is not an atomic operation. If there is an error, it will be of type \*LinkError. func SameFile ------------- ``` func SameFile(fi1, fi2 FileInfo) bool ``` SameFile reports whether fi1 and fi2 describe the same file. For example, on Unix this means that the device and inode fields of the two underlying structures are identical; on other systems the decision may be based on the path names. SameFile only applies to results returned by this package's Stat. It returns false in other cases. func Setenv ----------- ``` func Setenv(key, value string) error ``` Setenv sets the value of the environment variable named by the key. It returns an error, if any. func Symlink ------------ ``` func Symlink(oldname, newname string) error ``` Symlink creates newname as a symbolic link to oldname. On Windows, a symlink to a non-existent oldname creates a file symlink; if oldname is later created as a directory the symlink will not work. If there is an error, it will be of type \*LinkError. func TempDir ------------ ``` func TempDir() string ``` TempDir returns the default directory to use for temporary files. On Unix systems, it returns $TMPDIR if non-empty, else /tmp. On Windows, it uses GetTempPath, returning the first non-empty value from %TMP%, %TEMP%, %USERPROFILE%, or the Windows directory. On Plan 9, it returns /tmp. The directory is neither guaranteed to exist nor have accessible permissions. func Truncate ------------- ``` func Truncate(name string, size int64) error ``` Truncate changes the size of the named file. If the file is a symbolic link, it changes the size of the link's target. If there is an error, it will be of type \*PathError. func Unsetenv 1.4 ----------------- ``` func Unsetenv(key string) error ``` Unsetenv unsets a single environment variable. 
#### Example Code: ``` os.Setenv("TMPDIR", "/my/tmp") defer os.Unsetenv("TMPDIR") ``` func UserCacheDir 1.11 ---------------------- ``` func UserCacheDir() (string, error) ``` UserCacheDir returns the default root directory to use for user-specific cached data. Users should create their own application-specific subdirectory within this one and use that. On Unix systems, it returns $XDG\_CACHE\_HOME as specified by <https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html> if non-empty, else $HOME/.cache. On Darwin, it returns $HOME/Library/Caches. On Windows, it returns %LocalAppData%. On Plan 9, it returns $home/lib/cache. If the location cannot be determined (for example, $HOME is not defined), then it will return an error. func UserConfigDir 1.13 ----------------------- ``` func UserConfigDir() (string, error) ``` UserConfigDir returns the default root directory to use for user-specific configuration data. Users should create their own application-specific subdirectory within this one and use that. On Unix systems, it returns $XDG\_CONFIG\_HOME as specified by <https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html> if non-empty, else $HOME/.config. On Darwin, it returns $HOME/Library/Application Support. On Windows, it returns %AppData%. On Plan 9, it returns $home/lib. If the location cannot be determined (for example, $HOME is not defined), then it will return an error. func UserHomeDir 1.12 --------------------- ``` func UserHomeDir() (string, error) ``` UserHomeDir returns the current user's home directory. On Unix, including macOS, it returns the $HOME environment variable. On Windows, it returns %USERPROFILE%. On Plan 9, it returns the $home environment variable. func WriteFile 1.16 ------------------- ``` func WriteFile(name string, data []byte, perm FileMode) error ``` WriteFile writes data to the named file, creating it if necessary. If the file does not exist, WriteFile creates it with permissions perm (before umask); otherwise WriteFile truncates it before writing, without changing permissions. Since WriteFile requires multiple system calls to complete, a failure mid-operation can leave the file in a partially written state. #### Example Code: ``` err := os.WriteFile("testdata/hello", []byte("Hello, Gophers!"), 0666) if err != nil { log.Fatal(err) } ``` type DirEntry 1.16 ------------------ A DirEntry is an entry read from a directory (using the ReadDir function or a File's ReadDir method). ``` type DirEntry = fs.DirEntry ``` ### func ReadDir 1.16 ``` func ReadDir(name string) ([]DirEntry, error) ``` ReadDir reads the named directory, returning all its directory entries sorted by filename. If an error occurs reading the directory, ReadDir returns the entries it was able to read before the error, along with the error. #### Example Code: ``` files, err := os.ReadDir(".") if err != nil { log.Fatal(err) } for _, file := range files { fmt.Println(file.Name()) } ``` type File --------- File represents an open file descriptor. ``` type File struct { // contains filtered or unexported fields } ``` ### func Create ``` func Create(name string) (*File, error) ``` Create creates or truncates the named file. If the file already exists, it is truncated. If the file does not exist, it is created with mode 0666 (before umask). If successful, methods on the returned File can be used for I/O; the associated file descriptor has mode O\_RDWR. If there is an error, it will be of type \*PathError.
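Create has no example above. A minimal sketch of typical use follows (the file name is illustrative; the os and log imports used by the other examples are assumed):

```
f, err := os.Create("testdata/out.txt")
if err != nil {
	log.Fatal(err)
}
defer f.Close()
if _, err := f.WriteString("Hello, Gophers!\n"); err != nil {
	log.Fatal(err)
}
```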
### func CreateTemp 1.16 ``` func CreateTemp(dir, pattern string) (*File, error) ``` CreateTemp creates a new temporary file in the directory dir, opens the file for reading and writing, and returns the resulting file. The filename is generated by taking pattern and adding a random string to the end. If pattern includes a "\*", the random string replaces the last "\*". If dir is the empty string, CreateTemp uses the default directory for temporary files, as returned by TempDir. Multiple programs or goroutines calling CreateTemp simultaneously will not choose the same file. The caller can use the file's Name method to find the pathname of the file. It is the caller's responsibility to remove the file when it is no longer needed. #### Example Code: ``` f, err := os.CreateTemp("", "example") if err != nil { log.Fatal(err) } defer os.Remove(f.Name()) // clean up if _, err := f.Write([]byte("content")); err != nil { log.Fatal(err) } if err := f.Close(); err != nil { log.Fatal(err) } ``` #### Example (Suffix) Code: ``` f, err := os.CreateTemp("", "example.*.txt") if err != nil { log.Fatal(err) } defer os.Remove(f.Name()) // clean up if _, err := f.Write([]byte("content")); err != nil { f.Close() log.Fatal(err) } if err := f.Close(); err != nil { log.Fatal(err) } ``` ### func NewFile ``` func NewFile(fd uintptr, name string) *File ``` NewFile returns a new File with the given file descriptor and name. The returned value will be nil if fd is not a valid file descriptor. On Unix systems, if the file descriptor is in non-blocking mode, NewFile will attempt to return a pollable File (one for which the SetDeadline methods work). After passing it to NewFile, fd may become invalid under the same conditions described in the comments of the Fd method, and the same constraints apply. ### func Open ``` func Open(name string) (*File, error) ``` Open opens the named file for reading. If successful, methods on the returned file can be used for reading; the associated file descriptor has mode O\_RDONLY. If there is an error, it will be of type \*PathError. ### func OpenFile ``` func OpenFile(name string, flag int, perm FileMode) (*File, error) ``` OpenFile is the generalized open call; most users will use Open or Create instead. It opens the named file with specified flag (O\_RDONLY etc.). If the file does not exist, and the O\_CREATE flag is passed, it is created with mode perm (before umask). If successful, methods on the returned File can be used for I/O. If there is an error, it will be of type \*PathError. #### Example Code: ``` f, err := os.OpenFile("notes.txt", os.O_RDWR|os.O_CREATE, 0755) if err != nil { log.Fatal(err) } if err := f.Close(); err != nil { log.Fatal(err) } ``` #### Example (Append) Code: ``` // If the file doesn't exist, create it, or append to the file f, err := os.OpenFile("access.log", os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0644) if err != nil { log.Fatal(err) } if _, err := f.Write([]byte("appended some data\n")); err != nil { f.Close() // ignore error; Write error takes precedence log.Fatal(err) } if err := f.Close(); err != nil { log.Fatal(err) } ``` ### func (\*File) Chdir ``` func (f *File) Chdir() error ``` Chdir changes the current working directory to the file, which must be a directory. If there is an error, it will be of type \*PathError. ### func (\*File) Chmod ``` func (f *File) Chmod(mode FileMode) error ``` Chmod changes the mode of the file to mode. If there is an error, it will be of type \*PathError. 
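File.Chmod has no example above. A minimal sketch follows (the file name is hypothetical; the os and log imports used by the other examples are assumed):

```
f, err := os.Open("some-filename")
if err != nil {
	log.Fatal(err)
}
defer f.Close()
// Leave the file readable by everyone but writable only by the owner.
if err := f.Chmod(0644); err != nil {
	log.Fatal(err)
}
```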
### func (\*File) Chown ``` func (f *File) Chown(uid, gid int) error ``` Chown changes the numeric uid and gid of the named file. If there is an error, it will be of type \*PathError. On Windows, it always returns the syscall.EWINDOWS error, wrapped in \*PathError. ### func (\*File) Close ``` func (f *File) Close() error ``` Close closes the File, rendering it unusable for I/O. On files that support SetDeadline, any pending I/O operations will be canceled and return immediately with an ErrClosed error. Close will return an error if it has already been called. ### func (\*File) Fd ``` func (f *File) Fd() uintptr ``` Fd returns the integer Unix file descriptor referencing the open file. If f is closed, the file descriptor becomes invalid. If f is garbage collected, a finalizer may close the file descriptor, making it invalid; see runtime.SetFinalizer for more information on when a finalizer might be run. On Unix systems this will cause the SetDeadline methods to stop working. Because file descriptors can be reused, the returned file descriptor may only be closed through the Close method of f, or by its finalizer during garbage collection. Otherwise, during garbage collection the finalizer may close an unrelated file descriptor with the same (reused) number. As an alternative, see the f.SyscallConn method. ### func (\*File) Name ``` func (f *File) Name() string ``` Name returns the name of the file as presented to Open. ### func (\*File) Read ``` func (f *File) Read(b []byte) (n int, err error) ``` Read reads up to len(b) bytes from the File and stores them in b. It returns the number of bytes read and any error encountered. At end of file, Read returns 0, io.EOF. ### func (\*File) ReadAt ``` func (f *File) ReadAt(b []byte, off int64) (n int, err error) ``` ReadAt reads len(b) bytes from the File starting at byte offset off. It returns the number of bytes read and the error, if any. ReadAt always returns a non-nil error when n < len(b). At end of file, that error is io.EOF. ### func (\*File) ReadDir 1.16 ``` func (f *File) ReadDir(n int) ([]DirEntry, error) ``` ReadDir reads the contents of the directory associated with the file f and returns a slice of DirEntry values in directory order. Subsequent calls on the same file will yield later DirEntry records in the directory. If n > 0, ReadDir returns at most n DirEntry records. In this case, if ReadDir returns an empty slice, it will return an error explaining why. At the end of a directory, the error is io.EOF. If n <= 0, ReadDir returns all the DirEntry records remaining in the directory. When it succeeds, it returns a nil error (not io.EOF). ### func (\*File) ReadFrom 1.15 ``` func (f *File) ReadFrom(r io.Reader) (n int64, err error) ``` ReadFrom implements io.ReaderFrom. ### func (\*File) Readdir ``` func (f *File) Readdir(n int) ([]FileInfo, error) ``` Readdir reads the contents of the directory associated with file and returns a slice of up to n FileInfo values, as would be returned by Lstat, in directory order. Subsequent calls on the same file will yield further FileInfos. If n > 0, Readdir returns at most n FileInfo structures. In this case, if Readdir returns an empty slice, it will return a non-nil error explaining why. At the end of a directory, the error is io.EOF. If n <= 0, Readdir returns all the FileInfo from the directory in a single slice. In this case, if Readdir succeeds (reads all the way to the end of the directory), it returns the slice and a nil error. 
If it encounters an error before the end of the directory, Readdir returns the FileInfo read until that point and a non-nil error. Most clients are better served by the more efficient ReadDir method. ### func (\*File) Readdirnames ``` func (f *File) Readdirnames(n int) (names []string, err error) ``` Readdirnames reads the contents of the directory associated with file and returns a slice of up to n names of files in the directory, in directory order. Subsequent calls on the same file will yield further names. If n > 0, Readdirnames returns at most n names. In this case, if Readdirnames returns an empty slice, it will return a non-nil error explaining why. At the end of a directory, the error is io.EOF. If n <= 0, Readdirnames returns all the names from the directory in a single slice. In this case, if Readdirnames succeeds (reads all the way to the end of the directory), it returns the slice and a nil error. If it encounters an error before the end of the directory, Readdirnames returns the names read until that point and a non-nil error. ### func (\*File) Seek ``` func (f *File) Seek(offset int64, whence int) (ret int64, err error) ``` Seek sets the offset for the next Read or Write on file to offset, interpreted according to whence: 0 means relative to the origin of the file, 1 means relative to the current offset, and 2 means relative to the end. It returns the new offset and an error, if any. The behavior of Seek on a file opened with O\_APPEND is not specified. ### func (\*File) SetDeadline 1.10 ``` func (f *File) SetDeadline(t time.Time) error ``` SetDeadline sets the read and write deadlines for a File. It is equivalent to calling both SetReadDeadline and SetWriteDeadline. Only some kinds of files support setting a deadline. Calls to SetDeadline for files that do not support deadlines will return ErrNoDeadline. On most systems ordinary files do not support deadlines, but pipes do. A deadline is an absolute time after which I/O operations fail with an error instead of blocking. The deadline applies to all future and pending I/O, not just the immediately following call to Read or Write. After a deadline has been exceeded, the connection can be refreshed by setting a deadline in the future. If the deadline is exceeded a call to Read or Write or to other I/O methods will return an error that wraps ErrDeadlineExceeded. This can be tested using errors.Is(err, os.ErrDeadlineExceeded). That error implements the Timeout method, and calling the Timeout method will return true, but there are other possible errors for which the Timeout will return true even if the deadline has not been exceeded. An idle timeout can be implemented by repeatedly extending the deadline after successful Read or Write calls. A zero value for t means I/O operations will not time out. ### func (\*File) SetReadDeadline 1.10 ``` func (f *File) SetReadDeadline(t time.Time) error ``` SetReadDeadline sets the deadline for future Read calls and any currently-blocked Read call. A zero value for t means Read will not time out. Not all files support setting deadlines; see SetDeadline. ### func (\*File) SetWriteDeadline 1.10 ``` func (f *File) SetWriteDeadline(t time.Time) error ``` SetWriteDeadline sets the deadline for any future Write calls and any currently-blocked Write call. Even if Write times out, it may return n > 0, indicating that some of the data was successfully written. A zero value for t means Write will not time out. Not all files support setting deadlines; see SetDeadline. 
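The deadline methods have no example above. A minimal sketch follows; it uses a pipe, since pipes support deadlines on most systems while ordinary files do not, and it assumes the errors, fmt, log, os, and time imports:

```
r, w, err := os.Pipe()
if err != nil {
	log.Fatal(err)
}
defer r.Close()
defer w.Close()
// Nothing is ever written to w, so the Read below blocks until the deadline.
if err := r.SetReadDeadline(time.Now().Add(1 * time.Second)); err != nil {
	log.Fatal(err)
}
buf := make([]byte, 1)
if _, err := r.Read(buf); errors.Is(err, os.ErrDeadlineExceeded) {
	fmt.Println("read timed out")
}
```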
### func (\*File) Stat ``` func (f *File) Stat() (FileInfo, error) ``` Stat returns the FileInfo structure describing file. If there is an error, it will be of type \*PathError. ### func (\*File) Sync ``` func (f *File) Sync() error ``` Sync commits the current contents of the file to stable storage. Typically, this means flushing the file system's in-memory copy of recently written data to disk. ### func (\*File) SyscallConn 1.12 ``` func (f *File) SyscallConn() (syscall.RawConn, error) ``` SyscallConn returns a raw file. This implements the syscall.Conn interface. ### func (\*File) Truncate ``` func (f *File) Truncate(size int64) error ``` Truncate changes the size of the file. It does not change the I/O offset. If there is an error, it will be of type \*PathError. ### func (\*File) Write ``` func (f *File) Write(b []byte) (n int, err error) ``` Write writes len(b) bytes from b to the File. It returns the number of bytes written and an error, if any. Write returns a non-nil error when n != len(b). ### func (\*File) WriteAt ``` func (f *File) WriteAt(b []byte, off int64) (n int, err error) ``` WriteAt writes len(b) bytes to the File starting at byte offset off. It returns the number of bytes written and an error, if any. WriteAt returns a non-nil error when n != len(b). If file was opened with the O\_APPEND flag, WriteAt returns an error. ### func (\*File) WriteString ``` func (f *File) WriteString(s string) (n int, err error) ``` WriteString is like Write, but writes the contents of string s rather than a slice of bytes. type FileInfo ------------- A FileInfo describes a file and is returned by Stat and Lstat. ``` type FileInfo = fs.FileInfo ``` ### func Lstat ``` func Lstat(name string) (FileInfo, error) ``` Lstat returns a FileInfo describing the named file. If the file is a symbolic link, the returned FileInfo describes the symbolic link. Lstat makes no attempt to follow the link. If there is an error, it will be of type \*PathError. ### func Stat ``` func Stat(name string) (FileInfo, error) ``` Stat returns a FileInfo describing the named file. If there is an error, it will be of type \*PathError. type FileMode ------------- A FileMode represents a file's mode and permission bits. The bits have the same definition on all systems, so that information about files can be moved from one system to another portably. Not all bits apply to all systems. The only required bit is ModeDir for directories. ``` type FileMode = fs.FileMode ``` #### Example Code: ``` fi, err := os.Lstat("some-filename") if err != nil { log.Fatal(err) } fmt.Printf("permissions: %#o\n", fi.Mode().Perm()) // 0400, 0777, etc. switch mode := fi.Mode(); { case mode.IsRegular(): fmt.Println("regular file") case mode.IsDir(): fmt.Println("directory") case mode&fs.ModeSymlink != 0: fmt.Println("symbolic link") case mode&fs.ModeNamedPipe != 0: fmt.Println("named pipe") } ``` type LinkError -------------- LinkError records an error during a link or symlink or rename system call and the paths that caused it. ``` type LinkError struct { Op string Old string New string Err error } ``` ### func (\*LinkError) Error ``` func (e *LinkError) Error() string ``` ### func (\*LinkError) Unwrap 1.13 ``` func (e *LinkError) Unwrap() error ``` type PathError -------------- PathError records an error and the operation and file path that caused it. ``` type PathError = fs.PathError ``` type ProcAttr ------------- ProcAttr holds the attributes that will be applied to a new process started by StartProcess. 
``` type ProcAttr struct { // If Dir is non-empty, the child changes into the directory before // creating the process. Dir string // If Env is non-nil, it gives the environment variables for the // new process in the form returned by Environ. // If it is nil, the result of Environ will be used. Env []string // Files specifies the open files inherited by the new process. The // first three entries correspond to standard input, standard output, and // standard error. An implementation may support additional entries, // depending on the underlying operating system. A nil entry corresponds // to that file being closed when the process starts. // On Unix systems, StartProcess will change these File values // to blocking mode, which means that SetDeadline will stop working // and calling Close will not interrupt a Read or Write. Files []*File // Operating system-specific process creation attributes. // Note that setting this field means that your program // may not execute properly or even compile on some // operating systems. Sys *syscall.SysProcAttr } ``` type Process ------------ Process stores the information about a process created by StartProcess. ``` type Process struct { Pid int // contains filtered or unexported fields } ``` ### func FindProcess ``` func FindProcess(pid int) (*Process, error) ``` FindProcess looks for a running process by its pid. The Process it returns can be used to obtain information about the underlying operating system process. On Unix systems, FindProcess always succeeds and returns a Process for the given pid, regardless of whether the process exists. ### func StartProcess ``` func StartProcess(name string, argv []string, attr *ProcAttr) (*Process, error) ``` StartProcess starts a new process with the program, arguments and attributes specified by name, argv and attr. The argv slice will become os.Args in the new process, so it normally starts with the program name. If the calling goroutine has locked the operating system thread with runtime.LockOSThread and modified any inheritable OS-level thread state (for example, Linux or Plan 9 name spaces), the new process will inherit the caller's thread state. StartProcess is a low-level interface. The os/exec package provides higher-level interfaces. If there is an error, it will be of type \*PathError. ### func (\*Process) Kill ``` func (p *Process) Kill() error ``` Kill causes the Process to exit immediately. Kill does not wait until the Process has actually exited. This only kills the Process itself, not any other processes it may have started. ### func (\*Process) Release ``` func (p *Process) Release() error ``` Release releases any resources associated with the Process p, rendering it unusable in the future. Release only needs to be called if Wait is not. ### func (\*Process) Signal ``` func (p *Process) Signal(sig Signal) error ``` Signal sends a signal to the Process. Sending Interrupt on Windows is not implemented. ### func (\*Process) Wait ``` func (p *Process) Wait() (*ProcessState, error) ``` Wait waits for the Process to exit, and then returns a ProcessState describing its status and an error, if any. Wait releases any resources associated with the Process. On most operating systems, the Process must be a child of the current process or an error will be returned. type ProcessState ----------------- ProcessState stores information about a process, as reported by Wait. 
``` type ProcessState struct { // contains filtered or unexported fields } ``` ### func (\*ProcessState) ExitCode 1.12 ``` func (p *ProcessState) ExitCode() int ``` ExitCode returns the exit code of the exited process, or -1 if the process hasn't exited or was terminated by a signal. ### func (\*ProcessState) Exited ``` func (p *ProcessState) Exited() bool ``` Exited reports whether the program has exited. On Unix systems this reports true if the program exited due to calling exit, but false if the program terminated due to a signal. ### func (\*ProcessState) Pid ``` func (p *ProcessState) Pid() int ``` Pid returns the process id of the exited process. ### func (\*ProcessState) String ``` func (p *ProcessState) String() string ``` ### func (\*ProcessState) Success ``` func (p *ProcessState) Success() bool ``` Success reports whether the program exited successfully, such as with exit status 0 on Unix. ### func (\*ProcessState) Sys ``` func (p *ProcessState) Sys() any ``` Sys returns system-dependent exit information about the process. Convert it to the appropriate underlying type, such as syscall.WaitStatus on Unix, to access its contents. ### func (\*ProcessState) SysUsage ``` func (p *ProcessState) SysUsage() any ``` SysUsage returns system-dependent resource usage information about the exited process. Convert it to the appropriate underlying type, such as \*syscall.Rusage on Unix, to access its contents. (On Unix, \*syscall.Rusage matches struct rusage as defined in the getrusage(2) manual page.) ### func (\*ProcessState) SystemTime ``` func (p *ProcessState) SystemTime() time.Duration ``` SystemTime returns the system CPU time of the exited process and its children. ### func (\*ProcessState) UserTime ``` func (p *ProcessState) UserTime() time.Duration ``` UserTime returns the user CPU time of the exited process and its children. type Signal ----------- A Signal represents an operating system signal. The usual underlying implementation is operating system-dependent: on Unix it is syscall.Signal. ``` type Signal interface { String() string Signal() // to distinguish from other Stringers } ``` The only signal values guaranteed to be present in the os package on all systems are os.Interrupt (send the process an interrupt) and os.Kill (force the process to exit). On Windows, sending os.Interrupt to a process with os.Process.Signal is not implemented; it will return an error instead of sending a signal. ``` var ( Interrupt Signal = syscall.SIGINT Kill Signal = syscall.SIGKILL ) ``` type SyscallError ----------------- SyscallError records an error from a specific system call. ``` type SyscallError struct { Syscall string Err error } ``` ### func (\*SyscallError) Error ``` func (e *SyscallError) Error() string ``` ### func (\*SyscallError) Timeout 1.10 ``` func (e *SyscallError) Timeout() bool ``` Timeout reports whether this error represents a timeout. ### func (\*SyscallError) Unwrap 1.13 ``` func (e *SyscallError) Unwrap() error ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [exec](exec/index) | Package exec runs external commands. | | [signal](signal/index) | Package signal implements access to incoming signals. | | [user](user/index) | Package user allows user account lookups by name or id. |
go Package user Package user ============= * `import "os/user"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package user allows user account lookups by name or id. For most Unix systems, this package has two internal implementations of resolving user and group ids to names, and listing supplementary group IDs. One is written in pure Go and parses /etc/passwd and /etc/group. The other is cgo-based and relies on the standard C library (libc) routines such as getpwuid\_r, getgrnam\_r, and getgrouplist. When cgo is available, and the required routines are implemented in libc for a particular platform, cgo-based (libc-backed) code is used. This can be overridden by using osusergo build tag, which enforces the pure Go implementation. Index ----- * [type Group](#Group) * [func LookupGroup(name string) (\*Group, error)](#LookupGroup) * [func LookupGroupId(gid string) (\*Group, error)](#LookupGroupId) * [type UnknownGroupError](#UnknownGroupError) * [func (e UnknownGroupError) Error() string](#UnknownGroupError.Error) * [type UnknownGroupIdError](#UnknownGroupIdError) * [func (e UnknownGroupIdError) Error() string](#UnknownGroupIdError.Error) * [type UnknownUserError](#UnknownUserError) * [func (e UnknownUserError) Error() string](#UnknownUserError.Error) * [type UnknownUserIdError](#UnknownUserIdError) * [func (e UnknownUserIdError) Error() string](#UnknownUserIdError.Error) * [type User](#User) * [func Current() (\*User, error)](#Current) * [func Lookup(username string) (\*User, error)](#Lookup) * [func LookupId(uid string) (\*User, error)](#LookupId) * [func (u \*User) GroupIds() ([]string, error)](#User.GroupIds) ### Package files cgo\_listgroups\_unix.go cgo\_lookup\_cgo.go cgo\_lookup\_unix.go getgrouplist\_unix.go lookup.go user.go type Group 1.7 -------------- Group represents a grouping of users. On POSIX systems Gid contains a decimal number representing the group ID. ``` type Group struct { Gid string // group ID Name string // group name } ``` ### func LookupGroup 1.7 ``` func LookupGroup(name string) (*Group, error) ``` LookupGroup looks up a group by name. If the group cannot be found, the returned error is of type UnknownGroupError. ### func LookupGroupId 1.7 ``` func LookupGroupId(gid string) (*Group, error) ``` LookupGroupId looks up a group by groupid. If the group cannot be found, the returned error is of type UnknownGroupIdError. type UnknownGroupError 1.7 -------------------------- UnknownGroupError is returned by LookupGroup when a group cannot be found. ``` type UnknownGroupError string ``` ### func (UnknownGroupError) Error 1.7 ``` func (e UnknownGroupError) Error() string ``` type UnknownGroupIdError 1.7 ---------------------------- UnknownGroupIdError is returned by LookupGroupId when a group cannot be found. ``` type UnknownGroupIdError string ``` ### func (UnknownGroupIdError) Error 1.7 ``` func (e UnknownGroupIdError) Error() string ``` type UnknownUserError --------------------- UnknownUserError is returned by Lookup when a user cannot be found. ``` type UnknownUserError string ``` ### func (UnknownUserError) Error ``` func (e UnknownUserError) Error() string ``` type UnknownUserIdError ----------------------- UnknownUserIdError is returned by LookupId when a user cannot be found. ``` type UnknownUserIdError int ``` ### func (UnknownUserIdError) Error ``` func (e UnknownUserIdError) Error() string ``` type User --------- User represents a user account. ``` type User struct { // Uid is the user ID. 
// On POSIX systems, this is a decimal number representing the uid. // On Windows, this is a security identifier (SID) in a string format. // On Plan 9, this is the contents of /dev/user. Uid string // Gid is the primary group ID. // On POSIX systems, this is a decimal number representing the gid. // On Windows, this is a SID in a string format. // On Plan 9, this is the contents of /dev/user. Gid string // Username is the login name. Username string // Name is the user's real or display name. // It might be blank. // On POSIX systems, this is the first (or only) entry in the GECOS field // list. // On Windows, this is the user's display name. // On Plan 9, this is the contents of /dev/user. Name string // HomeDir is the path to the user's home directory (if they have one). HomeDir string } ``` ### func Current ``` func Current() (*User, error) ``` Current returns the current user. The first call will cache the current user information. Subsequent calls will return the cached value and will not reflect changes to the current user. ### func Lookup ``` func Lookup(username string) (*User, error) ``` Lookup looks up a user by username. If the user cannot be found, the returned error is of type UnknownUserError. ### func LookupId ``` func LookupId(uid string) (*User, error) ``` LookupId looks up a user by userid. If the user cannot be found, the returned error is of type UnknownUserIdError. ### func (\*User) GroupIds 1.7 ``` func (u *User) GroupIds() ([]string, error) ``` GroupIds returns the list of group IDs that the user is a member of. go Package exec Package exec ============= * `import "os/exec"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package exec runs external commands. It wraps os.StartProcess to make it easier to remap stdin and stdout, connect I/O with pipes, and do other adjustments. Unlike the "system" library call from C and other languages, the os/exec package intentionally does not invoke the system shell and does not expand any glob patterns or handle other expansions, pipelines, or redirections typically done by shells. The package behaves more like C's "exec" family of functions. To expand glob patterns, either call the shell directly, taking care to escape any dangerous input, or use the path/filepath package's Glob function. To expand environment variables, use package os's ExpandEnv. Note that the examples in this package assume a Unix system. They may not run on Windows, and they do not run in the Go Playground used by golang.org and godoc.org. ### Executables in the current directory The functions Command and LookPath look for a program in the directories listed in the current path, following the conventions of the host operating system. Operating systems have for decades included the current directory in this search, sometimes implicitly and sometimes configured explicitly that way by default. Modern practice is that including the current directory is usually unexpected and often leads to security problems. To avoid those security problems, as of Go 1.19, this package will not resolve a program using an implicit or explicit path entry relative to the current directory. That is, if you run exec.LookPath("go"), it will not successfully return ./go on Unix nor .\go.exe on Windows, no matter how the path is configured. Instead, if the usual path algorithms would result in that answer, these functions return an error err satisfying errors.Is(err, ErrDot). 
For example, consider these two program snippets: ``` path, err := exec.LookPath("prog") if err != nil { log.Fatal(err) } use(path) ``` and ``` cmd := exec.Command("prog") if err := cmd.Run(); err != nil { log.Fatal(err) } ``` These will not find and run ./prog or .\prog.exe, no matter how the current path is configured. Code that always wants to run a program from the current directory can be rewritten to say "./prog" instead of "prog". Code that insists on including results from relative path entries can instead override the error using an errors.Is check: ``` path, err := exec.LookPath("prog") if errors.Is(err, exec.ErrDot) { err = nil } if err != nil { log.Fatal(err) } use(path) ``` and ``` cmd := exec.Command("prog") if errors.Is(cmd.Err, exec.ErrDot) { cmd.Err = nil } if err := cmd.Run(); err != nil { log.Fatal(err) } ``` Setting the environment variable GODEBUG=execerrdot=0 disables generation of ErrDot entirely, temporarily restoring the pre-Go 1.19 behavior for programs that are unable to apply more targeted fixes. A future version of Go may remove support for this variable. Before adding such overrides, make sure you understand the security implications of doing so. See <https://go.dev/blog/path-security> for more information. Index ----- * [Variables](#pkg-variables) * [func LookPath(file string) (string, error)](#LookPath) * [type Cmd](#Cmd) * [func Command(name string, arg ...string) \*Cmd](#Command) * [func CommandContext(ctx context.Context, name string, arg ...string) \*Cmd](#CommandContext) * [func (c \*Cmd) CombinedOutput() ([]byte, error)](#Cmd.CombinedOutput) * [func (c \*Cmd) Environ() []string](#Cmd.Environ) * [func (c \*Cmd) Output() ([]byte, error)](#Cmd.Output) * [func (c \*Cmd) Run() error](#Cmd.Run) * [func (c \*Cmd) Start() error](#Cmd.Start) * [func (c \*Cmd) StderrPipe() (io.ReadCloser, error)](#Cmd.StderrPipe) * [func (c \*Cmd) StdinPipe() (io.WriteCloser, error)](#Cmd.StdinPipe) * [func (c \*Cmd) StdoutPipe() (io.ReadCloser, error)](#Cmd.StdoutPipe) * [func (c \*Cmd) String() string](#Cmd.String) * [func (c \*Cmd) Wait() error](#Cmd.Wait) * [type Error](#Error) * [func (e \*Error) Error() string](#Error.Error) * [func (e \*Error) Unwrap() error](#Error.Unwrap) * [type ExitError](#ExitError) * [func (e \*ExitError) Error() string](#ExitError.Error) ### Examples [Cmd.CombinedOutput](#example_Cmd_CombinedOutput) [Cmd.Environ](#example_Cmd_Environ) [Cmd.Output](#example_Cmd_Output) [Cmd.Run](#example_Cmd_Run) [Cmd.Start](#example_Cmd_Start) [Cmd.StderrPipe](#example_Cmd_StderrPipe) [Cmd.StdinPipe](#example_Cmd_StdinPipe) [Cmd.StdoutPipe](#example_Cmd_StdoutPipe) [Command](#example_Command) [CommandContext](#example_CommandContext) [Command (Environment)](#example_Command_environment) [LookPath](#example_LookPath) ### Package files exec.go exec\_unix.go lp\_unix.go Variables --------- ErrDot indicates that a path lookup resolved to an executable in the current directory due to ‘.’ being in the path, either implicitly or explicitly. See the package documentation for details. Note that functions in this package do not return ErrDot directly. Code should use errors.Is(err, ErrDot), not err == ErrDot, to test whether a returned error err is due to this condition. ``` var ErrDot = errors.New("cannot run executable found relative to current directory") ``` ErrNotFound is the error resulting if a path search failed to find an executable file. 
``` var ErrNotFound = errors.New("executable file not found in $PATH") ``` ErrWaitDelay is returned by (\*Cmd).Wait if the process exits with a successful status code but its output pipes are not closed before the command's WaitDelay expires. ``` var ErrWaitDelay = errors.New("exec: WaitDelay expired before I/O complete") ``` func LookPath ------------- ``` func LookPath(file string) (string, error) ``` LookPath searches for an executable named file in the directories named by the PATH environment variable. If file contains a slash, it is tried directly and the PATH is not consulted. Otherwise, on success, the result is an absolute path. In older versions of Go, LookPath could return a path relative to the current directory. As of Go 1.19, LookPath will instead return that path along with an error satisfying errors.Is(err, ErrDot). See the package documentation for more details. #### Example Code: ``` path, err := exec.LookPath("fortune") if err != nil { log.Fatal("installing fortune is in your future") } fmt.Printf("fortune is available at %s\n", path) ``` type Cmd -------- Cmd represents an external command being prepared or run. A Cmd cannot be reused after calling its Run, Output or CombinedOutput methods. ``` type Cmd struct { // Path is the path of the command to run. // // This is the only field that must be set to a non-zero // value. If Path is relative, it is evaluated relative // to Dir. Path string // Args holds command line arguments, including the command as Args[0]. // If the Args field is empty or nil, Run uses {Path}. // // In typical use, both Path and Args are set by calling Command. Args []string // Env specifies the environment of the process. // Each entry is of the form "key=value". // If Env is nil, the new process uses the current process's // environment. // If Env contains duplicate environment keys, only the last // value in the slice for each duplicate key is used. // As a special case on Windows, SYSTEMROOT is always added if // missing and not explicitly set to the empty string. Env []string // Dir specifies the working directory of the command. // If Dir is the empty string, Run runs the command in the // calling process's current directory. Dir string // Stdin specifies the process's standard input. // // If Stdin is nil, the process reads from the null device (os.DevNull). // // If Stdin is an *os.File, the process's standard input is connected // directly to that file. // // Otherwise, during the execution of the command a separate // goroutine reads from Stdin and delivers that data to the command // over a pipe. In this case, Wait does not complete until the goroutine // stops copying, either because it has reached the end of Stdin // (EOF or a read error), or because writing to the pipe returned an error, // or because a nonzero WaitDelay was set and expired. Stdin io.Reader // Stdout and Stderr specify the process's standard output and error. // // If either is nil, Run connects the corresponding file descriptor // to the null device (os.DevNull). // // If either is an *os.File, the corresponding output from the process // is connected directly to that file. // // Otherwise, during the execution of the command a separate goroutine // reads from the process over a pipe and delivers that data to the // corresponding Writer. In this case, Wait does not complete until the // goroutine reaches EOF or encounters an error or a nonzero WaitDelay // expires. 
// // If Stdout and Stderr are the same writer, and have a type that can // be compared with ==, at most one goroutine at a time will call Write. Stdout io.Writer Stderr io.Writer // ExtraFiles specifies additional open files to be inherited by the // new process. It does not include standard input, standard output, or // standard error. If non-nil, entry i becomes file descriptor 3+i. // // ExtraFiles is not supported on Windows. ExtraFiles []*os.File // SysProcAttr holds optional, operating system-specific attributes. // Run passes it to os.StartProcess as the os.ProcAttr's Sys field. SysProcAttr *syscall.SysProcAttr // Process is the underlying process, once started. Process *os.Process // ProcessState contains information about an exited process. // If the process was started successfully, Wait or Run will // populate its ProcessState when the command completes. ProcessState *os.ProcessState Err error // LookPath error, if any; added in Go 1.19 // If Cancel is non-nil, the command must have been created with // CommandContext and Cancel will be called when the command's // Context is done. By default, CommandContext sets Cancel to // call the Kill method on the command's Process. // // Typically a custom Cancel will send a signal to the command's // Process, but it may instead take other actions to initiate cancellation, // such as closing a stdin or stdout pipe or sending a shutdown request on a // network socket. // // If the command exits with a success status after Cancel is // called, and Cancel does not return an error equivalent to // os.ErrProcessDone, then Wait and similar methods will return a non-nil // error: either an error wrapping the one returned by Cancel, // or the error from the Context. // (If the command exits with a non-success status, or Cancel // returns an error that wraps os.ErrProcessDone, Wait and similar methods // continue to return the command's usual exit status.) // // If Cancel is set to nil, nothing will happen immediately when the command's // Context is done, but a nonzero WaitDelay will still take effect. That may // be useful, for example, to work around deadlocks in commands that do not // support shutdown signals but are expected to always finish quickly. // // Cancel will not be called if Start returns a non-nil error. Cancel func() error // Go 1.20 // If WaitDelay is non-zero, it bounds the time spent waiting on two sources // of unexpected delay in Wait: a child process that fails to exit after the // associated Context is canceled, and a child process that exits but leaves // its I/O pipes unclosed. // // The WaitDelay timer starts when either the associated Context is done or a // call to Wait observes that the child process has exited, whichever occurs // first. When the delay has elapsed, the command shuts down the child process // and/or its I/O pipes. // // If the child process has failed to exit — perhaps because it ignored or // failed to receive a shutdown signal from a Cancel function, or because no // Cancel function was set — then it will be terminated using os.Process.Kill. // // Then, if the I/O pipes communicating with the child process are still open, // those pipes are closed in order to unblock any goroutines currently blocked // on Read or Write calls. // // If pipes are closed due to WaitDelay, no Cancel call has occurred, // and the command has otherwise exited with a successful status, Wait and // similar methods will return ErrWaitDelay instead of nil. 
// // If WaitDelay is zero (the default), I/O pipes will be read until EOF, // which might not occur until orphaned subprocesses of the command have // also closed their descriptors for the pipes. WaitDelay time.Duration // Go 1.20 // contains filtered or unexported fields } ``` ### func Command ``` func Command(name string, arg ...string) *Cmd ``` Command returns the Cmd struct to execute the named program with the given arguments. It sets only the Path and Args in the returned structure. If name contains no path separators, Command uses LookPath to resolve name to a complete path if possible. Otherwise it uses name directly as Path. The returned Cmd's Args field is constructed from the command name followed by the elements of arg, so arg should not include the command name itself. For example, Command("echo", "hello"). Args[0] is always name, not the possibly resolved Path. On Windows, processes receive the whole command line as a single string and do their own parsing. Command combines and quotes Args into a command line string with an algorithm compatible with applications using CommandLineToArgvW (which is the most common way). Notable exceptions are msiexec.exe and cmd.exe (and thus, all batch files), which have a different unquoting algorithm. In these or other similar cases, you can do the quoting yourself and provide the full command line in SysProcAttr.CmdLine, leaving Args empty. #### Example Code: ``` cmd := exec.Command("tr", "a-z", "A-Z") cmd.Stdin = strings.NewReader("some input") var out strings.Builder cmd.Stdout = &out err := cmd.Run() if err != nil { log.Fatal(err) } fmt.Printf("in all caps: %q\n", out.String()) ``` #### Example (Environment) Code: ``` cmd := exec.Command("prog") cmd.Env = append(os.Environ(), "FOO=duplicate_value", // ignored "FOO=actual_value", // this value is used ) if err := cmd.Run(); err != nil { log.Fatal(err) } ``` ### func CommandContext 1.7 ``` func CommandContext(ctx context.Context, name string, arg ...string) *Cmd ``` CommandContext is like Command but includes a context. The provided context is used to interrupt the process (by calling cmd.Cancel or os.Process.Kill) if the context becomes done before the command completes on its own. CommandContext sets the command's Cancel function to invoke the Kill method on its Process, and leaves its WaitDelay unset. The caller may change the cancellation behavior by modifying those fields before starting the command. #### Example Code: ``` ctx, cancel := context.WithTimeout(context.Background(), 100*time.Millisecond) defer cancel() if err := exec.CommandContext(ctx, "sleep", "5").Run(); err != nil { // This will fail after 100 milliseconds. The 5 second sleep // will be interrupted. } ``` ### func (\*Cmd) CombinedOutput ``` func (c *Cmd) CombinedOutput() ([]byte, error) ``` CombinedOutput runs the command and returns its combined standard output and standard error. #### Example Code: ``` cmd := exec.Command("sh", "-c", "echo stdout; echo 1>&2 stderr") stdoutStderr, err := cmd.CombinedOutput() if err != nil { log.Fatal(err) } fmt.Printf("%s\n", stdoutStderr) ``` ### func (\*Cmd) Environ 1.19 ``` func (c *Cmd) Environ() []string ``` Environ returns a copy of the environment in which the command would be run as it is currently configured. #### Example Code: ``` cmd := exec.Command("pwd") // Set Dir before calling cmd.Environ so that it will include an // updated PWD variable (on platforms where that is used). cmd.Dir = ".." 
cmd.Env = append(cmd.Environ(), "POSIXLY_CORRECT=1") out, err := cmd.Output() if err != nil { log.Fatal(err) } fmt.Printf("%s\n", out) ``` ### func (\*Cmd) Output ``` func (c *Cmd) Output() ([]byte, error) ``` Output runs the command and returns its standard output. Any returned error will usually be of type \*ExitError. If c.Stderr was nil, Output populates ExitError.Stderr. #### Example Code: ``` out, err := exec.Command("date").Output() if err != nil { log.Fatal(err) } fmt.Printf("The date is %s\n", out) ``` ### func (\*Cmd) Run ``` func (c *Cmd) Run() error ``` Run starts the specified command and waits for it to complete. The returned error is nil if the command runs, has no problems copying stdin, stdout, and stderr, and exits with a zero exit status. If the command starts but does not complete successfully, the error is of type \*ExitError. Other error types may be returned for other situations. If the calling goroutine has locked the operating system thread with runtime.LockOSThread and modified any inheritable OS-level thread state (for example, Linux or Plan 9 name spaces), the new process will inherit the caller's thread state. #### Example Code: ``` cmd := exec.Command("sleep", "1") log.Printf("Running command and waiting for it to finish...") err := cmd.Run() log.Printf("Command finished with error: %v", err) ``` ### func (\*Cmd) Start ``` func (c *Cmd) Start() error ``` Start starts the specified command but does not wait for it to complete. If Start returns successfully, the c.Process field will be set. After a successful call to Start the Wait method must be called in order to release associated system resources. #### Example Code: ``` cmd := exec.Command("sleep", "5") err := cmd.Start() if err != nil { log.Fatal(err) } log.Printf("Waiting for command to finish...") err = cmd.Wait() log.Printf("Command finished with error: %v", err) ``` ### func (\*Cmd) StderrPipe ``` func (c *Cmd) StderrPipe() (io.ReadCloser, error) ``` StderrPipe returns a pipe that will be connected to the command's standard error when the command starts. Wait will close the pipe after seeing the command exit, so most callers need not close the pipe themselves. It is thus incorrect to call Wait before all reads from the pipe have completed. For the same reason, it is incorrect to use Run when using StderrPipe. See the StdoutPipe example for idiomatic usage. #### Example Code: ``` cmd := exec.Command("sh", "-c", "echo stdout; echo 1>&2 stderr") stderr, err := cmd.StderrPipe() if err != nil { log.Fatal(err) } if err := cmd.Start(); err != nil { log.Fatal(err) } slurp, _ := io.ReadAll(stderr) fmt.Printf("%s\n", slurp) if err := cmd.Wait(); err != nil { log.Fatal(err) } ``` ### func (\*Cmd) StdinPipe ``` func (c *Cmd) StdinPipe() (io.WriteCloser, error) ``` StdinPipe returns a pipe that will be connected to the command's standard input when the command starts. The pipe will be closed automatically after Wait sees the command exit. A caller need only call Close to force the pipe to close sooner. For example, if the command being run will not exit until standard input is closed, the caller must close the pipe. 
#### Example Code: ``` cmd := exec.Command("cat") stdin, err := cmd.StdinPipe() if err != nil { log.Fatal(err) } go func() { defer stdin.Close() io.WriteString(stdin, "values written to stdin are passed to cmd's standard input") }() out, err := cmd.CombinedOutput() if err != nil { log.Fatal(err) } fmt.Printf("%s\n", out) ``` ### func (\*Cmd) StdoutPipe ``` func (c *Cmd) StdoutPipe() (io.ReadCloser, error) ``` StdoutPipe returns a pipe that will be connected to the command's standard output when the command starts. Wait will close the pipe after seeing the command exit, so most callers need not close the pipe themselves. It is thus incorrect to call Wait before all reads from the pipe have completed. For the same reason, it is incorrect to call Run when using StdoutPipe. See the example for idiomatic usage. #### Example Code: ``` cmd := exec.Command("echo", "-n", `{"Name": "Bob", "Age": 32}`) stdout, err := cmd.StdoutPipe() if err != nil { log.Fatal(err) } if err := cmd.Start(); err != nil { log.Fatal(err) } var person struct { Name string Age int } if err := json.NewDecoder(stdout).Decode(&person); err != nil { log.Fatal(err) } if err := cmd.Wait(); err != nil { log.Fatal(err) } fmt.Printf("%s is %d years old\n", person.Name, person.Age) ``` ### func (\*Cmd) String 1.13 ``` func (c *Cmd) String() string ``` String returns a human-readable description of c. It is intended only for debugging. In particular, it is not suitable for use as input to a shell. The output of String may vary across Go releases. ### func (\*Cmd) Wait ``` func (c *Cmd) Wait() error ``` Wait waits for the command to exit and waits for any copying to stdin or copying from stdout or stderr to complete. The command must have been started by Start. The returned error is nil if the command runs, has no problems copying stdin, stdout, and stderr, and exits with a zero exit status. If the command fails to run or doesn't complete successfully, the error is of type \*ExitError. Other error types may be returned for I/O problems. If any of c.Stdin, c.Stdout or c.Stderr are not an \*os.File, Wait also waits for the respective I/O loop copying to or from the process to complete. Wait releases any resources associated with the Cmd. type Error ---------- Error is returned by LookPath when it fails to classify a file as an executable. ``` type Error struct { // Name is the file name for which the error occurred. Name string // Err is the underlying error. Err error } ``` ### func (\*Error) Error ``` func (e *Error) Error() string ``` ### func (\*Error) Unwrap 1.13 ``` func (e *Error) Unwrap() error ``` type ExitError -------------- An ExitError reports an unsuccessful exit by a command. ``` type ExitError struct { *os.ProcessState // Stderr holds a subset of the standard error output from the // Cmd.Output method if standard error was not otherwise being // collected. // // If the error output is long, Stderr may contain only a prefix // and suffix of the output, with the middle replaced with // text about the number of omitted bytes. // // Stderr is provided for debugging, for inclusion in error messages. // Users with other needs should redirect Cmd.Stderr as needed. Stderr []byte // Go 1.6 } ``` ### func (\*ExitError) Error ``` func (e *ExitError) Error() string ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) |
go Package signal Package signal =============== * `import "os/signal"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package signal implements access to incoming signals. Signals are primarily used on Unix-like systems. For the use of this package on Windows and Plan 9, see below. ### Types of signals The signals SIGKILL and SIGSTOP may not be caught by a program, and therefore cannot be affected by this package. Synchronous signals are signals triggered by errors in program execution: SIGBUS, SIGFPE, and SIGSEGV. These are only considered synchronous when caused by program execution, not when sent using os.Process.Kill or the kill program or some similar mechanism. In general, except as discussed below, Go programs will convert a synchronous signal into a run-time panic. The remaining signals are asynchronous signals. They are not triggered by program errors, but are instead sent from the kernel or from some other program. Of the asynchronous signals, the SIGHUP signal is sent when a program loses its controlling terminal. The SIGINT signal is sent when the user at the controlling terminal presses the interrupt character, which by default is ^C (Control-C). The SIGQUIT signal is sent when the user at the controlling terminal presses the quit character, which by default is ^\ (Control-Backslash). In general you can cause a program to simply exit by pressing ^C, and you can cause it to exit with a stack dump by pressing ^\. ### Default behavior of signals in Go programs By default, a synchronous signal is converted into a run-time panic. A SIGHUP, SIGINT, or SIGTERM signal causes the program to exit. A SIGQUIT, SIGILL, SIGTRAP, SIGABRT, SIGSTKFLT, SIGEMT, or SIGSYS signal causes the program to exit with a stack dump. A SIGTSTP, SIGTTIN, or SIGTTOU signal gets the system default behavior (these signals are used by the shell for job control). The SIGPROF signal is handled directly by the Go runtime to implement runtime.CPUProfile. Other signals will be caught but no action will be taken. If the Go program is started with either SIGHUP or SIGINT ignored (signal handler set to SIG\_IGN), they will remain ignored. If the Go program is started with a non-empty signal mask, that will generally be honored. However, some signals are explicitly unblocked: the synchronous signals, SIGILL, SIGTRAP, SIGSTKFLT, SIGCHLD, SIGPROF, and, on Linux, signals 32 (SIGCANCEL) and 33 (SIGSETXID) (SIGCANCEL and SIGSETXID are used internally by glibc). Subprocesses started by os.Exec, or by the os/exec package, will inherit the modified signal mask. ### Changing the behavior of signals in Go programs The functions in this package allow a program to change the way Go programs handle signals. Notify disables the default behavior for a given set of asynchronous signals and instead delivers them over one or more registered channels. Specifically, it applies to the signals SIGHUP, SIGINT, SIGQUIT, SIGABRT, and SIGTERM. It also applies to the job control signals SIGTSTP, SIGTTIN, and SIGTTOU, in which case the system default behavior does not occur. It also applies to some signals that otherwise cause no action: SIGUSR1, SIGUSR2, SIGPIPE, SIGALRM, SIGCHLD, SIGCONT, SIGURG, SIGXCPU, SIGXFSZ, SIGVTALRM, SIGWINCH, SIGIO, SIGPWR, SIGSYS, SIGINFO, SIGTHR, SIGWAITING, SIGLWP, SIGFREEZE, SIGTHAW, SIGLOST, SIGXRES, SIGJVM1, SIGJVM2, and any real time signals used on the system. Note that not all of these signals are available on all systems. 
If the program was started with SIGHUP or SIGINT ignored, and Notify is called for either signal, a signal handler will be installed for that signal and it will no longer be ignored. If, later, Reset or Ignore is called for that signal, or Stop is called on all channels passed to Notify for that signal, the signal will once again be ignored. Reset will restore the system default behavior for the signal, while Ignore will cause the system to ignore the signal entirely. If the program is started with a non-empty signal mask, some signals will be explicitly unblocked as described above. If Notify is called for a blocked signal, it will be unblocked. If, later, Reset is called for that signal, or Stop is called on all channels passed to Notify for that signal, the signal will once again be blocked. ### SIGPIPE When a Go program writes to a broken pipe, the kernel will raise a SIGPIPE signal. If the program has not called Notify to receive SIGPIPE signals, then the behavior depends on the file descriptor number. A write to a broken pipe on file descriptors 1 or 2 (standard output or standard error) will cause the program to exit with a SIGPIPE signal. A write to a broken pipe on some other file descriptor will take no action on the SIGPIPE signal, and the write will fail with an EPIPE error. If the program has called Notify to receive SIGPIPE signals, the file descriptor number does not matter. The SIGPIPE signal will be delivered to the Notify channel, and the write will fail with an EPIPE error. This means that, by default, command line programs will behave like typical Unix command line programs, while other programs will not crash with SIGPIPE when writing to a closed network connection. ### Go programs that use cgo or SWIG In a Go program that includes non-Go code, typically C/C++ code accessed using cgo or SWIG, Go's startup code normally runs first. It configures the signal handlers as expected by the Go runtime, before the non-Go startup code runs. If the non-Go startup code wishes to install its own signal handlers, it must take certain steps to keep Go working well. This section documents those steps and the overall effect changes to signal handler settings by the non-Go code can have on Go programs. In rare cases, the non-Go code may run before the Go code, in which case the next section also applies. If the non-Go code called by the Go program does not change any signal handlers or masks, then the behavior is the same as for a pure Go program. If the non-Go code installs any signal handlers, it must use the SA\_ONSTACK flag with sigaction. Failing to do so is likely to cause the program to crash if the signal is received. Go programs routinely run with a limited stack, and therefore set up an alternate signal stack. If the non-Go code installs a signal handler for any of the synchronous signals (SIGBUS, SIGFPE, SIGSEGV), then it should record the existing Go signal handler. If those signals occur while executing Go code, it should invoke the Go signal handler (whether the signal occurs while executing Go code can be determined by looking at the PC passed to the signal handler). Otherwise some Go run-time panics will not occur as expected. If the non-Go code installs a signal handler for any of the asynchronous signals, it may invoke the Go signal handler or not as it chooses. Naturally, if it does not invoke the Go signal handler, the Go behavior described above will not occur. This can be an issue with the SIGPROF signal in particular. 
The non-Go code should not change the signal mask on any threads created by the Go runtime. If the non-Go code starts new threads of its own, it may set the signal mask as it pleases. If the non-Go code starts a new thread, changes the signal mask, and then invokes a Go function in that thread, the Go runtime will automatically unblock certain signals: the synchronous signals, SIGILL, SIGTRAP, SIGSTKFLT, SIGCHLD, SIGPROF, SIGCANCEL, and SIGSETXID. When the Go function returns, the non-Go signal mask will be restored. If the Go signal handler is invoked on a non-Go thread not running Go code, the handler generally forwards the signal to the non-Go code, as follows. If the signal is SIGPROF, the Go handler does nothing. Otherwise, the Go handler removes itself, unblocks the signal, and raises it again, to invoke any non-Go handler or default system handler. If the program does not exit, the Go handler then reinstalls itself and continues execution of the program. If a SIGPIPE signal is received, the Go program will invoke the special handling described above if the SIGPIPE is received on a Go thread. If the SIGPIPE is received on a non-Go thread the signal will be forwarded to the non-Go handler, if any; if there is none the default system handler will cause the program to terminate. ### Non-Go programs that call Go code When Go code is built with options like -buildmode=c-shared, it will be run as part of an existing non-Go program. The non-Go code may have already installed signal handlers when the Go code starts (that may also happen in unusual cases when using cgo or SWIG; in that case, the discussion here applies). For -buildmode=c-archive the Go runtime will initialize signals at global constructor time. For -buildmode=c-shared the Go runtime will initialize signals when the shared library is loaded. If the Go runtime sees an existing signal handler for the SIGCANCEL or SIGSETXID signals (which are used only on Linux), it will turn on the SA\_ONSTACK flag and otherwise keep the signal handler. For the synchronous signals and SIGPIPE, the Go runtime will install a signal handler. It will save any existing signal handler. If a synchronous signal arrives while executing non-Go code, the Go runtime will invoke the existing signal handler instead of the Go signal handler. Go code built with -buildmode=c-archive or -buildmode=c-shared will not install any other signal handlers by default. If there is an existing signal handler, the Go runtime will turn on the SA\_ONSTACK flag and otherwise keep the signal handler. If Notify is called for an asynchronous signal, a Go signal handler will be installed for that signal. If, later, Reset is called for that signal, the original handling for that signal will be reinstalled, restoring the non-Go signal handler if any. Go code built without -buildmode=c-archive or -buildmode=c-shared will install a signal handler for the asynchronous signals listed above, and save any existing signal handler. If a signal is delivered to a non-Go thread, it will act as described above, except that if there is an existing non-Go signal handler, that handler will be installed before raising the signal. ### Windows On Windows a ^C (Control-C) or ^BREAK (Control-Break) normally cause the program to exit. If Notify is called for os.Interrupt, ^C or ^BREAK will cause os.Interrupt to be sent on the channel, and the program will not exit. If Reset is called, or Stop is called on all channels passed to Notify, then the default behavior will be restored. 
Additionally, if Notify is called, and Windows sends CTRL\_CLOSE\_EVENT, CTRL\_LOGOFF\_EVENT or CTRL\_SHUTDOWN\_EVENT to the process, Notify will return syscall.SIGTERM. Unlike Control-C and Control-Break, Notify does not change process behavior when either CTRL\_CLOSE\_EVENT, CTRL\_LOGOFF\_EVENT or CTRL\_SHUTDOWN\_EVENT is received - the process will still get terminated unless it exits. But receiving syscall.SIGTERM will give the process an opportunity to clean up before termination. ### Plan 9 On Plan 9, signals have type syscall.Note, which is a string. Calling Notify with a syscall.Note will cause that value to be sent on the channel when that string is posted as a note. Index ----- * [func Ignore(sig ...os.Signal)](#Ignore) * [func Ignored(sig os.Signal) bool](#Ignored) * [func Notify(c chan<- os.Signal, sig ...os.Signal)](#Notify) * [func NotifyContext(parent context.Context, signals ...os.Signal) (ctx context.Context, stop context.CancelFunc)](#NotifyContext) * [func Reset(sig ...os.Signal)](#Reset) * [func Stop(c chan<- os.Signal)](#Stop) ### Examples [Notify](#example_Notify) [NotifyContext](#example_NotifyContext) [Notify (AllSignals)](#example_Notify_allSignals) ### Package files doc.go signal.go signal\_unix.go func Ignore 1.5 --------------- ``` func Ignore(sig ...os.Signal) ``` Ignore causes the provided signals to be ignored. If they are received by the program, nothing will happen. Ignore undoes the effect of any prior calls to Notify for the provided signals. If no signals are provided, all incoming signals will be ignored. func Ignored 1.11 ----------------- ``` func Ignored(sig os.Signal) bool ``` Ignored reports whether sig is currently ignored. func Notify ----------- ``` func Notify(c chan<- os.Signal, sig ...os.Signal) ``` Notify causes package signal to relay incoming signals to c. If no signals are provided, all incoming signals will be relayed to c. Otherwise, just the provided signals will. Package signal will not block sending to c: the caller must ensure that c has sufficient buffer space to keep up with the expected signal rate. For a channel used for notification of just one signal value, a buffer of size 1 is sufficient. It is allowed to call Notify multiple times with the same channel: each call expands the set of signals sent to that channel. The only way to remove signals from the set is to call Stop. It is allowed to call Notify multiple times with different channels and the same signals: each channel receives copies of incoming signals independently. #### Example Code: ``` // Set up channel on which to send signal notifications. // We must use a buffered channel or risk missing the signal // if we're not ready to receive when the signal is sent. c := make(chan os.Signal, 1) signal.Notify(c, os.Interrupt) // Block until a signal is received. s := <-c fmt.Println("Got signal:", s) ``` #### Example (AllSignals) Code: ``` // Set up channel on which to send signal notifications. // We must use a buffered channel or risk missing the signal // if we're not ready to receive when the signal is sent. c := make(chan os.Signal, 1) // Passing no signals to Notify means that // all signals will be sent to the channel. signal.Notify(c) // Block until any signal is received. 
s := <-c fmt.Println("Got signal:", s) ``` func NotifyContext 1.16 ----------------------- ``` func NotifyContext(parent context.Context, signals ...os.Signal) (ctx context.Context, stop context.CancelFunc) ``` NotifyContext returns a copy of the parent context that is marked done (its Done channel is closed) when one of the listed signals arrives, when the returned stop function is called, or when the parent context's Done channel is closed, whichever happens first. The stop function unregisters the signal behavior, which, like signal.Reset, may restore the default behavior for a given signal. For example, the default behavior of a Go program receiving os.Interrupt is to exit. Calling NotifyContext(parent, os.Interrupt) will change the behavior to cancel the returned context. Future interrupts received will not trigger the default (exit) behavior until the returned stop function is called. The stop function releases resources associated with it, so code should call stop as soon as the operations running in this Context complete and signals no longer need to be diverted to the context. #### Example This example passes a context with a signal to tell a blocking function that it should abandon its work after a signal is received. Code: ``` ctx, stop := signal.NotifyContext(context.Background(), os.Interrupt) defer stop() p, err := os.FindProcess(os.Getpid()) if err != nil { log.Fatal(err) } // On a Unix-like system, pressing Ctrl+C on a keyboard sends a // SIGINT signal to the process of the program in execution. // // This example simulates that by sending a SIGINT signal to itself. if err := p.Signal(os.Interrupt); err != nil { log.Fatal(err) } select { case <-time.After(time.Second): fmt.Println("missed signal") case <-ctx.Done(): fmt.Println(ctx.Err()) // prints "context canceled" stop() // stop receiving signal notifications as soon as possible. } ``` Output: ``` context canceled ``` func Reset 1.5 -------------- ``` func Reset(sig ...os.Signal) ``` Reset undoes the effect of any prior calls to Notify for the provided signals. If no signals are provided, all signal handlers will be reset. func Stop 1.1 ------------- ``` func Stop(c chan<- os.Signal) ``` Stop causes package signal to stop relaying incoming signals to c. It undoes the effect of all prior calls to Notify using c. When Stop returns, it is guaranteed that c will receive no more signals. go Command arena Command arena ============== The arena package provides the ability to allocate memory for a collection of Go values and free that space manually all at once, safely. The purpose of this functionality is to improve efficiency: manually freeing memory before a garbage collection delays that cycle. Less frequent cycles means the CPU cost of the garbage collector is incurred less frequently. This functionality in this package is mostly captured in the Arena type. Arenas allocate large chunks of memory for Go values, so they're likely to be inefficient for allocating only small amounts of small Go values. They're best used in bulk, on the order of MiB of memory allocated on each use. Note that by allowing for this limited form of manual memory allocation that use-after-free bugs are possible with regular Go values. This package limits the impact of these use-after-free bugs by preventing reuse of freed memory regions until the garbage collector is able to determine that it is safe. 
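As a rough sketch of the Arena type described here, the following allocates values from an arena and frees them all at once; it assumes the arena package API (arena.NewArena, arena.New, arena.MakeSlice and the Free method) and requires building with GOEXPERIMENT=arenas.

```
package main

import (
	"arena"
	"fmt"
)

type record struct {
	id   int
	name string
}

func main() {
	a := arena.NewArena()

	// Allocate a single value and a slice from the arena.
	r := arena.New[record](a)
	r.id, r.name = 1, "example"

	buf := arena.MakeSlice[record](a, 0, 1024)
	buf = append(buf, *r)

	fmt.Println(len(buf), buf[0].name)

	// Free everything allocated from this arena in one step.
	// Using r or buf after this point is a use-after-free bug.
	a.Free()
}
```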
Typically, a use-after-free bug will result in a fault and a helpful error message, but this package reserves the right to not force a fault on freed memory. That means a valid implementation of this package is to just allocate all memory the way the runtime normally would, and in fact, it reserves the right to occasionally do so for some Go values. go Package tabwriter Package tabwriter ================== * `import "text/tabwriter"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package tabwriter implements a write filter (tabwriter.Writer) that translates tabbed columns in input into properly aligned text. The package is using the Elastic Tabstops algorithm described at <http://nickgravgaard.com/elastictabstops/index.html>. The text/tabwriter package is frozen and is not accepting new features. #### Example (Elastic) Code: ``` // Observe how the b's and the d's, despite appearing in the // second cell of each line, belong to different columns. w := tabwriter.NewWriter(os.Stdout, 0, 0, 1, '.', tabwriter.AlignRight|tabwriter.Debug) fmt.Fprintln(w, "a\tb\tc") fmt.Fprintln(w, "aa\tbb\tcc") fmt.Fprintln(w, "aaa\t") // trailing tab fmt.Fprintln(w, "aaaa\tdddd\teeee") w.Flush() ``` Output: ``` ....a|..b|c ...aa|.bb|cc ..aaa| .aaaa|.dddd|eeee ``` #### Example (TrailingTab) Code: ``` // Observe that the third line has no trailing tab, // so its final cell is not part of an aligned column. const padding = 3 w := tabwriter.NewWriter(os.Stdout, 0, 0, padding, '-', tabwriter.AlignRight|tabwriter.Debug) fmt.Fprintln(w, "a\tb\taligned\t") fmt.Fprintln(w, "aa\tbb\taligned\t") fmt.Fprintln(w, "aaa\tbbb\tunaligned") // no trailing tab fmt.Fprintln(w, "aaaa\tbbbb\taligned\t") w.Flush() ``` Output: ``` ------a|------b|---aligned| -----aa|-----bb|---aligned| ----aaa|----bbb|unaligned ---aaaa|---bbbb|---aligned| ``` Index ----- * [Constants](#pkg-constants) * [type Writer](#Writer) * [func NewWriter(output io.Writer, minwidth, tabwidth, padding int, padchar byte, flags uint) \*Writer](#NewWriter) * [func (b \*Writer) Flush() error](#Writer.Flush) * [func (b \*Writer) Init(output io.Writer, minwidth, tabwidth, padding int, padchar byte, flags uint) \*Writer](#Writer.Init) * [func (b \*Writer) Write(buf []byte) (n int, err error)](#Writer.Write) ### Examples [Writer.Init](#example_Writer_Init) [Package (Elastic)](#example__elastic) [Package (TrailingTab)](#example__trailingTab) ### Package files tabwriter.go Constants --------- Formatting can be controlled with these flags. ``` const ( // Ignore html tags and treat entities (starting with '&' // and ending in ';') as single characters (width = 1). FilterHTML uint = 1 << iota // Strip Escape characters bracketing escaped text segments // instead of passing them through unchanged with the text. StripEscape // Force right-alignment of cell content. // Default is left-alignment. AlignRight // Handle empty columns as if they were not present in // the input in the first place. DiscardEmptyColumns // Always use tabs for indentation columns (i.e., padding of // leading empty cells on the left) independent of padchar. TabIndent // Print a vertical bar ('|') between columns (after formatting). // Discarded columns appear as zero-width columns ("||"). Debug ) ``` To escape a text segment, bracket it with Escape characters. For instance, the tab in this string "Ignore this tab: \xff\t\xff" does not terminate a cell and constitutes a single character of width one for formatting purposes. 
The value 0xff was chosen because it cannot appear in a valid UTF-8 sequence. ``` const Escape = '\xff' ``` type Writer ----------- A Writer is a filter that inserts padding around tab-delimited columns in its input to align them in the output. The Writer treats incoming bytes as UTF-8-encoded text consisting of cells terminated by horizontal ('\t') or vertical ('\v') tabs, and newline ('\n') or formfeed ('\f') characters; both newline and formfeed act as line breaks. Tab-terminated cells in contiguous lines constitute a column. The Writer inserts padding as needed to make all cells in a column have the same width, effectively aligning the columns. It assumes that all characters have the same width, except for tabs for which a tabwidth must be specified. Column cells must be tab-terminated, not tab-separated: non-tab terminated trailing text at the end of a line forms a cell but that cell is not part of an aligned column. For instance, in this example (where | stands for a horizontal tab): ``` aaaa|bbb|d aa |b |dd a | aa |cccc|eee ``` the b and c are in distinct columns (the b column is not contiguous all the way). The d and e are not in a column at all (there's no terminating tab, nor would the column be contiguous). The Writer assumes that all Unicode code points have the same width; this may not be true in some fonts or if the string contains combining characters. If DiscardEmptyColumns is set, empty columns that are terminated entirely by vertical (or "soft") tabs are discarded. Columns terminated by horizontal (or "hard") tabs are not affected by this flag. If a Writer is configured to filter HTML, HTML tags and entities are passed through. The widths of tags and entities are assumed to be zero (tags) and one (entities) for formatting purposes. A segment of text may be escaped by bracketing it with Escape characters. The tabwriter passes escaped text segments through unchanged. In particular, it does not interpret any tabs or line breaks within the segment. If the StripEscape flag is set, the Escape characters are stripped from the output; otherwise they are passed through as well. For the purpose of formatting, the width of the escaped text is always computed excluding the Escape characters. The formfeed character acts like a newline but it also terminates all columns in the current line (effectively calling Flush). Tab- terminated cells in the next line start new columns. Unless found inside an HTML tag or inside an escaped text segment, formfeed characters appear as newlines in the output. The Writer must buffer input internally, because proper spacing of one line may depend on the cells in future lines. Clients must call Flush when done calling Write. ``` type Writer struct { // contains filtered or unexported fields } ``` ### func NewWriter ``` func NewWriter(output io.Writer, minwidth, tabwidth, padding int, padchar byte, flags uint) *Writer ``` NewWriter allocates and initializes a new tabwriter.Writer. The parameters are the same as for the Init function. ### func (\*Writer) Flush ``` func (b *Writer) Flush() error ``` Flush should be called after the last call to Write to ensure that any data buffered in the Writer is written to output. Any incomplete escape sequence at the end is considered complete for formatting purposes. ### func (\*Writer) Init ``` func (b *Writer) Init(output io.Writer, minwidth, tabwidth, padding int, padchar byte, flags uint) *Writer ``` A Writer must be initialized with a call to Init. The first parameter (output) specifies the filter output. 
The remaining parameters control the formatting: ``` minwidth minimal cell width including any padding tabwidth width of tab characters (equivalent number of spaces) padding padding added to a cell before computing its width padchar ASCII char used for padding if padchar == '\t', the Writer will assume that the width of a '\t' in the formatted output is tabwidth, and cells are left-aligned independent of align_left (for correct-looking results, tabwidth must correspond to the tab width in the viewer displaying the result) flags formatting control ``` #### Example Code: ``` w := new(tabwriter.Writer) // Format in tab-separated columns with a tab stop of 8. w.Init(os.Stdout, 0, 8, 0, '\t', 0) fmt.Fprintln(w, "a\tb\tc\td\t.") fmt.Fprintln(w, "123\t12345\t1234567\t123456789\t.") fmt.Fprintln(w) w.Flush() // Format right-aligned in space-separated columns of minimal width 5 // and at least one blank of padding (so wider column entries do not // touch each other). w.Init(os.Stdout, 5, 0, 1, ' ', tabwriter.AlignRight) fmt.Fprintln(w, "a\tb\tc\td\t.") fmt.Fprintln(w, "123\t12345\t1234567\t123456789\t.") fmt.Fprintln(w) w.Flush() ``` Output: ``` a b c d . 123 12345 1234567 123456789 . a b c d. 123 12345 1234567 123456789. ``` ### func (\*Writer) Write ``` func (b *Writer) Write(buf []byte) (n int, err error) ``` Write writes buf to the writer b. The only errors returned are ones encountered while writing to the underlying output stream.
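To complement the Init example above, here is a minimal sketch of the more common pattern of constructing a Writer with NewWriter, padding with spaces, and flushing at the end; the column contents are illustrative.

```
package main

import (
	"fmt"
	"os"
	"text/tabwriter"
)

func main() {
	// Minimum cell width 0, tab width 8, two spaces of padding,
	// pad with spaces, no special flags.
	w := tabwriter.NewWriter(os.Stdout, 0, 8, 2, ' ', 0)
	fmt.Fprintln(w, "NAME\tAGE\tCITY")
	fmt.Fprintln(w, "Alice\t30\tParis")
	fmt.Fprintln(w, "Bob\t7\tOslo")
	// Flush writes out the buffered, aligned lines.
	w.Flush()
}
```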
go Package template Package template ================= * `import "text/template"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package template implements data-driven templates for generating textual output. To generate HTML output, see package html/template, which has the same interface as this package but automatically secures HTML output against certain attacks. Templates are executed by applying them to a data structure. Annotations in the template refer to elements of the data structure (typically a field of a struct or a key in a map) to control execution and derive values to be displayed. Execution of the template walks the structure and sets the cursor, represented by a period '.' and called "dot", to the value at the current location in the structure as execution proceeds. The input text for a template is UTF-8-encoded text in any format. "Actions"--data evaluations or control structures--are delimited by "{{" and "}}"; all text outside actions is copied to the output unchanged. Once parsed, a template may be executed safely in parallel, although if parallel executions share a Writer the output may be interleaved. Here is a trivial example that prints "17 items are made of wool". ``` type Inventory struct { Material string Count uint } sweaters := Inventory{"wool", 17} tmpl, err := template.New("test").Parse("{{.Count}} items are made of {{.Material}}") if err != nil { panic(err) } err = tmpl.Execute(os.Stdout, sweaters) if err != nil { panic(err) } ``` More intricate examples appear below. ### Text and spaces By default, all text between actions is copied verbatim when the template is executed. For example, the string " items are made of " in the example above appears on standard output when the program is run. However, to aid in formatting template source code, if an action's left delimiter (by default "{{") is followed immediately by a minus sign and white space, all trailing white space is trimmed from the immediately preceding text. Similarly, if the right delimiter ("}}") is preceded by white space and a minus sign, all leading white space is trimmed from the immediately following text. In these trim markers, the white space must be present: "{{- 3}}" is like "{{3}}" but trims the immediately preceding text, while "{{-3}}" parses as an action containing the number -3. For instance, when executing the template whose source is ``` "{{23 -}} < {{- 45}}" ``` the generated output would be ``` "23<45" ``` For this trimming, the definition of white space characters is the same as in Go: space, horizontal tab, carriage return, and newline. ### Actions Here is the list of actions. "Arguments" and "pipelines" are evaluations of data, defined in detail in the corresponding sections that follow. ``` {{/* a comment */}} {{- /* a comment with white space trimmed from preceding and following text */ -}} A comment; discarded. May contain newlines. Comments do not nest and must start and end at the delimiters, as shown here. {{pipeline}} The default textual representation (the same as would be printed by fmt.Print) of the value of the pipeline is copied to the output. {{if pipeline}} T1 {{end}} If the value of the pipeline is empty, no output is generated; otherwise, T1 is executed. The empty values are false, 0, any nil pointer or interface value, and any array, slice, map, or string of length zero. Dot is unaffected. 
{{if pipeline}} T1 {{else}} T0 {{end}} If the value of the pipeline is empty, T0 is executed; otherwise, T1 is executed. Dot is unaffected. {{if pipeline}} T1 {{else if pipeline}} T0 {{end}} To simplify the appearance of if-else chains, the else action of an if may include another if directly; the effect is exactly the same as writing {{if pipeline}} T1 {{else}}{{if pipeline}} T0 {{end}}{{end}} {{range pipeline}} T1 {{end}} The value of the pipeline must be an array, slice, map, or channel. If the value of the pipeline has length zero, nothing is output; otherwise, dot is set to the successive elements of the array, slice, or map and T1 is executed. If the value is a map and the keys are of basic type with a defined order, the elements will be visited in sorted key order. {{range pipeline}} T1 {{else}} T0 {{end}} The value of the pipeline must be an array, slice, map, or channel. If the value of the pipeline has length zero, dot is unaffected and T0 is executed; otherwise, dot is set to the successive elements of the array, slice, or map and T1 is executed. {{break}} The innermost {{range pipeline}} loop is ended early, stopping the current iteration and bypassing all remaining iterations. {{continue}} The current iteration of the innermost {{range pipeline}} loop is stopped, and the loop starts the next iteration. {{template "name"}} The template with the specified name is executed with nil data. {{template "name" pipeline}} The template with the specified name is executed with dot set to the value of the pipeline. {{block "name" pipeline}} T1 {{end}} A block is shorthand for defining a template {{define "name"}} T1 {{end}} and then executing it in place {{template "name" pipeline}} The typical use is to define a set of root templates that are then customized by redefining the block templates within. {{with pipeline}} T1 {{end}} If the value of the pipeline is empty, no output is generated; otherwise, dot is set to the value of the pipeline and T1 is executed. {{with pipeline}} T1 {{else}} T0 {{end}} If the value of the pipeline is empty, dot is unaffected and T0 is executed; otherwise, dot is set to the value of the pipeline and T1 is executed. ``` ### Arguments An argument is a simple value, denoted by one of the following. * A boolean, string, character, integer, floating-point, imaginary or complex constant in Go syntax. These behave like Go's untyped constants. Note that, as in Go, whether a large integer constant overflows when assigned or passed to a function can depend on whether the host machine's ints are 32 or 64 bits. * The keyword nil, representing an untyped Go nil. * The character '.' (period): . The result is the value of dot. * A variable name, which is a (possibly empty) alphanumeric string preceded by a dollar sign, such as $piOver2 or $ The result is the value of the variable. Variables are described below. * The name of a field of the data, which must be a struct, preceded by a period, such as .Field The result is the value of the field. Field invocations may be chained: .Field1.Field2 Fields can also be evaluated on variables, including chaining: $x.Field1.Field2 * The name of a key of the data, which must be a map, preceded by a period, such as .Key The result is the map element value indexed by the key. Key invocations may be chained and combined with fields to any depth: .Field1.Key1.Field2.Key2 Although the key must be an alphanumeric identifier, unlike with field names they do not need to start with an upper case letter. 
Keys can also be evaluated on variables, including chaining: $x.key1.key2 * The name of a niladic method of the data, preceded by a period, such as .Method The result is the value of invoking the method with dot as the receiver, dot.Method(). Such a method must have one return value (of any type) or two return values, the second of which is an error. If it has two and the returned error is non-nil, execution terminates and an error is returned to the caller as the value of Execute. Method invocations may be chained and combined with fields and keys to any depth: .Field1.Key1.Method1.Field2.Key2.Method2 Methods can also be evaluated on variables, including chaining: $x.Method1.Field * The name of a niladic function, such as fun The result is the value of invoking the function, fun(). The return types and values behave as in methods. Functions and function names are described below. * A parenthesized instance of one the above, for grouping. The result may be accessed by a field or map key invocation. print (.F1 arg1) (.F2 arg2) (.StructValuedMethod "arg").Field Arguments may evaluate to any type; if they are pointers the implementation automatically indirects to the base type when required. If an evaluation yields a function value, such as a function-valued field of a struct, the function is not invoked automatically, but it can be used as a truth value for an if action and the like. To invoke it, use the call function, defined below. ### Pipelines A pipeline is a possibly chained sequence of "commands". A command is a simple value (argument) or a function or method call, possibly with multiple arguments: ``` Argument The result is the value of evaluating the argument. .Method [Argument...] The method can be alone or the last element of a chain but, unlike methods in the middle of a chain, it can take arguments. The result is the value of calling the method with the arguments: dot.Method(Argument1, etc.) functionName [Argument...] The result is the value of calling the function associated with the name: function(Argument1, etc.) Functions and function names are described below. ``` A pipeline may be "chained" by separating a sequence of commands with pipeline characters '|'. In a chained pipeline, the result of each command is passed as the last argument of the following command. The output of the final command in the pipeline is the value of the pipeline. The output of a command will be either one value or two values, the second of which has type error. If that second value is present and evaluates to non-nil, execution terminates and the error is returned to the caller of Execute. ### Variables A pipeline inside an action may initialize a variable to capture the result. The initialization has syntax ``` $variable := pipeline ``` where $variable is the name of the variable. An action that declares a variable produces no output. Variables previously declared can also be assigned, using the syntax ``` $variable = pipeline ``` If a "range" action initializes a variable, the variable is set to the successive elements of the iteration. Also, a "range" may declare two variables, separated by a comma: ``` range $index, $element := pipeline ``` in which case $index and $element are set to the successive values of the array/slice index or map key and element, respectively. Note that if there is only one variable, it is assigned the element; this is opposite to the convention in Go range clauses. 
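Here is a short sketch of the two-variable form of "range" just described; the template text and data are illustrative.

```
package main

import (
	"os"
	"text/template"
)

func main() {
	// $i receives the index and $v the element, in that order.
	const text = "{{range $i, $v := .}}{{$i}}: {{$v}}\n{{end}}"
	t := template.Must(template.New("list").Parse(text))
	if err := t.Execute(os.Stdout, []string{"red", "green", "blue"}); err != nil {
		panic(err)
	}
	// Output:
	// 0: red
	// 1: green
	// 2: blue
}
```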
A variable's scope extends to the "end" action of the control structure ("if", "with", or "range") in which it is declared, or to the end of the template if there is no such control structure. A template invocation does not inherit variables from the point of its invocation. When execution begins, $ is set to the data argument passed to Execute, that is, to the starting value of dot. ### Examples Here are some example one-line templates demonstrating pipelines and variables. All produce the quoted word "output": ``` {{"\"output\""}} A string constant. {{`"output"`}} A raw string constant. {{printf "%q" "output"}} A function call. {{"output" | printf "%q"}} A function call whose final argument comes from the previous command. {{printf "%q" (print "out" "put")}} A parenthesized argument. {{"put" | printf "%s%s" "out" | printf "%q"}} A more elaborate call. {{"output" | printf "%s" | printf "%q"}} A longer chain. {{with "output"}}{{printf "%q" .}}{{end}} A with action using dot. {{with $x := "output" | printf "%q"}}{{$x}}{{end}} A with action that creates and uses a variable. {{with $x := "output"}}{{printf "%q" $x}}{{end}} A with action that uses the variable in another action. {{with $x := "output"}}{{$x | printf "%q"}}{{end}} The same, but pipelined. ``` ### Functions During execution functions are found in two function maps: first in the template, then in the global function map. By default, no functions are defined in the template but the Funcs method can be used to add them. Predefined global functions are named as follows. ``` and Returns the boolean AND of its arguments by returning the first empty argument or the last argument. That is, "and x y" behaves as "if x then y else x." Evaluation proceeds through the arguments left to right and returns when the result is determined. call Returns the result of calling the first argument, which must be a function, with the remaining arguments as parameters. Thus "call .X.Y 1 2" is, in Go notation, dot.X.Y(1, 2) where Y is a func-valued field, map entry, or the like. The first argument must be the result of an evaluation that yields a value of function type (as distinct from a predefined function such as print). The function must return either one or two result values, the second of which is of type error. If the arguments don't match the function or the returned error value is non-nil, execution stops. html Returns the escaped HTML equivalent of the textual representation of its arguments. This function is unavailable in html/template, with a few exceptions. index Returns the result of indexing its first argument by the following arguments. Thus "index x 1 2 3" is, in Go syntax, x[1][2][3]. Each indexed item must be a map, slice, or array. slice slice returns the result of slicing its first argument by the remaining arguments. Thus "slice x 1 2" is, in Go syntax, x[1:2], while "slice x" is x[:], "slice x 1" is x[1:], and "slice x 1 2 3" is x[1:2:3]. The first argument must be a string, slice, or array. js Returns the escaped JavaScript equivalent of the textual representation of its arguments. len Returns the integer length of its argument. not Returns the boolean negation of its single argument. or Returns the boolean OR of its arguments by returning the first non-empty argument or the last argument, that is, "or x y" behaves as "if x then x else y". Evaluation proceeds through the arguments left to right and returns when the result is determined. 
print An alias for fmt.Sprint printf An alias for fmt.Sprintf println An alias for fmt.Sprintln urlquery Returns the escaped value of the textual representation of its arguments in a form suitable for embedding in a URL query. This function is unavailable in html/template, with a few exceptions. ``` The boolean functions take any zero value to be false and a non-zero value to be true. There is also a set of binary comparison operators defined as functions: ``` eq Returns the boolean truth of arg1 == arg2 ne Returns the boolean truth of arg1 != arg2 lt Returns the boolean truth of arg1 < arg2 le Returns the boolean truth of arg1 <= arg2 gt Returns the boolean truth of arg1 > arg2 ge Returns the boolean truth of arg1 >= arg2 ``` For simpler multi-way equality tests, eq (only) accepts two or more arguments and compares the second and subsequent to the first, returning in effect ``` arg1==arg2 || arg1==arg3 || arg1==arg4 ... ``` (Unlike with || in Go, however, eq is a function call and all the arguments will be evaluated.) The comparison functions work on any values whose type Go defines as comparable. For basic types such as integers, the rules are relaxed: size and exact type are ignored, so any integer value, signed or unsigned, may be compared with any other integer value. (The arithmetic value is compared, not the bit pattern, so all negative integers are less than all unsigned integers.) However, as usual, one may not compare an int with a float32 and so on. ### Associated templates Each template is named by a string specified when it is created. Also, each template is associated with zero or more other templates that it may invoke by name; such associations are transitive and form a name space of templates. A template may use a template invocation to instantiate another associated template; see the explanation of the "template" action above. The name must be that of a template associated with the template that contains the invocation. ### Nested template definitions When parsing a template, another template may be defined and associated with the template being parsed. Template definitions must appear at the top level of the template, much like global variables in a Go program. The syntax of such definitions is to surround each template declaration with a "define" and "end" action. The define action names the template being created by providing a string constant. Here is a simple example: ``` {{define "T1"}}ONE{{end}} {{define "T2"}}TWO{{end}} {{define "T3"}}{{template "T1"}} {{template "T2"}}{{end}} {{template "T3"}} ``` This defines two templates, T1 and T2, and a third T3 that invokes the other two when it is executed. Finally it invokes T3. If executed this template will produce the text ``` ONE TWO ``` By construction, a template may reside in only one association. If it's necessary to have a template addressable from multiple associations, the template definition must be parsed multiple times to create distinct \*Template values, or must be copied with the Clone or AddParseTree method. Parse may be called multiple times to assemble the various associated templates; see the ParseFiles and ParseGlob functions and methods for simple ways to parse related templates stored in files. A template may be executed directly or through ExecuteTemplate, which executes an associated template identified by name. 
To invoke our example above, we might write, ``` err := tmpl.Execute(os.Stdout, "no data needed") if err != nil { log.Fatalf("execution failed: %s", err) } ``` or to invoke a particular template explicitly by name, ``` err := tmpl.ExecuteTemplate(os.Stdout, "T2", "no data needed") if err != nil { log.Fatalf("execution failed: %s", err) } ``` Index ----- * [func HTMLEscape(w io.Writer, b []byte)](#HTMLEscape) * [func HTMLEscapeString(s string) string](#HTMLEscapeString) * [func HTMLEscaper(args ...any) string](#HTMLEscaper) * [func IsTrue(val any) (truth, ok bool)](#IsTrue) * [func JSEscape(w io.Writer, b []byte)](#JSEscape) * [func JSEscapeString(s string) string](#JSEscapeString) * [func JSEscaper(args ...any) string](#JSEscaper) * [func URLQueryEscaper(args ...any) string](#URLQueryEscaper) * [type ExecError](#ExecError) * [func (e ExecError) Error() string](#ExecError.Error) * [func (e ExecError) Unwrap() error](#ExecError.Unwrap) * [type FuncMap](#FuncMap) * [type Template](#Template) * [func Must(t \*Template, err error) \*Template](#Must) * [func New(name string) \*Template](#New) * [func ParseFS(fsys fs.FS, patterns ...string) (\*Template, error)](#ParseFS) * [func ParseFiles(filenames ...string) (\*Template, error)](#ParseFiles) * [func ParseGlob(pattern string) (\*Template, error)](#ParseGlob) * [func (t \*Template) AddParseTree(name string, tree \*parse.Tree) (\*Template, error)](#Template.AddParseTree) * [func (t \*Template) Clone() (\*Template, error)](#Template.Clone) * [func (t \*Template) DefinedTemplates() string](#Template.DefinedTemplates) * [func (t \*Template) Delims(left, right string) \*Template](#Template.Delims) * [func (t \*Template) Execute(wr io.Writer, data any) error](#Template.Execute) * [func (t \*Template) ExecuteTemplate(wr io.Writer, name string, data any) error](#Template.ExecuteTemplate) * [func (t \*Template) Funcs(funcMap FuncMap) \*Template](#Template.Funcs) * [func (t \*Template) Lookup(name string) \*Template](#Template.Lookup) * [func (t \*Template) Name() string](#Template.Name) * [func (t \*Template) New(name string) \*Template](#Template.New) * [func (t \*Template) Option(opt ...string) \*Template](#Template.Option) * [func (t \*Template) Parse(text string) (\*Template, error)](#Template.Parse) * [func (t \*Template) ParseFS(fsys fs.FS, patterns ...string) (\*Template, error)](#Template.ParseFS) * [func (t \*Template) ParseFiles(filenames ...string) (\*Template, error)](#Template.ParseFiles) * [func (t \*Template) ParseGlob(pattern string) (\*Template, error)](#Template.ParseGlob) * [func (t \*Template) Templates() []\*Template](#Template.Templates) ### Examples [Template](#example_Template) [Template (Block)](#example_Template_block) [Template (Func)](#example_Template_func) [Template (Glob)](#example_Template_glob) [Template (Helpers)](#example_Template_helpers) [Template (Share)](#example_Template_share) ### Package files doc.go exec.go funcs.go helper.go option.go template.go func HTMLEscape --------------- ``` func HTMLEscape(w io.Writer, b []byte) ``` HTMLEscape writes to w the escaped HTML equivalent of the plain text data b. func HTMLEscapeString --------------------- ``` func HTMLEscapeString(s string) string ``` HTMLEscapeString returns the escaped HTML equivalent of the plain text data s. func HTMLEscaper ---------------- ``` func HTMLEscaper(args ...any) string ``` HTMLEscaper returns the escaped HTML equivalent of the textual representation of its arguments. 
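A brief sketch of the HTML escaping helpers documented above; the input strings are arbitrary.

```
package main

import (
	"fmt"
	"os"
	"text/template"
)

func main() {
	s := `<a href="/x?a=1&b=2">link</a>`
	// HTMLEscapeString returns the input with <, >, &, ' and "
	// replaced by HTML entities.
	fmt.Println(template.HTMLEscapeString(s))

	// HTMLEscape writes the escaped form of a byte slice to a Writer.
	template.HTMLEscape(os.Stdout, []byte("<b>bold</b>\n"))
}
```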
func IsTrue 1.6 --------------- ``` func IsTrue(val any) (truth, ok bool) ``` IsTrue reports whether the value is 'true', in the sense of not the zero of its type, and whether the value has a meaningful truth value. This is the definition of truth used by if and other such actions. func JSEscape ------------- ``` func JSEscape(w io.Writer, b []byte) ``` JSEscape writes to w the escaped JavaScript equivalent of the plain text data b. func JSEscapeString ------------------- ``` func JSEscapeString(s string) string ``` JSEscapeString returns the escaped JavaScript equivalent of the plain text data s. func JSEscaper -------------- ``` func JSEscaper(args ...any) string ``` JSEscaper returns the escaped JavaScript equivalent of the textual representation of its arguments. func URLQueryEscaper -------------------- ``` func URLQueryEscaper(args ...any) string ``` URLQueryEscaper returns the escaped value of the textual representation of its arguments in a form suitable for embedding in a URL query. type ExecError 1.6 ------------------ ExecError is the custom error type returned when Execute has an error evaluating its template. (If a write error occurs, the actual error is returned; it will not be of type ExecError.) ``` type ExecError struct { Name string // Name of template. Err error // Pre-formatted error. } ``` ### func (ExecError) Error 1.6 ``` func (e ExecError) Error() string ``` ### func (ExecError) Unwrap 1.13 ``` func (e ExecError) Unwrap() error ``` type FuncMap ------------ FuncMap is the type of the map defining the mapping from names to functions. Each function must have either a single return value, or two return values of which the second has type error. In that case, if the second (error) return value evaluates to non-nil during execution, execution terminates and Execute returns that error. Errors returned by Execute wrap the underlying error; call errors.As to uncover them. When template execution invokes a function with an argument list, that list must be assignable to the function's parameter types. Functions meant to apply to arguments of arbitrary type can use parameters of type interface{} or of type reflect.Value. Similarly, functions meant to return a result of arbitrary type can return interface{} or reflect.Value. ``` type FuncMap map[string]any ``` type Template ------------- Template is the representation of a parsed template. The \*parse.Tree field is exported only for use by html/template and should be treated as unexported by all other clients. ``` type Template struct { *parse.Tree // contains filtered or unexported fields } ``` #### Example Code: ``` // Define a template. const letter = ` Dear {{.Name}}, {{if .Attended}} It was a pleasure to see you at the wedding. {{- else}} It is a shame you couldn't make it to the wedding. {{- end}} {{with .Gift -}} Thank you for the lovely {{.}}. {{end}} Best wishes, Josie ` // Prepare some data to insert into the template. type Recipient struct { Name, Gift string Attended bool } var recipients = []Recipient{ {"Aunt Mildred", "bone china tea set", true}, {"Uncle John", "moleskin pants", false}, {"Cousin Rodney", "", false}, } // Create a new template and parse the letter into it. t := template.Must(template.New("letter").Parse(letter)) // Execute the template for each recipient. for _, r := range recipients { err := t.Execute(os.Stdout, r) if err != nil { log.Println("executing template:", err) } } ``` Output: ``` Dear Aunt Mildred, It was a pleasure to see you at the wedding. Thank you for the lovely bone china tea set. 
Best wishes, Josie Dear Uncle John, It is a shame you couldn't make it to the wedding. Thank you for the lovely moleskin pants. Best wishes, Josie Dear Cousin Rodney, It is a shame you couldn't make it to the wedding. Best wishes, Josie ``` #### Example (Block) Code: ``` const ( master = `Names:{{block "list" .}}{{"\n"}}{{range .}}{{println "-" .}}{{end}}{{end}}` overlay = `{{define "list"}} {{join . ", "}}{{end}} ` ) var ( funcs = template.FuncMap{"join": strings.Join} guardians = []string{"Gamora", "Groot", "Nebula", "Rocket", "Star-Lord"} ) masterTmpl, err := template.New("master").Funcs(funcs).Parse(master) if err != nil { log.Fatal(err) } overlayTmpl, err := template.Must(masterTmpl.Clone()).Parse(overlay) if err != nil { log.Fatal(err) } if err := masterTmpl.Execute(os.Stdout, guardians); err != nil { log.Fatal(err) } if err := overlayTmpl.Execute(os.Stdout, guardians); err != nil { log.Fatal(err) } ``` Output: ``` Names: - Gamora - Groot - Nebula - Rocket - Star-Lord Names: Gamora, Groot, Nebula, Rocket, Star-Lord ``` #### Example (Func) This example demonstrates a custom function to process template text. It installs the strings.Title function and uses it to Make Title Text Look Good In Our Template's Output. Code: ``` // First we create a FuncMap with which to register the function. funcMap := template.FuncMap{ // The name "title" is what the function will be called in the template text. "title": strings.Title, } // A simple template definition to test our function. // We print the input text several ways: // - the original // - title-cased // - title-cased and then printed with %q // - printed with %q and then title-cased. const templateText = ` Input: {{printf "%q" .}} Output 0: {{title .}} Output 1: {{title . | printf "%q"}} Output 2: {{printf "%q" . | title}} ` // Create a template, add the function map, and parse the text. tmpl, err := template.New("titleTest").Funcs(funcMap).Parse(templateText) if err != nil { log.Fatalf("parsing: %s", err) } // Run the template to verify the output. err = tmpl.Execute(os.Stdout, "the go programming language") if err != nil { log.Fatalf("execution: %s", err) } ``` Output: ``` Input: "the go programming language" Output 0: The Go Programming Language Output 1: "The Go Programming Language" Output 2: "The Go Programming Language" ``` #### Example (Glob) Here we demonstrate loading a set of templates from a directory. Code: ``` // Here we create a temporary directory and populate it with our sample // template definition files; usually the template files would already // exist in some location known to the program. dir := createTestDir([]templateFile{ // T0.tmpl is a plain template file that just invokes T1. {"T0.tmpl", `T0 invokes T1: ({{template "T1"}})`}, // T1.tmpl defines a template, T1 that invokes T2. {"T1.tmpl", `{{define "T1"}}T1 invokes T2: ({{template "T2"}}){{end}}`}, // T2.tmpl defines a template T2. {"T2.tmpl", `{{define "T2"}}This is T2{{end}}`}, }) // Clean up after the test; another quirk of running as an example. defer os.RemoveAll(dir) // pattern is the glob pattern used to find all the template files. pattern := filepath.Join(dir, "*.tmpl") // Here starts the example proper. // T0.tmpl is the first name matched, so it becomes the starting template, // the value returned by ParseGlob. 
tmpl := template.Must(template.ParseGlob(pattern)) err := tmpl.Execute(os.Stdout, nil) if err != nil { log.Fatalf("template execution: %s", err) } ``` Output: ``` T0 invokes T1: (T1 invokes T2: (This is T2)) ``` #### Example (Helpers) This example demonstrates one way to share some templates and use them in different contexts. In this variant we add multiple driver templates by hand to an existing bundle of templates. Code: ``` // Here we create a temporary directory and populate it with our sample // template definition files; usually the template files would already // exist in some location known to the program. dir := createTestDir([]templateFile{ // T1.tmpl defines a template, T1 that invokes T2. {"T1.tmpl", `{{define "T1"}}T1 invokes T2: ({{template "T2"}}){{end}}`}, // T2.tmpl defines a template T2. {"T2.tmpl", `{{define "T2"}}This is T2{{end}}`}, }) // Clean up after the test; another quirk of running as an example. defer os.RemoveAll(dir) // pattern is the glob pattern used to find all the template files. pattern := filepath.Join(dir, "*.tmpl") // Here starts the example proper. // Load the helpers. templates := template.Must(template.ParseGlob(pattern)) // Add one driver template to the bunch; we do this with an explicit template definition. _, err := templates.Parse("{{define `driver1`}}Driver 1 calls T1: ({{template `T1`}})\n{{end}}") if err != nil { log.Fatal("parsing driver1: ", err) } // Add another driver template. _, err = templates.Parse("{{define `driver2`}}Driver 2 calls T2: ({{template `T2`}})\n{{end}}") if err != nil { log.Fatal("parsing driver2: ", err) } // We load all the templates before execution. This package does not require // that behavior but html/template's escaping does, so it's a good habit. err = templates.ExecuteTemplate(os.Stdout, "driver1", nil) if err != nil { log.Fatalf("driver1 execution: %s", err) } err = templates.ExecuteTemplate(os.Stdout, "driver2", nil) if err != nil { log.Fatalf("driver2 execution: %s", err) } ``` Output: ``` Driver 1 calls T1: (T1 invokes T2: (This is T2)) Driver 2 calls T2: (This is T2) ``` #### Example (Share) This example demonstrates how to use one group of driver templates with distinct sets of helper templates. Code: ``` // Here we create a temporary directory and populate it with our sample // template definition files; usually the template files would already // exist in some location known to the program. dir := createTestDir([]templateFile{ // T0.tmpl is a plain template file that just invokes T1. {"T0.tmpl", "T0 ({{.}} version) invokes T1: ({{template `T1`}})\n"}, // T1.tmpl defines a template, T1 that invokes T2. Note T2 is not defined {"T1.tmpl", `{{define "T1"}}T1 invokes T2: ({{template "T2"}}){{end}}`}, }) // Clean up after the test; another quirk of running as an example. defer os.RemoveAll(dir) // pattern is the glob pattern used to find all the template files. pattern := filepath.Join(dir, "*.tmpl") // Here starts the example proper. // Load the drivers. drivers := template.Must(template.ParseGlob(pattern)) // We must define an implementation of the T2 template. First we clone // the drivers, then add a definition of T2 to the template name space. // 1. Clone the helper set to create a new name space from which to run them. first, err := drivers.Clone() if err != nil { log.Fatal("cloning helpers: ", err) } // 2. Define T2, version A, and parse it. 
_, err = first.Parse("{{define `T2`}}T2, version A{{end}}") if err != nil { log.Fatal("parsing T2: ", err) } // Now repeat the whole thing, using a different version of T2. // 1. Clone the drivers. second, err := drivers.Clone() if err != nil { log.Fatal("cloning drivers: ", err) } // 2. Define T2, version B, and parse it. _, err = second.Parse("{{define `T2`}}T2, version B{{end}}") if err != nil { log.Fatal("parsing T2: ", err) } // Execute the templates in the reverse order to verify the // first is unaffected by the second. err = second.ExecuteTemplate(os.Stdout, "T0.tmpl", "second") if err != nil { log.Fatalf("second execution: %s", err) } err = first.ExecuteTemplate(os.Stdout, "T0.tmpl", "first") if err != nil { log.Fatalf("first: execution: %s", err) } ``` Output: ``` T0 (second version) invokes T1: (T1 invokes T2: (T2, version B)) T0 (first version) invokes T1: (T1 invokes T2: (T2, version A)) ``` ### func Must ``` func Must(t *Template, err error) *Template ``` Must is a helper that wraps a call to a function returning (\*Template, error) and panics if the error is non-nil. It is intended for use in variable initializations such as ``` var t = template.Must(template.New("name").Parse("text")) ``` ### func New ``` func New(name string) *Template ``` New allocates a new, undefined template with the given name. ### func ParseFS 1.16 ``` func ParseFS(fsys fs.FS, patterns ...string) (*Template, error) ``` ParseFS is like ParseFiles or ParseGlob but reads from the file system fsys instead of the host operating system's file system. It accepts a list of glob patterns. (Note that most file names serve as glob patterns matching only themselves.) ### func ParseFiles ``` func ParseFiles(filenames ...string) (*Template, error) ``` ParseFiles creates a new Template and parses the template definitions from the named files. The returned template's name will have the base name and parsed contents of the first file. There must be at least one file. If an error occurs, parsing stops and the returned \*Template is nil. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. For instance, ParseFiles("a/foo", "b/foo") stores "b/foo" as the template named "foo", while "a/foo" is unavailable. ### func ParseGlob ``` func ParseGlob(pattern string) (*Template, error) ``` ParseGlob creates a new Template and parses the template definitions from the files identified by the pattern. The files are matched according to the semantics of filepath.Match, and the pattern must match at least one file. The returned template will have the (base) name and (parsed) contents of the first file matched by the pattern. ParseGlob is equivalent to calling ParseFiles with the list of files matched by the pattern. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. ### func (\*Template) AddParseTree ``` func (t *Template) AddParseTree(name string, tree *parse.Tree) (*Template, error) ``` AddParseTree associates the argument parse tree with the template t, giving it the specified name. If the template has not been defined, this tree becomes its definition. If it has been defined and already has that name, the existing definition is replaced; otherwise a new template is created, defined, and returned. ### func (\*Template) Clone ``` func (t *Template) Clone() (*Template, error) ``` Clone returns a duplicate of the template, including all associated templates. 
The actual representation is not copied, but the name space of associated templates is, so further calls to Parse in the copy will add templates to the copy but not to the original. Clone can be used to prepare common templates and use them with variant definitions for other templates by adding the variants after the clone is made. ### func (\*Template) DefinedTemplates 1.5 ``` func (t *Template) DefinedTemplates() string ``` DefinedTemplates returns a string listing the defined templates, prefixed by the string "; defined templates are: ". If there are none, it returns the empty string. For generating an error message here and in html/template. ### func (\*Template) Delims ``` func (t *Template) Delims(left, right string) *Template ``` Delims sets the action delimiters to the specified strings, to be used in subsequent calls to Parse, ParseFiles, or ParseGlob. Nested template definitions will inherit the settings. An empty delimiter stands for the corresponding default: {{ or }}. The return value is the template, so calls can be chained. ### func (\*Template) Execute ``` func (t *Template) Execute(wr io.Writer, data any) error ``` Execute applies a parsed template to the specified data object, and writes the output to wr. If an error occurs executing the template or writing its output, execution stops, but partial results may already have been written to the output writer. A template may be executed safely in parallel, although if parallel executions share a Writer the output may be interleaved. If data is a reflect.Value, the template applies to the concrete value that the reflect.Value holds, as in fmt.Print. ### func (\*Template) ExecuteTemplate ``` func (t *Template) ExecuteTemplate(wr io.Writer, name string, data any) error ``` ExecuteTemplate applies the template associated with t that has the given name to the specified data object and writes the output to wr. If an error occurs executing the template or writing its output, execution stops, but partial results may already have been written to the output writer. A template may be executed safely in parallel, although if parallel executions share a Writer the output may be interleaved. ### func (\*Template) Funcs ``` func (t *Template) Funcs(funcMap FuncMap) *Template ``` Funcs adds the elements of the argument map to the template's function map. It must be called before the template is parsed. It panics if a value in the map is not a function with appropriate return type or if the name cannot be used syntactically as a function in a template. It is legal to overwrite elements of the map. The return value is the template, so calls can be chained. ### func (\*Template) Lookup ``` func (t *Template) Lookup(name string) *Template ``` Lookup returns the template with the given name that is associated with t. It returns nil if there is no such template or the template has no definition. ### func (\*Template) Name ``` func (t *Template) Name() string ``` Name returns the name of the template. ### func (\*Template) New ``` func (t *Template) New(name string) *Template ``` New allocates a new, undefined template associated with the given one and with the same delimiters. The association, which is transitive, allows one template to invoke another with a {{template}} action. Because associated templates share underlying data, template construction cannot be done safely in parallel. Once the templates are constructed, they can be executed in parallel. 
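To illustrate the association created by New, here is a minimal sketch in which a root template invokes an associated template by name; the template names and contents are illustrative.

```
package main

import (
	"log"
	"os"
	"text/template"
)

func main() {
	root := template.New("root")
	// "item" is associated with "root", so "root" can invoke it
	// with a {{template}} action.
	template.Must(root.New("item").Parse("- {{.}}\n"))
	template.Must(root.Parse("Items:\n{{range .}}{{template \"item\" .}}{{end}}"))

	if err := root.Execute(os.Stdout, []string{"alpha", "beta"}); err != nil {
		log.Fatal(err)
	}
	// Output:
	// Items:
	// - alpha
	// - beta
}
```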
### func (\*Template) Option 1.5 ``` func (t *Template) Option(opt ...string) *Template ``` Option sets options for the template. Options are described by strings, either a simple string or "key=value". There can be at most one equals sign in an option string. If the option string is unrecognized or otherwise invalid, Option panics. Known options: missingkey: Control the behavior during execution if a map is indexed with a key that is not present in the map. ``` "missingkey=default" or "missingkey=invalid" The default behavior: Do nothing and continue execution. If printed, the result of the index operation is the string "<no value>". "missingkey=zero" The operation returns the zero value for the map type's element. "missingkey=error" Execution stops immediately with an error. ``` ### func (\*Template) Parse ``` func (t *Template) Parse(text string) (*Template, error) ``` Parse parses text as a template body for t. Named template definitions ({{define ...}} or {{block ...}} statements) in text define additional templates associated with t and are removed from the definition of t itself. Templates can be redefined in successive calls to Parse. A template definition with a body containing only white space and comments is considered empty and will not replace an existing template's body. This allows using Parse to add new named template definitions without overwriting the main template body. ### func (\*Template) ParseFS 1.16 ``` func (t *Template) ParseFS(fsys fs.FS, patterns ...string) (*Template, error) ``` ParseFS is like ParseFiles or ParseGlob but reads from the file system fsys instead of the host operating system's file system. It accepts a list of glob patterns. (Note that most file names serve as glob patterns matching only themselves.) ### func (\*Template) ParseFiles ``` func (t *Template) ParseFiles(filenames ...string) (*Template, error) ``` ParseFiles parses the named files and associates the resulting templates with t. If an error occurs, parsing stops and the returned template is nil; otherwise it is t. There must be at least one file. Since the templates created by ParseFiles are named by the base names of the argument files, t should usually have the name of one of the (base) names of the files. If it does not, depending on t's contents before calling ParseFiles, t.Execute may fail. In that case use t.ExecuteTemplate to execute a valid template. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. ### func (\*Template) ParseGlob ``` func (t *Template) ParseGlob(pattern string) (*Template, error) ``` ParseGlob parses the template definitions in the files identified by the pattern and associates the resulting templates with t. The files are matched according to the semantics of filepath.Match, and the pattern must match at least one file. ParseGlob is equivalent to calling t.ParseFiles with the list of files matched by the pattern. When parsing multiple files with the same name in different directories, the last one mentioned will be the one that results. ### func (\*Template) Templates ``` func (t *Template) Templates() []*Template ``` Templates returns a slice of defined templates associated with t. Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [parse](parse/index) | Package parse builds parse trees for templates as defined by text/template and html/template. |
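Looking back at the Option settings described earlier, here is a brief sketch of "missingkey=error"; the template and map contents are illustrative.

```
package main

import (
	"fmt"
	"os"
	"text/template"
)

func main() {
	t := template.Must(template.New("m").
		Option("missingkey=error").
		Parse("{{.present}} {{.missing}}\n"))

	// "missing" is not a key in the map, so Execute stops with an
	// error; partial output may already have been written.
	err := t.Execute(os.Stdout, map[string]string{"present": "hello"})
	fmt.Println("error:", err)
}
```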
go Package parse Package parse ============== * `import "text/template/parse"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package parse builds parse trees for templates as defined by text/template and html/template. Clients should use those packages to construct templates rather than this one, which provides shared internal data structures not intended for general use. Index ----- * [func IsEmptyTree(n Node) bool](#IsEmptyTree) * [func Parse(name, text, leftDelim, rightDelim string, funcs ...map[string]any) (map[string]\*Tree, error)](#Parse) * [type ActionNode](#ActionNode) * [func (a \*ActionNode) Copy() Node](#ActionNode.Copy) * [func (a \*ActionNode) String() string](#ActionNode.String) * [type BoolNode](#BoolNode) * [func (b \*BoolNode) Copy() Node](#BoolNode.Copy) * [func (b \*BoolNode) String() string](#BoolNode.String) * [type BranchNode](#BranchNode) * [func (b \*BranchNode) Copy() Node](#BranchNode.Copy) * [func (b \*BranchNode) String() string](#BranchNode.String) * [type BreakNode](#BreakNode) * [func (b \*BreakNode) Copy() Node](#BreakNode.Copy) * [func (b \*BreakNode) String() string](#BreakNode.String) * [type ChainNode](#ChainNode) * [func (c \*ChainNode) Add(field string)](#ChainNode.Add) * [func (c \*ChainNode) Copy() Node](#ChainNode.Copy) * [func (c \*ChainNode) String() string](#ChainNode.String) * [type CommandNode](#CommandNode) * [func (c \*CommandNode) Copy() Node](#CommandNode.Copy) * [func (c \*CommandNode) String() string](#CommandNode.String) * [type CommentNode](#CommentNode) * [func (c \*CommentNode) Copy() Node](#CommentNode.Copy) * [func (c \*CommentNode) String() string](#CommentNode.String) * [type ContinueNode](#ContinueNode) * [func (c \*ContinueNode) Copy() Node](#ContinueNode.Copy) * [func (c \*ContinueNode) String() string](#ContinueNode.String) * [type DotNode](#DotNode) * [func (d \*DotNode) Copy() Node](#DotNode.Copy) * [func (d \*DotNode) String() string](#DotNode.String) * [func (d \*DotNode) Type() NodeType](#DotNode.Type) * [type FieldNode](#FieldNode) * [func (f \*FieldNode) Copy() Node](#FieldNode.Copy) * [func (f \*FieldNode) String() string](#FieldNode.String) * [type IdentifierNode](#IdentifierNode) * [func NewIdentifier(ident string) \*IdentifierNode](#NewIdentifier) * [func (i \*IdentifierNode) Copy() Node](#IdentifierNode.Copy) * [func (i \*IdentifierNode) SetPos(pos Pos) \*IdentifierNode](#IdentifierNode.SetPos) * [func (i \*IdentifierNode) SetTree(t \*Tree) \*IdentifierNode](#IdentifierNode.SetTree) * [func (i \*IdentifierNode) String() string](#IdentifierNode.String) * [type IfNode](#IfNode) * [func (i \*IfNode) Copy() Node](#IfNode.Copy) * [type ListNode](#ListNode) * [func (l \*ListNode) Copy() Node](#ListNode.Copy) * [func (l \*ListNode) CopyList() \*ListNode](#ListNode.CopyList) * [func (l \*ListNode) String() string](#ListNode.String) * [type Mode](#Mode) * [type NilNode](#NilNode) * [func (n \*NilNode) Copy() Node](#NilNode.Copy) * [func (n \*NilNode) String() string](#NilNode.String) * [func (n \*NilNode) Type() NodeType](#NilNode.Type) * [type Node](#Node) * [type NodeType](#NodeType) * [func (t NodeType) Type() NodeType](#NodeType.Type) * [type NumberNode](#NumberNode) * [func (n \*NumberNode) Copy() Node](#NumberNode.Copy) * [func (n \*NumberNode) String() string](#NumberNode.String) * [type PipeNode](#PipeNode) * [func (p \*PipeNode) Copy() Node](#PipeNode.Copy) * [func (p \*PipeNode) CopyPipe() \*PipeNode](#PipeNode.CopyPipe) * [func (p \*PipeNode) String() string](#PipeNode.String) * [type 
Pos](#Pos) * [func (p Pos) Position() Pos](#Pos.Position) * [type RangeNode](#RangeNode) * [func (r \*RangeNode) Copy() Node](#RangeNode.Copy) * [type StringNode](#StringNode) * [func (s \*StringNode) Copy() Node](#StringNode.Copy) * [func (s \*StringNode) String() string](#StringNode.String) * [type TemplateNode](#TemplateNode) * [func (t \*TemplateNode) Copy() Node](#TemplateNode.Copy) * [func (t \*TemplateNode) String() string](#TemplateNode.String) * [type TextNode](#TextNode) * [func (t \*TextNode) Copy() Node](#TextNode.Copy) * [func (t \*TextNode) String() string](#TextNode.String) * [type Tree](#Tree) * [func New(name string, funcs ...map[string]any) \*Tree](#New) * [func (t \*Tree) Copy() \*Tree](#Tree.Copy) * [func (t \*Tree) ErrorContext(n Node) (location, context string)](#Tree.ErrorContext) * [func (t \*Tree) Parse(text, leftDelim, rightDelim string, treeSet map[string]\*Tree, funcs ...map[string]any) (tree \*Tree, err error)](#Tree.Parse) * [type VariableNode](#VariableNode) * [func (v \*VariableNode) Copy() Node](#VariableNode.Copy) * [func (v \*VariableNode) String() string](#VariableNode.String) * [type WithNode](#WithNode) * [func (w \*WithNode) Copy() Node](#WithNode.Copy) ### Package files lex.go node.go parse.go func IsEmptyTree ---------------- ``` func IsEmptyTree(n Node) bool ``` IsEmptyTree reports whether this tree (node) is empty of everything but space or comments. func Parse ---------- ``` func Parse(name, text, leftDelim, rightDelim string, funcs ...map[string]any) (map[string]*Tree, error) ``` Parse returns a map from template name to parse.Tree, created by parsing the templates described in the argument string. The top-level template will be given the specified name. If an error is encountered, parsing stops and an empty map is returned with the error. type ActionNode --------------- ActionNode holds an action (something bounded by delimiters). Control actions have their own nodes; ActionNode represents simple ones such as field evaluations and parenthesized pipelines. ``` type ActionNode struct { NodeType Pos Line int // The line number in the input. Deprecated: Kept for compatibility. Pipe *PipeNode // The pipeline in the action. // contains filtered or unexported fields } ``` ### func (\*ActionNode) Copy ``` func (a *ActionNode) Copy() Node ``` ### func (\*ActionNode) String ``` func (a *ActionNode) String() string ``` type BoolNode ------------- BoolNode holds a boolean constant. ``` type BoolNode struct { NodeType Pos True bool // The value of the boolean constant. // contains filtered or unexported fields } ``` ### func (\*BoolNode) Copy ``` func (b *BoolNode) Copy() Node ``` ### func (\*BoolNode) String ``` func (b *BoolNode) String() string ``` type BranchNode --------------- BranchNode is the common representation of if, range, and with. ``` type BranchNode struct { NodeType Pos Line int // The line number in the input. Deprecated: Kept for compatibility. Pipe *PipeNode // The pipeline to be evaluated. List *ListNode // What to execute if the value is non-empty. ElseList *ListNode // What to execute if the value is empty (nil if absent). // contains filtered or unexported fields } ``` ### func (\*BranchNode) Copy 1.4 ``` func (b *BranchNode) Copy() Node ``` ### func (\*BranchNode) String ``` func (b *BranchNode) String() string ``` type BreakNode 1.18 ------------------- BreakNode represents a {{break}} action. 
``` type BreakNode struct { NodeType Pos Line int // contains filtered or unexported fields } ``` ### func (\*BreakNode) Copy 1.18 ``` func (b *BreakNode) Copy() Node ``` ### func (\*BreakNode) String 1.18 ``` func (b *BreakNode) String() string ``` type ChainNode 1.1 ------------------ ChainNode holds a term followed by a chain of field accesses (identifier starting with '.'). The names may be chained ('.x.y'). The periods are dropped from each ident. ``` type ChainNode struct { NodeType Pos Node Node Field []string // The identifiers in lexical order. // contains filtered or unexported fields } ``` ### func (\*ChainNode) Add 1.1 ``` func (c *ChainNode) Add(field string) ``` Add adds the named field (which should start with a period) to the end of the chain. ### func (\*ChainNode) Copy 1.1 ``` func (c *ChainNode) Copy() Node ``` ### func (\*ChainNode) String 1.1 ``` func (c *ChainNode) String() string ``` type CommandNode ---------------- CommandNode holds a command (a pipeline inside an evaluating action). ``` type CommandNode struct { NodeType Pos Args []Node // Arguments in lexical order: Identifier, field, or constant. // contains filtered or unexported fields } ``` ### func (\*CommandNode) Copy ``` func (c *CommandNode) Copy() Node ``` ### func (\*CommandNode) String ``` func (c *CommandNode) String() string ``` type CommentNode 1.16 --------------------- CommentNode holds a comment. ``` type CommentNode struct { NodeType Pos Text string // Comment text. // contains filtered or unexported fields } ``` ### func (\*CommentNode) Copy 1.16 ``` func (c *CommentNode) Copy() Node ``` ### func (\*CommentNode) String 1.16 ``` func (c *CommentNode) String() string ``` type ContinueNode 1.18 ---------------------- ContinueNode represents a {{continue}} action. ``` type ContinueNode struct { NodeType Pos Line int // contains filtered or unexported fields } ``` ### func (\*ContinueNode) Copy 1.18 ``` func (c *ContinueNode) Copy() Node ``` ### func (\*ContinueNode) String 1.18 ``` func (c *ContinueNode) String() string ``` type DotNode ------------ DotNode holds the special identifier '.'. ``` type DotNode struct { NodeType Pos // contains filtered or unexported fields } ``` ### func (\*DotNode) Copy ``` func (d *DotNode) Copy() Node ``` ### func (\*DotNode) String ``` func (d *DotNode) String() string ``` ### func (\*DotNode) Type ``` func (d *DotNode) Type() NodeType ``` type FieldNode -------------- FieldNode holds a field (identifier starting with '.'). The names may be chained ('.x.y'). The period is dropped from each ident. ``` type FieldNode struct { NodeType Pos Ident []string // The identifiers in lexical order. // contains filtered or unexported fields } ``` ### func (\*FieldNode) Copy ``` func (f *FieldNode) Copy() Node ``` ### func (\*FieldNode) String ``` func (f *FieldNode) String() string ``` type IdentifierNode ------------------- IdentifierNode holds an identifier. ``` type IdentifierNode struct { NodeType Pos Ident string // The identifier's name. // contains filtered or unexported fields } ``` ### func NewIdentifier ``` func NewIdentifier(ident string) *IdentifierNode ``` NewIdentifier returns a new IdentifierNode with the given identifier name. ### func (\*IdentifierNode) Copy ``` func (i *IdentifierNode) Copy() Node ``` ### func (\*IdentifierNode) SetPos 1.1 ``` func (i *IdentifierNode) SetPos(pos Pos) *IdentifierNode ``` SetPos sets the position. NewIdentifier is a public method so we can't modify its signature. Chained for convenience. TODO: fix one day? 
### func (\*IdentifierNode) SetTree 1.4 ``` func (i *IdentifierNode) SetTree(t *Tree) *IdentifierNode ``` SetTree sets the parent tree for the node. NewIdentifier is a public method so we can't modify its signature. Chained for convenience. TODO: fix one day? ### func (\*IdentifierNode) String ``` func (i *IdentifierNode) String() string ``` type IfNode ----------- IfNode represents an {{if}} action and its commands. ``` type IfNode struct { BranchNode } ``` ### func (\*IfNode) Copy ``` func (i *IfNode) Copy() Node ``` type ListNode ------------- ListNode holds a sequence of nodes. ``` type ListNode struct { NodeType Pos Nodes []Node // The element nodes in lexical order. // contains filtered or unexported fields } ``` ### func (\*ListNode) Copy ``` func (l *ListNode) Copy() Node ``` ### func (\*ListNode) CopyList ``` func (l *ListNode) CopyList() *ListNode ``` ### func (\*ListNode) String ``` func (l *ListNode) String() string ``` type Mode 1.16 -------------- A mode value is a set of flags (or 0). Modes control parser behavior. ``` type Mode uint ``` ``` const ( ParseComments Mode = 1 << iota // parse comments and add them to AST SkipFuncCheck // do not check that functions are defined ) ``` type NilNode 1.1 ---------------- NilNode holds the special identifier 'nil' representing an untyped nil constant. ``` type NilNode struct { NodeType Pos // contains filtered or unexported fields } ``` ### func (\*NilNode) Copy 1.1 ``` func (n *NilNode) Copy() Node ``` ### func (\*NilNode) String 1.1 ``` func (n *NilNode) String() string ``` ### func (\*NilNode) Type 1.1 ``` func (n *NilNode) Type() NodeType ``` type Node --------- A Node is an element in the parse tree. The interface is trivial. The interface contains an unexported method so that only types local to this package can satisfy it. ``` type Node interface { Type() NodeType String() string // Copy does a deep copy of the Node and all its components. // To avoid type assertions, some XxxNodes also have specialized // CopyXxx methods that return *XxxNode. Copy() Node Position() Pos // byte position of start of node in full original input string // contains filtered or unexported methods } ``` type NodeType ------------- NodeType identifies the type of a parse tree node. ``` type NodeType int ``` ``` const ( NodeText NodeType = iota // Plain text. NodeAction // A non-control action such as a field evaluation. NodeBool // A boolean constant. NodeChain // A sequence of field accesses. NodeCommand // An element of a pipeline. NodeDot // The cursor, dot. NodeField // A field or method name. NodeIdentifier // An identifier; always a function name. NodeIf // An if action. NodeList // A list of Nodes. NodeNil // An untyped nil constant. NodeNumber // A numerical constant. NodePipe // A pipeline of commands. NodeRange // A range action. NodeString // A string constant. NodeTemplate // A template invocation action. NodeVariable // A $ variable. NodeWith // A with action. NodeComment // A comment. NodeBreak // A break action. NodeContinue // A continue action. ) ``` ### func (NodeType) Type ``` func (t NodeType) Type() NodeType ``` Type returns itself and provides an easy default implementation for embedding in a Node. Embedded in all non-trivial Nodes. type NumberNode --------------- NumberNode holds a number: signed or unsigned integer, float, or complex. The value is parsed and stored under all the types that can represent the value. This simulates in a small amount of code the behavior of Go's ideal constants. 
``` type NumberNode struct { NodeType Pos IsInt bool // Number has an integral value. IsUint bool // Number has an unsigned integral value. IsFloat bool // Number has a floating-point value. IsComplex bool // Number is complex. Int64 int64 // The signed integer value. Uint64 uint64 // The unsigned integer value. Float64 float64 // The floating-point value. Complex128 complex128 // The complex value. Text string // The original textual representation from the input. // contains filtered or unexported fields } ``` ### func (\*NumberNode) Copy ``` func (n *NumberNode) Copy() Node ``` ### func (\*NumberNode) String ``` func (n *NumberNode) String() string ``` type PipeNode ------------- PipeNode holds a pipeline with optional declaration ``` type PipeNode struct { NodeType Pos Line int // The line number in the input. Deprecated: Kept for compatibility. IsAssign bool // The variables are being assigned, not declared; added in Go 1.11 Decl []*VariableNode // Variables in lexical order. Cmds []*CommandNode // The commands in lexical order. // contains filtered or unexported fields } ``` ### func (\*PipeNode) Copy ``` func (p *PipeNode) Copy() Node ``` ### func (\*PipeNode) CopyPipe ``` func (p *PipeNode) CopyPipe() *PipeNode ``` ### func (\*PipeNode) String ``` func (p *PipeNode) String() string ``` type Pos 1.1 ------------ Pos represents a byte position in the original input text from which this template was parsed. ``` type Pos int ``` ### func (Pos) Position 1.1 ``` func (p Pos) Position() Pos ``` type RangeNode -------------- RangeNode represents a {{range}} action and its commands. ``` type RangeNode struct { BranchNode } ``` ### func (\*RangeNode) Copy ``` func (r *RangeNode) Copy() Node ``` type StringNode --------------- StringNode holds a string constant. The value has been "unquoted". ``` type StringNode struct { NodeType Pos Quoted string // The original text of the string, with quotes. Text string // The string, after quote processing. // contains filtered or unexported fields } ``` ### func (\*StringNode) Copy ``` func (s *StringNode) Copy() Node ``` ### func (\*StringNode) String ``` func (s *StringNode) String() string ``` type TemplateNode ----------------- TemplateNode represents a {{template}} action. ``` type TemplateNode struct { NodeType Pos Line int // The line number in the input. Deprecated: Kept for compatibility. Name string // The name of the template (unquoted). Pipe *PipeNode // The command to evaluate as dot for the template. // contains filtered or unexported fields } ``` ### func (\*TemplateNode) Copy ``` func (t *TemplateNode) Copy() Node ``` ### func (\*TemplateNode) String ``` func (t *TemplateNode) String() string ``` type TextNode ------------- TextNode holds plain text. ``` type TextNode struct { NodeType Pos Text []byte // The text; may span newlines. // contains filtered or unexported fields } ``` ### func (\*TextNode) Copy ``` func (t *TextNode) Copy() Node ``` ### func (\*TextNode) String ``` func (t *TextNode) String() string ``` type Tree --------- Tree is the representation of a single parsed template. ``` type Tree struct { Name string // name of the template represented by the tree. ParseName string // name of the top-level template during parsing, for error messages; added in Go 1.1 Root *ListNode // top-level root of the tree. Mode Mode // parsing mode; added in Go 1.16 // contains filtered or unexported fields } ``` ### func New ``` func New(name string, funcs ...map[string]any) *Tree ``` New allocates a new parse tree with the given name. 
### func (\*Tree) Copy 1.2 ``` func (t *Tree) Copy() *Tree ``` Copy returns a copy of the Tree. Any parsing state is discarded. ### func (\*Tree) ErrorContext 1.1 ``` func (t *Tree) ErrorContext(n Node) (location, context string) ``` ErrorContext returns a textual representation of the location of the node in the input text. The receiver is only used when the node does not have a pointer to the tree inside, which can occur in old code. ### func (\*Tree) Parse ``` func (t *Tree) Parse(text, leftDelim, rightDelim string, treeSet map[string]*Tree, funcs ...map[string]any) (tree *Tree, err error) ``` Parse parses the template definition string to construct a representation of the template for execution. If either action delimiter string is empty, the default ("{{" or "}}") is used. Embedded template definitions are added to the treeSet map. type VariableNode ----------------- VariableNode holds a list of variable names, possibly with chained field accesses. The dollar sign is part of the (first) name. ``` type VariableNode struct { NodeType Pos Ident []string // Variable name and fields in lexical order. // contains filtered or unexported fields } ``` ### func (\*VariableNode) Copy ``` func (v *VariableNode) Copy() Node ``` ### func (\*VariableNode) String ``` func (v *VariableNode) String() string ``` type WithNode ------------- WithNode represents a {{with}} action and its commands. ``` type WithNode struct { BranchNode } ``` ### func (\*WithNode) Copy ``` func (w *WithNode) Copy() Node ```
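As the overview notes, this package is not intended for general use, but a minimal sketch of inspecting a parse tree through the exported API might look like this (the template text and name are illustrative):

```
package main

import (
	"fmt"
	"text/template/parse"
)

func main() {
	const text = `{{if .Ready}}Hello, {{.Name}}!{{end}}`

	// Parse returns a map from template name to *parse.Tree.
	trees, err := parse.Parse("greeting", text, "", "")
	if err != nil {
		panic(err)
	}

	// Root is a *ListNode; report the concrete type of each top-level node.
	for _, n := range trees["greeting"].Root.Nodes {
		fmt.Printf("%T: %s\n", n, n)
	}
}
```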
go Package scanner Package scanner ================ * `import "text/scanner"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package scanner provides a scanner and tokenizer for UTF-8-encoded text. It takes an io.Reader providing the source, which then can be tokenized through repeated calls to the Scan function. For compatibility with existing tools, the NUL character is not allowed. If the first character in the source is a UTF-8 encoded byte order mark (BOM), it is discarded. By default, a Scanner skips white space and Go comments and recognizes all literals as defined by the Go language specification. It may be customized to recognize only a subset of those literals and to recognize different identifier and white space characters. #### Example Code: ``` const src = ` // This is scanned code. if a > 10 { someParsable = text }` var s scanner.Scanner s.Init(strings.NewReader(src)) s.Filename = "example" for tok := s.Scan(); tok != scanner.EOF; tok = s.Scan() { fmt.Printf("%s: %s\n", s.Position, s.TokenText()) } ``` Output: ``` example:3:1: if example:3:4: a example:3:6: > example:3:8: 10 example:3:11: { example:4:2: someParsable example:4:15: = example:4:17: text example:5:1: } ``` #### Example (IsIdentRune) Code: ``` const src = "%var1 var2%" var s scanner.Scanner s.Init(strings.NewReader(src)) s.Filename = "default" for tok := s.Scan(); tok != scanner.EOF; tok = s.Scan() { fmt.Printf("%s: %s\n", s.Position, s.TokenText()) } fmt.Println() s.Init(strings.NewReader(src)) s.Filename = "percent" // treat leading '%' as part of an identifier s.IsIdentRune = func(ch rune, i int) bool { return ch == '%' && i == 0 || unicode.IsLetter(ch) || unicode.IsDigit(ch) && i > 0 } for tok := s.Scan(); tok != scanner.EOF; tok = s.Scan() { fmt.Printf("%s: %s\n", s.Position, s.TokenText()) } ``` Output: ``` default:1:1: % default:1:2: var1 default:1:7: var2 default:1:11: % percent:1:1: %var1 percent:1:7: var2 percent:1:11: % ``` #### Example (Mode) Code: ``` const src = ` // Comment begins at column 5. This line should not be included in the output. /* This multiline comment should be extracted in its entirety. */ ` var s scanner.Scanner s.Init(strings.NewReader(src)) s.Filename = "comments" s.Mode ^= scanner.SkipComments // don't skip comments for tok := s.Scan(); tok != scanner.EOF; tok = s.Scan() { txt := s.TokenText() if strings.HasPrefix(txt, "//") || strings.HasPrefix(txt, "/*") { fmt.Printf("%s: %s\n", s.Position, txt) } } ``` Output: ``` comments:2:5: // Comment begins at column 5. comments:6:1: /* This multiline comment should be extracted in its entirety. 
*/ ``` #### Example (Whitespace) Code: ``` // tab-separated values const src = `aa ab ac ad ba bb bc bd ca cb cc cd da db dc dd` var ( col, row int s scanner.Scanner tsv [4][4]string // large enough for example above ) s.Init(strings.NewReader(src)) s.Whitespace ^= 1<<'\t' | 1<<'\n' // don't skip tabs and new lines for tok := s.Scan(); tok != scanner.EOF; tok = s.Scan() { switch tok { case '\n': row++ col = 0 case '\t': col++ default: tsv[row][col] = s.TokenText() } } fmt.Print(tsv) ``` Output: ``` [[aa ab ac ad] [ba bb bc bd] [ca cb cc cd] [da db dc dd]] ``` Index ----- * [Constants](#pkg-constants) * [func TokenString(tok rune) string](#TokenString) * [type Position](#Position) * [func (pos \*Position) IsValid() bool](#Position.IsValid) * [func (pos Position) String() string](#Position.String) * [type Scanner](#Scanner) * [func (s \*Scanner) Init(src io.Reader) \*Scanner](#Scanner.Init) * [func (s \*Scanner) Next() rune](#Scanner.Next) * [func (s \*Scanner) Peek() rune](#Scanner.Peek) * [func (s \*Scanner) Pos() (pos Position)](#Scanner.Pos) * [func (s \*Scanner) Scan() rune](#Scanner.Scan) * [func (s \*Scanner) TokenText() string](#Scanner.TokenText) ### Examples [Package](#example_) [Package (IsIdentRune)](#example__isIdentRune) [Package (Mode)](#example__mode) [Package (Whitespace)](#example__whitespace) ### Package files scanner.go Constants --------- Predefined mode bits to control recognition of tokens. For instance, to configure a Scanner such that it only recognizes (Go) identifiers, integers, and skips comments, set the Scanner's Mode field to: ``` ScanIdents | ScanInts | SkipComments ``` With the exceptions of comments, which are skipped if SkipComments is set, unrecognized tokens are not ignored. Instead, the scanner simply returns the respective individual characters (or possibly sub-tokens). For instance, if the mode is ScanIdents (not ScanStrings), the string "foo" is scanned as the token sequence '"' Ident '"'. Use GoTokens to configure the Scanner such that it accepts all Go literal tokens including Go identifiers. Comments will be skipped. ``` const ( ScanIdents = 1 << -Ident ScanInts = 1 << -Int ScanFloats = 1 << -Float // includes Ints and hexadecimal floats ScanChars = 1 << -Char ScanStrings = 1 << -String ScanRawStrings = 1 << -RawString ScanComments = 1 << -Comment SkipComments = 1 << -skipComment // if set with ScanComments, comments become white space GoTokens = ScanIdents | ScanFloats | ScanChars | ScanStrings | ScanRawStrings | ScanComments | SkipComments ) ``` The result of Scan is one of these tokens or a Unicode character. ``` const ( EOF = -(iota + 1) Ident Int Float Char String RawString Comment ) ``` GoWhitespace is the default value for the Scanner's Whitespace field. Its value selects Go's white space characters. ``` const GoWhitespace = 1<<'\t' | 1<<'\n' | 1<<'\r' | 1<<' ' ``` func TokenString ---------------- ``` func TokenString(tok rune) string ``` TokenString returns a printable string for a token or Unicode character. type Position ------------- Position is a value that represents a source position. A position is valid if Line > 0. ``` type Position struct { Filename string // filename, if any Offset int // byte offset, starting at 0 Line int // line number, starting at 1 Column int // column number, starting at 1 (character count per line) } ``` ### func (\*Position) IsValid ``` func (pos *Position) IsValid() bool ``` IsValid reports whether the position is valid. 
### func (Position) String ``` func (pos Position) String() string ``` type Scanner ------------ A Scanner implements reading of Unicode characters and tokens from an io.Reader. ``` type Scanner struct { // Error is called for each error encountered. If no Error // function is set, the error is reported to os.Stderr. Error func(s *Scanner, msg string) // ErrorCount is incremented by one for each error encountered. ErrorCount int // The Mode field controls which tokens are recognized. For instance, // to recognize Ints, set the ScanInts bit in Mode. The field may be // changed at any time. Mode uint // The Whitespace field controls which characters are recognized // as white space. To recognize a character ch <= ' ' as white space, // set the ch'th bit in Whitespace (the Scanner's behavior is undefined // for values ch > ' '). The field may be changed at any time. Whitespace uint64 // IsIdentRune is a predicate controlling the characters accepted // as the ith rune in an identifier. The set of valid characters // must not intersect with the set of white space characters. // If no IsIdentRune function is set, regular Go identifiers are // accepted instead. The field may be changed at any time. IsIdentRune func(ch rune, i int) bool // Go 1.4 // Start position of most recently scanned token; set by Scan. // Calling Init or Next invalidates the position (Line == 0). // The Filename field is always left untouched by the Scanner. // If an error is reported (via Error) and Position is invalid, // the scanner is not inside a token. Call Pos to obtain an error // position in that case, or to obtain the position immediately // after the most recently scanned token. Position // contains filtered or unexported fields } ``` ### func (\*Scanner) Init ``` func (s *Scanner) Init(src io.Reader) *Scanner ``` Init initializes a Scanner with a new source and returns s. Error is set to nil, ErrorCount is set to 0, Mode is set to GoTokens, and Whitespace is set to GoWhitespace. ### func (\*Scanner) Next ``` func (s *Scanner) Next() rune ``` Next reads and returns the next Unicode character. It returns EOF at the end of the source. It reports a read error by calling s.Error, if not nil; otherwise it prints an error message to os.Stderr. Next does not update the Scanner's Position field; use Pos() to get the current position. ### func (\*Scanner) Peek ``` func (s *Scanner) Peek() rune ``` Peek returns the next Unicode character in the source without advancing the scanner. It returns EOF if the scanner's position is at the last character of the source. ### func (\*Scanner) Pos ``` func (s *Scanner) Pos() (pos Position) ``` Pos returns the position of the character immediately after the character or token returned by the last call to Next or Scan. Use the Scanner's Position field for the start position of the most recently scanned token. ### func (\*Scanner) Scan ``` func (s *Scanner) Scan() rune ``` Scan reads the next token or Unicode character from source and returns it. It only recognizes tokens t for which the respective Mode bit (1<<-t) is set. It returns EOF at the end of the source. It reports scanner errors (read and token errors) by calling s.Error, if not nil; otherwise it prints an error message to os.Stderr. ### func (\*Scanner) TokenText ``` func (s *Scanner) TokenText() string ``` TokenText returns the string corresponding to the most recently scanned token. Valid after calling Scan and in calls of Scanner.Error. 
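The mode-restriction behavior described under Constants can be sketched as follows: with only ScanIdents set, a quoted string is not recognized as a single String token, so the quote characters come back individually around an Ident token. This is a sketch, not an official example, and the input is illustrative:

```
package main

import (
	"fmt"
	"strings"
	"text/scanner"
)

func main() {
	const src = `"foo"`

	var s scanner.Scanner
	s.Init(strings.NewReader(src))
	s.Mode = scanner.ScanIdents // strings are not recognized as tokens

	for tok := s.Scan(); tok != scanner.EOF; tok = s.Scan() {
		fmt.Printf("%s: %s\n", scanner.TokenString(tok), s.TokenText())
	}
}
```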
go Package errors Package errors =============== * `import "errors"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package errors implements functions to manipulate errors. The New function creates errors whose only content is a text message. An error e wraps another error if e's type has one of the methods ``` Unwrap() error Unwrap() []error ``` If e.Unwrap() returns a non-nil error w or a slice containing w, then we say that e wraps w. A nil error returned from e.Unwrap() indicates that e does not wrap any error. It is invalid for an Unwrap method to return an []error containing a nil error value. An easy way to create wrapped errors is to call fmt.Errorf and apply the %w verb to the error argument: ``` wrapsErr := fmt.Errorf("... %w ...", ..., err, ...) ``` Successive unwrapping of an error creates a tree. The Is and As functions inspect an error's tree by examining first the error itself followed by the tree of each of its children in turn (pre-order, depth-first traversal). Is examines the tree of its first argument looking for an error that matches the second. It reports whether it finds a match. It should be used in preference to simple equality checks: ``` if errors.Is(err, fs.ErrExist) ``` is preferable to ``` if err == fs.ErrExist ``` because the former will succeed if err wraps fs.ErrExist. As examines the tree of its first argument looking for an error that can be assigned to its second argument, which must be a pointer. If it succeeds, it performs the assignment and returns true. Otherwise, it returns false. The form ``` var perr *fs.PathError if errors.As(err, &perr) { fmt.Println(perr.Path) } ``` is preferable to ``` if perr, ok := err.(*fs.PathError); ok { fmt.Println(perr.Path) } ``` because the former will succeed if err wraps an \*fs.PathError. #### Example Code: ``` package errors_test import ( "fmt" "time" ) // MyError is an error implementation that includes a time and message. type MyError struct { When time.Time What string } func (e MyError) Error() string { return fmt.Sprintf("%v: %v", e.When, e.What) } func oops() error { return MyError{ time.Date(1989, 3, 15, 22, 30, 0, 0, time.UTC), "the file system has gone away", } } func Example() { if err := oops(); err != nil { fmt.Println(err) } // Output: 1989-03-15 22:30:00 +0000 UTC: the file system has gone away } ``` Index ----- * [func As(err error, target any) bool](#As) * [func Is(err, target error) bool](#Is) * [func Join(errs ...error) error](#Join) * [func New(text string) error](#New) * [func Unwrap(err error) error](#Unwrap) ### Examples [Package](#example_) [As](#example_As) [Is](#example_Is) [Join](#example_Join) [New](#example_New) [New (Errorf)](#example_New_errorf) [Unwrap](#example_Unwrap) ### Package files errors.go join.go wrap.go func As 1.13 ------------ ``` func As(err error, target any) bool ``` As finds the first error in err's tree that matches target, and if one is found, sets target to that error value and returns true. Otherwise, it returns false. The tree consists of err itself, followed by the errors obtained by repeatedly calling Unwrap. When err wraps multiple errors, As examines err followed by a depth-first traversal of its children. An error matches target if the error's concrete value is assignable to the value pointed to by target, or if the error has a method As(interface{}) bool such that As(target) returns true. In the latter case, the As method is responsible for setting target. 
An error type might provide an As method so it can be treated as if it were a different error type. As panics if target is not a non-nil pointer to either a type that implements error, or to any interface type. #### Example Code: ``` if _, err := os.Open("non-existing"); err != nil { var pathError *fs.PathError if errors.As(err, &pathError) { fmt.Println("Failed at path:", pathError.Path) } else { fmt.Println(err) } } ``` Output: ``` Failed at path: non-existing ``` func Is 1.13 ------------ ``` func Is(err, target error) bool ``` Is reports whether any error in err's tree matches target. The tree consists of err itself, followed by the errors obtained by repeatedly calling Unwrap. When err wraps multiple errors, Is examines err followed by a depth-first traversal of its children. An error is considered to match a target if it is equal to that target or if it implements a method Is(error) bool such that Is(target) returns true. An error type might provide an Is method so it can be treated as equivalent to an existing error. For example, if MyError defines ``` func (m MyError) Is(target error) bool { return target == fs.ErrExist } ``` then Is(MyError{}, fs.ErrExist) returns true. See syscall.Errno.Is for an example in the standard library. An Is method should only shallowly compare err and the target and not call Unwrap on either. #### Example Code: ``` if _, err := os.Open("non-existing"); err != nil { if errors.Is(err, fs.ErrNotExist) { fmt.Println("file does not exist") } else { fmt.Println(err) } } ``` Output: ``` file does not exist ``` func Join 1.20 -------------- ``` func Join(errs ...error) error ``` Join returns an error that wraps the given errors. Any nil error values are discarded. Join returns nil if errs contains no non-nil values. The error formats as the concatenation of the strings obtained by calling the Error method of each element of errs, with a newline between each string. #### Example Code: ``` err1 := errors.New("err1") err2 := errors.New("err2") err := errors.Join(err1, err2) fmt.Println(err) if errors.Is(err, err1) { fmt.Println("err is err1") } if errors.Is(err, err2) { fmt.Println("err is err2") } ``` Output: ``` err1 err2 err is err1 err is err2 ``` func New -------- ``` func New(text string) error ``` New returns an error that formats as the given text. Each call to New returns a distinct error value even if the text is identical. #### Example Code: ``` err := errors.New("emit macho dwarf: elf header corrupted") if err != nil { fmt.Print(err) } ``` Output: ``` emit macho dwarf: elf header corrupted ``` #### Example (Errorf) The fmt package's Errorf function lets us use the package's formatting features to create descriptive error messages. Code: ``` const name, id = "bimmler", 17 err := fmt.Errorf("user %q (id %d) not found", name, id) if err != nil { fmt.Print(err) } ``` Output: ``` user "bimmler" (id 17) not found ``` func Unwrap 1.13 ---------------- ``` func Unwrap(err error) error ``` Unwrap returns the result of calling the Unwrap method on err, if err's type contains an Unwrap method returning error. Otherwise, Unwrap returns nil. Unwrap returns nil if the Unwrap method returns []error. 
#### Example Code: ``` err1 := errors.New("error1") err2 := fmt.Errorf("error2: [%w]", err1) fmt.Println(err2) fmt.Println(errors.Unwrap(err2)) // Output // error2: [error1] // error1 ``` go Package sync Package sync ============= * `import "sync"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) * [Subdirectories](#pkg-subdirectories) Overview -------- Package sync provides basic synchronization primitives such as mutual exclusion locks. Other than the Once and WaitGroup types, most are intended for use by low-level library routines. Higher-level synchronization is better done via channels and communication. Values containing the types defined in this package should not be copied. Index ----- * [type Cond](#Cond) * [func NewCond(l Locker) \*Cond](#NewCond) * [func (c \*Cond) Broadcast()](#Cond.Broadcast) * [func (c \*Cond) Signal()](#Cond.Signal) * [func (c \*Cond) Wait()](#Cond.Wait) * [type Locker](#Locker) * [type Map](#Map) * [func (m \*Map) CompareAndDelete(key, old any) (deleted bool)](#Map.CompareAndDelete) * [func (m \*Map) CompareAndSwap(key, old, new any) bool](#Map.CompareAndSwap) * [func (m \*Map) Delete(key any)](#Map.Delete) * [func (m \*Map) Load(key any) (value any, ok bool)](#Map.Load) * [func (m \*Map) LoadAndDelete(key any) (value any, loaded bool)](#Map.LoadAndDelete) * [func (m \*Map) LoadOrStore(key, value any) (actual any, loaded bool)](#Map.LoadOrStore) * [func (m \*Map) Range(f func(key, value any) bool)](#Map.Range) * [func (m \*Map) Store(key, value any)](#Map.Store) * [func (m \*Map) Swap(key, value any) (previous any, loaded bool)](#Map.Swap) * [type Mutex](#Mutex) * [func (m \*Mutex) Lock()](#Mutex.Lock) * [func (m \*Mutex) TryLock() bool](#Mutex.TryLock) * [func (m \*Mutex) Unlock()](#Mutex.Unlock) * [type Once](#Once) * [func (o \*Once) Do(f func())](#Once.Do) * [type Pool](#Pool) * [func (p \*Pool) Get() any](#Pool.Get) * [func (p \*Pool) Put(x any)](#Pool.Put) * [type RWMutex](#RWMutex) * [func (rw \*RWMutex) Lock()](#RWMutex.Lock) * [func (rw \*RWMutex) RLock()](#RWMutex.RLock) * [func (rw \*RWMutex) RLocker() Locker](#RWMutex.RLocker) * [func (rw \*RWMutex) RUnlock()](#RWMutex.RUnlock) * [func (rw \*RWMutex) TryLock() bool](#RWMutex.TryLock) * [func (rw \*RWMutex) TryRLock() bool](#RWMutex.TryRLock) * [func (rw \*RWMutex) Unlock()](#RWMutex.Unlock) * [type WaitGroup](#WaitGroup) * [func (wg \*WaitGroup) Add(delta int)](#WaitGroup.Add) * [func (wg \*WaitGroup) Done()](#WaitGroup.Done) * [func (wg \*WaitGroup) Wait()](#WaitGroup.Wait) ### Examples [Once](#example_Once) [Pool](#example_Pool) [WaitGroup](#example_WaitGroup) ### Package files cond.go map.go mutex.go once.go pool.go poolqueue.go runtime.go runtime2.go rwmutex.go waitgroup.go type Cond --------- Cond implements a condition variable, a rendezvous point for goroutines waiting for or announcing the occurrence of an event. Each Cond has an associated Locker L (often a \*Mutex or \*RWMutex), which must be held when changing the condition and when calling the Wait method. A Cond must not be copied after first use. In the terminology of the Go memory model, Cond arranges that a call to Broadcast or Signal “synchronizes before” any Wait call that it unblocks. For many simple use cases, users will be better off using channels than a Cond (Broadcast corresponds to closing a channel, and Signal corresponds to sending on a channel). 
For more on replacements for sync.Cond, see [Roberto Clapis's series on advanced concurrency patterns](https://blogtitle.github.io/categories/concurrency/), as well as [Bryan Mills's talk on concurrency patterns](https://drive.google.com/file/d/1nPdvhB0PutEJzdCq5ms6UI58dp50fcAN/view). ``` type Cond struct { // L is held while observing or changing the condition L Locker // contains filtered or unexported fields } ``` ### func NewCond ``` func NewCond(l Locker) *Cond ``` NewCond returns a new Cond with Locker l. ### func (\*Cond) Broadcast ``` func (c *Cond) Broadcast() ``` Broadcast wakes all goroutines waiting on c. It is allowed but not required for the caller to hold c.L during the call. ### func (\*Cond) Signal ``` func (c *Cond) Signal() ``` Signal wakes one goroutine waiting on c, if there is any. It is allowed but not required for the caller to hold c.L during the call. Signal() does not affect goroutine scheduling priority; if other goroutines are attempting to lock c.L, they may be awoken before a "waiting" goroutine. ### func (\*Cond) Wait ``` func (c *Cond) Wait() ``` Wait atomically unlocks c.L and suspends execution of the calling goroutine. After later resuming execution, Wait locks c.L before returning. Unlike in other systems, Wait cannot return unless awoken by Broadcast or Signal. Because c.L is not locked while Wait is waiting, the caller typically cannot assume that the condition is true when Wait returns. Instead, the caller should Wait in a loop: ``` c.L.Lock() for !condition() { c.Wait() } ... make use of condition ... c.L.Unlock() ``` type Locker ----------- A Locker represents an object that can be locked and unlocked. ``` type Locker interface { Lock() Unlock() } ``` type Map 1.9 ------------ Map is like a Go map[interface{}]interface{} but is safe for concurrent use by multiple goroutines without additional locking or coordination. Loads, stores, and deletes run in amortized constant time. The Map type is specialized. Most code should use a plain Go map instead, with separate locking or coordination, for better type safety and to make it easier to maintain other invariants along with the map content. The Map type is optimized for two common use cases: (1) when the entry for a given key is only ever written once but read many times, as in caches that only grow, or (2) when multiple goroutines read, write, and overwrite entries for disjoint sets of keys. In these two cases, use of a Map may significantly reduce lock contention compared to a Go map paired with a separate Mutex or RWMutex. The zero Map is empty and ready for use. A Map must not be copied after first use. In the terminology of the Go memory model, Map arranges that a write operation “synchronizes before” any read operation that observes the effect of the write, where read and write operations are defined as follows. Load, LoadAndDelete, LoadOrStore, Swap, CompareAndSwap, and CompareAndDelete are read operations; Delete, LoadAndDelete, Store, and Swap are write operations; LoadOrStore is a write operation when it returns loaded set to false; CompareAndSwap is a write operation when it returns swapped set to true; and CompareAndDelete is a write operation when it returns deleted set to true. ``` type Map struct { // contains filtered or unexported fields } ``` ### func (\*Map) CompareAndDelete 1.20 ``` func (m *Map) CompareAndDelete(key, old any) (deleted bool) ``` CompareAndDelete deletes the entry for key if its value is equal to old. The old value must be of a comparable type. 
If there is no current value for key in the map, CompareAndDelete returns false (even if the old value is the nil interface value). ### func (\*Map) CompareAndSwap 1.20 ``` func (m *Map) CompareAndSwap(key, old, new any) bool ``` CompareAndSwap swaps the old and new values for key if the value stored in the map is equal to old. The old value must be of a comparable type. ### func (\*Map) Delete 1.9 ``` func (m *Map) Delete(key any) ``` Delete deletes the value for a key. ### func (\*Map) Load 1.9 ``` func (m *Map) Load(key any) (value any, ok bool) ``` Load returns the value stored in the map for a key, or nil if no value is present. The ok result indicates whether value was found in the map. ### func (\*Map) LoadAndDelete 1.15 ``` func (m *Map) LoadAndDelete(key any) (value any, loaded bool) ``` LoadAndDelete deletes the value for a key, returning the previous value if any. The loaded result reports whether the key was present. ### func (\*Map) LoadOrStore 1.9 ``` func (m *Map) LoadOrStore(key, value any) (actual any, loaded bool) ``` LoadOrStore returns the existing value for the key if present. Otherwise, it stores and returns the given value. The loaded result is true if the value was loaded, false if stored. ### func (\*Map) Range 1.9 ``` func (m *Map) Range(f func(key, value any) bool) ``` Range calls f sequentially for each key and value present in the map. If f returns false, range stops the iteration. Range does not necessarily correspond to any consistent snapshot of the Map's contents: no key will be visited more than once, but if the value for any key is stored or deleted concurrently (including by f), Range may reflect any mapping for that key from any point during the Range call. Range does not block other methods on the receiver; even f itself may call any method on m. Range may be O(N) with the number of elements in the map even if f returns false after a constant number of calls. ### func (\*Map) Store 1.9 ``` func (m *Map) Store(key, value any) ``` Store sets the value for a key. ### func (\*Map) Swap 1.20 ``` func (m *Map) Swap(key, value any) (previous any, loaded bool) ``` Swap swaps the value for a key and returns the previous value if any. The loaded result reports whether the key was present. type Mutex ---------- A Mutex is a mutual exclusion lock. The zero value for a Mutex is an unlocked mutex. A Mutex must not be copied after first use. In the terminology of the Go memory model, the n'th call to Unlock “synchronizes before” the m'th call to Lock for any n < m. A successful call to TryLock is equivalent to a call to Lock. A failed call to TryLock does not establish any “synchronizes before” relation at all. ``` type Mutex struct { // contains filtered or unexported fields } ``` ### func (\*Mutex) Lock ``` func (m *Mutex) Lock() ``` Lock locks m. If the lock is already in use, the calling goroutine blocks until the mutex is available. ### func (\*Mutex) TryLock 1.18 ``` func (m *Mutex) TryLock() bool ``` TryLock tries to lock m and reports whether it succeeded. Note that while correct uses of TryLock do exist, they are rare, and use of TryLock is often a sign of a deeper problem in a particular use of mutexes. ### func (\*Mutex) Unlock ``` func (m *Mutex) Unlock() ``` Unlock unlocks m. It is a run-time error if m is not locked on entry to Unlock. A locked Mutex is not associated with a particular goroutine. It is allowed for one goroutine to lock a Mutex and then arrange for another goroutine to unlock it. 
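A minimal sketch of guarding shared state with a Mutex (the counter type and its field names are illustrative, not part of the package):

```
package main

import (
	"fmt"
	"sync"
)

// counter is a hypothetical type whose map must only be accessed while mu is held.
type counter struct {
	mu sync.Mutex
	n  map[string]int
}

func (c *counter) inc(key string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.n[key]++
}

func main() {
	c := &counter{n: make(map[string]int)}
	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			c.inc("hits")
		}()
	}
	wg.Wait()
	fmt.Println(c.n["hits"]) // 100
}
```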
type Once --------- Once is an object that will perform exactly one action. A Once must not be copied after first use. In the terminology of the Go memory model, the return from f “synchronizes before” the return from any call of once.Do(f). ``` type Once struct { // contains filtered or unexported fields } ``` #### Example Code: ``` var once sync.Once onceBody := func() { fmt.Println("Only once") } done := make(chan bool) for i := 0; i < 10; i++ { go func() { once.Do(onceBody) done <- true }() } for i := 0; i < 10; i++ { <-done } ``` Output: ``` Only once ``` ### func (\*Once) Do ``` func (o *Once) Do(f func()) ``` Do calls the function f if and only if Do is being called for the first time for this instance of Once. In other words, given ``` var once Once ``` if once.Do(f) is called multiple times, only the first call will invoke f, even if f has a different value in each invocation. A new instance of Once is required for each function to execute. Do is intended for initialization that must be run exactly once. Since f is niladic, it may be necessary to use a function literal to capture the arguments to a function to be invoked by Do: ``` config.once.Do(func() { config.init(filename) }) ``` Because no call to Do returns until the one call to f returns, if f causes Do to be called, it will deadlock. If f panics, Do considers it to have returned; future calls of Do return without calling f. type Pool 1.3 ------------- A Pool is a set of temporary objects that may be individually saved and retrieved. Any item stored in the Pool may be removed automatically at any time without notification. If the Pool holds the only reference when this happens, the item might be deallocated. A Pool is safe for use by multiple goroutines simultaneously. Pool's purpose is to cache allocated but unused items for later reuse, relieving pressure on the garbage collector. That is, it makes it easy to build efficient, thread-safe free lists. However, it is not suitable for all free lists. An appropriate use of a Pool is to manage a group of temporary items silently shared among and potentially reused by concurrent independent clients of a package. Pool provides a way to amortize allocation overhead across many clients. An example of good use of a Pool is in the fmt package, which maintains a dynamically-sized store of temporary output buffers. The store scales under load (when many goroutines are actively printing) and shrinks when quiescent. On the other hand, a free list maintained as part of a short-lived object is not a suitable use for a Pool, since the overhead does not amortize well in that scenario. It is more efficient to have such objects implement their own free list. A Pool must not be copied after first use. In the terminology of the Go memory model, a call to Put(x) “synchronizes before” a call to Get returning that same value x. Similarly, a call to New returning x “synchronizes before” a call to Get returning that same value x. ``` type Pool struct { // New optionally specifies a function to generate // a value when Get would otherwise return nil. // It may not be changed concurrently with calls to Get. 
New func() any // contains filtered or unexported fields } ``` #### Example Code: ``` package sync_test import ( "bytes" "io" "os" "sync" "time" ) var bufPool = sync.Pool{ New: func() any { // The Pool's New function should generally only return pointer // types, since a pointer can be put into the return interface // value without an allocation: return new(bytes.Buffer) }, } // timeNow is a fake version of time.Now for tests. func timeNow() time.Time { return time.Unix(1136214245, 0) } func Log(w io.Writer, key, val string) { b := bufPool.Get().(*bytes.Buffer) b.Reset() // Replace this with time.Now() in a real logger. b.WriteString(timeNow().UTC().Format(time.RFC3339)) b.WriteByte(' ') b.WriteString(key) b.WriteByte('=') b.WriteString(val) w.Write(b.Bytes()) bufPool.Put(b) } func ExamplePool() { Log(os.Stdout, "path", "/search?q=flowers") // Output: 2006-01-02T15:04:05Z path=/search?q=flowers } ``` ### func (\*Pool) Get 1.3 ``` func (p *Pool) Get() any ``` Get selects an arbitrary item from the Pool, removes it from the Pool, and returns it to the caller. Get may choose to ignore the pool and treat it as empty. Callers should not assume any relation between values passed to Put and the values returned by Get. If Get would otherwise return nil and p.New is non-nil, Get returns the result of calling p.New. ### func (\*Pool) Put 1.3 ``` func (p *Pool) Put(x any) ``` Put adds x to the pool. type RWMutex ------------ A RWMutex is a reader/writer mutual exclusion lock. The lock can be held by an arbitrary number of readers or a single writer. The zero value for a RWMutex is an unlocked mutex. A RWMutex must not be copied after first use. If a goroutine holds a RWMutex for reading and another goroutine might call Lock, no goroutine should expect to be able to acquire a read lock until the initial read lock is released. In particular, this prohibits recursive read locking. This is to ensure that the lock eventually becomes available; a blocked Lock call excludes new readers from acquiring the lock. In the terminology of the Go memory model, the n'th call to Unlock “synchronizes before” the m'th call to Lock for any n < m, just as for Mutex. For any call to RLock, there exists an n such that the n'th call to Unlock “synchronizes before” that call to RLock, and the corresponding call to RUnlock “synchronizes before” the n+1'th call to Lock. ``` type RWMutex struct { // contains filtered or unexported fields } ``` ### func (\*RWMutex) Lock ``` func (rw *RWMutex) Lock() ``` Lock locks rw for writing. If the lock is already locked for reading or writing, Lock blocks until the lock is available. ### func (\*RWMutex) RLock ``` func (rw *RWMutex) RLock() ``` RLock locks rw for reading. It should not be used for recursive read locking; a blocked Lock call excludes new readers from acquiring the lock. See the documentation on the RWMutex type. ### func (\*RWMutex) RLocker ``` func (rw *RWMutex) RLocker() Locker ``` RLocker returns a Locker interface that implements the Lock and Unlock methods by calling rw.RLock and rw.RUnlock. ### func (\*RWMutex) RUnlock ``` func (rw *RWMutex) RUnlock() ``` RUnlock undoes a single RLock call; it does not affect other simultaneous readers. It is a run-time error if rw is not locked for reading on entry to RUnlock. ### func (\*RWMutex) TryLock 1.18 ``` func (rw *RWMutex) TryLock() bool ``` TryLock tries to lock rw for writing and reports whether it succeeded. 
Note that while correct uses of TryLock do exist, they are rare, and use of TryLock is often a sign of a deeper problem in a particular use of mutexes. ### func (\*RWMutex) TryRLock 1.18 ``` func (rw *RWMutex) TryRLock() bool ``` TryRLock tries to lock rw for reading and reports whether it succeeded. Note that while correct uses of TryRLock do exist, they are rare, and use of TryRLock is often a sign of a deeper problem in a particular use of mutexes. ### func (\*RWMutex) Unlock ``` func (rw *RWMutex) Unlock() ``` Unlock unlocks rw for writing. It is a run-time error if rw is not locked for writing on entry to Unlock. As with Mutexes, a locked RWMutex is not associated with a particular goroutine. One goroutine may RLock (Lock) a RWMutex and then arrange for another goroutine to RUnlock (Unlock) it. type WaitGroup -------------- A WaitGroup waits for a collection of goroutines to finish. The main goroutine calls Add to set the number of goroutines to wait for. Then each of the goroutines runs and calls Done when finished. At the same time, Wait can be used to block until all goroutines have finished. A WaitGroup must not be copied after first use. In the terminology of the Go memory model, a call to Done “synchronizes before” the return of any Wait call that it unblocks. ``` type WaitGroup struct { // contains filtered or unexported fields } ``` #### Example This example fetches several URLs concurrently, using a WaitGroup to block until all the fetches are complete. Code: ``` var wg sync.WaitGroup var urls = []string{ "http://www.golang.org/", "http://www.google.com/", "http://www.example.com/", } for _, url := range urls { // Increment the WaitGroup counter. wg.Add(1) // Launch a goroutine to fetch the URL. go func(url string) { // Decrement the counter when the goroutine completes. defer wg.Done() // Fetch the URL. http.Get(url) }(url) } // Wait for all HTTP fetches to complete. wg.Wait() ``` ### func (\*WaitGroup) Add ``` func (wg *WaitGroup) Add(delta int) ``` Add adds delta, which may be negative, to the WaitGroup counter. If the counter becomes zero, all goroutines blocked on Wait are released. If the counter goes negative, Add panics. Note that calls with a positive delta that occur when the counter is zero must happen before a Wait. Calls with a negative delta, or calls with a positive delta that start when the counter is greater than zero, may happen at any time. Typically this means the calls to Add should execute before the statement creating the goroutine or other event to be waited for. If a WaitGroup is reused to wait for several independent sets of events, new Add calls must happen after all previous Wait calls have returned. See the WaitGroup example. ### func (\*WaitGroup) Done ``` func (wg *WaitGroup) Done() ``` Done decrements the WaitGroup counter by one. ### func (\*WaitGroup) Wait ``` func (wg *WaitGroup) Wait() ``` Wait blocks until the WaitGroup counter is zero. Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [atomic](atomic/index) | Package atomic provides low-level atomic memory primitives useful for implementing synchronization algorithms. |
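The RWMutex semantics described above (any number of readers or a single writer) can be sketched with a simple read-mostly cache; the type and method names here are illustrative only:

```
package main

import (
	"fmt"
	"sync"
)

// cache is a hypothetical read-mostly map guarded by an RWMutex.
type cache struct {
	mu sync.RWMutex
	m  map[string]string
}

func (c *cache) get(k string) (string, bool) {
	c.mu.RLock() // many readers may hold the lock at the same time
	defer c.mu.RUnlock()
	v, ok := c.m[k]
	return v, ok
}

func (c *cache) set(k, v string) {
	c.mu.Lock() // writers get exclusive access
	defer c.mu.Unlock()
	c.m[k] = v
}

func main() {
	c := &cache{m: make(map[string]string)}
	c.set("greeting", "hello")
	if v, ok := c.get("greeting"); ok {
		fmt.Println(v)
	}
}
```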
go Package atomic Package atomic =============== * `import "sync/atomic"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package atomic provides low-level atomic memory primitives useful for implementing synchronization algorithms. These functions require great care to be used correctly. Except for special, low-level applications, synchronization is better done with channels or the facilities of the sync package. Share memory by communicating; don't communicate by sharing memory. The swap operation, implemented by the SwapT functions, is the atomic equivalent of: ``` old = *addr *addr = new return old ``` The compare-and-swap operation, implemented by the CompareAndSwapT functions, is the atomic equivalent of: ``` if *addr == old { *addr = new return true } return false ``` The add operation, implemented by the AddT functions, is the atomic equivalent of: ``` *addr += delta return *addr ``` The load and store operations, implemented by the LoadT and StoreT functions, are the atomic equivalents of "return \*addr" and "\*addr = val". In the terminology of the Go memory model, if the effect of an atomic operation A is observed by atomic operation B, then A “synchronizes before” B. Additionally, all the atomic operations executed in a program behave as though executed in some sequentially consistent order. This definition provides the same semantics as C++'s sequentially consistent atomics and Java's volatile variables. Index ----- * [func AddInt32(addr \*int32, delta int32) (new int32)](#AddInt32) * [func AddInt64(addr \*int64, delta int64) (new int64)](#AddInt64) * [func AddUint32(addr \*uint32, delta uint32) (new uint32)](#AddUint32) * [func AddUint64(addr \*uint64, delta uint64) (new uint64)](#AddUint64) * [func AddUintptr(addr \*uintptr, delta uintptr) (new uintptr)](#AddUintptr) * [func CompareAndSwapInt32(addr \*int32, old, new int32) (swapped bool)](#CompareAndSwapInt32) * [func CompareAndSwapInt64(addr \*int64, old, new int64) (swapped bool)](#CompareAndSwapInt64) * [func CompareAndSwapPointer(addr \*unsafe.Pointer, old, new unsafe.Pointer) (swapped bool)](#CompareAndSwapPointer) * [func CompareAndSwapUint32(addr \*uint32, old, new uint32) (swapped bool)](#CompareAndSwapUint32) * [func CompareAndSwapUint64(addr \*uint64, old, new uint64) (swapped bool)](#CompareAndSwapUint64) * [func CompareAndSwapUintptr(addr \*uintptr, old, new uintptr) (swapped bool)](#CompareAndSwapUintptr) * [func LoadInt32(addr \*int32) (val int32)](#LoadInt32) * [func LoadInt64(addr \*int64) (val int64)](#LoadInt64) * [func LoadPointer(addr \*unsafe.Pointer) (val unsafe.Pointer)](#LoadPointer) * [func LoadUint32(addr \*uint32) (val uint32)](#LoadUint32) * [func LoadUint64(addr \*uint64) (val uint64)](#LoadUint64) * [func LoadUintptr(addr \*uintptr) (val uintptr)](#LoadUintptr) * [func StoreInt32(addr \*int32, val int32)](#StoreInt32) * [func StoreInt64(addr \*int64, val int64)](#StoreInt64) * [func StorePointer(addr \*unsafe.Pointer, val unsafe.Pointer)](#StorePointer) * [func StoreUint32(addr \*uint32, val uint32)](#StoreUint32) * [func StoreUint64(addr \*uint64, val uint64)](#StoreUint64) * [func StoreUintptr(addr \*uintptr, val uintptr)](#StoreUintptr) * [func SwapInt32(addr \*int32, new int32) (old int32)](#SwapInt32) * [func SwapInt64(addr \*int64, new int64) (old int64)](#SwapInt64) * [func SwapPointer(addr \*unsafe.Pointer, new unsafe.Pointer) (old unsafe.Pointer)](#SwapPointer) * [func SwapUint32(addr \*uint32, new uint32) (old uint32)](#SwapUint32) * 
[func SwapUint64(addr \*uint64, new uint64) (old uint64)](#SwapUint64) * [func SwapUintptr(addr \*uintptr, new uintptr) (old uintptr)](#SwapUintptr) * [type Bool](#Bool) * [func (x \*Bool) CompareAndSwap(old, new bool) (swapped bool)](#Bool.CompareAndSwap) * [func (x \*Bool) Load() bool](#Bool.Load) * [func (x \*Bool) Store(val bool)](#Bool.Store) * [func (x \*Bool) Swap(new bool) (old bool)](#Bool.Swap) * [type Int32](#Int32) * [func (x \*Int32) Add(delta int32) (new int32)](#Int32.Add) * [func (x \*Int32) CompareAndSwap(old, new int32) (swapped bool)](#Int32.CompareAndSwap) * [func (x \*Int32) Load() int32](#Int32.Load) * [func (x \*Int32) Store(val int32)](#Int32.Store) * [func (x \*Int32) Swap(new int32) (old int32)](#Int32.Swap) * [type Int64](#Int64) * [func (x \*Int64) Add(delta int64) (new int64)](#Int64.Add) * [func (x \*Int64) CompareAndSwap(old, new int64) (swapped bool)](#Int64.CompareAndSwap) * [func (x \*Int64) Load() int64](#Int64.Load) * [func (x \*Int64) Store(val int64)](#Int64.Store) * [func (x \*Int64) Swap(new int64) (old int64)](#Int64.Swap) * [type Pointer](#Pointer) * [func (x \*Pointer[T]) CompareAndSwap(old, new \*T) (swapped bool)](#Pointer.CompareAndSwap) * [func (x \*Pointer[T]) Load() \*T](#Pointer.Load) * [func (x \*Pointer[T]) Store(val \*T)](#Pointer.Store) * [func (x \*Pointer[T]) Swap(new \*T) (old \*T)](#Pointer.Swap) * [type Uint32](#Uint32) * [func (x \*Uint32) Add(delta uint32) (new uint32)](#Uint32.Add) * [func (x \*Uint32) CompareAndSwap(old, new uint32) (swapped bool)](#Uint32.CompareAndSwap) * [func (x \*Uint32) Load() uint32](#Uint32.Load) * [func (x \*Uint32) Store(val uint32)](#Uint32.Store) * [func (x \*Uint32) Swap(new uint32) (old uint32)](#Uint32.Swap) * [type Uint64](#Uint64) * [func (x \*Uint64) Add(delta uint64) (new uint64)](#Uint64.Add) * [func (x \*Uint64) CompareAndSwap(old, new uint64) (swapped bool)](#Uint64.CompareAndSwap) * [func (x \*Uint64) Load() uint64](#Uint64.Load) * [func (x \*Uint64) Store(val uint64)](#Uint64.Store) * [func (x \*Uint64) Swap(new uint64) (old uint64)](#Uint64.Swap) * [type Uintptr](#Uintptr) * [func (x \*Uintptr) Add(delta uintptr) (new uintptr)](#Uintptr.Add) * [func (x \*Uintptr) CompareAndSwap(old, new uintptr) (swapped bool)](#Uintptr.CompareAndSwap) * [func (x \*Uintptr) Load() uintptr](#Uintptr.Load) * [func (x \*Uintptr) Store(val uintptr)](#Uintptr.Store) * [func (x \*Uintptr) Swap(new uintptr) (old uintptr)](#Uintptr.Swap) * [type Value](#Value) * [func (v \*Value) CompareAndSwap(old, new any) (swapped bool)](#Value.CompareAndSwap) * [func (v \*Value) Load() (val any)](#Value.Load) * [func (v \*Value) Store(val any)](#Value.Store) * [func (v \*Value) Swap(new any) (old any)](#Value.Swap) * [Bugs](#pkg-note-BUG) ### Examples [Value (Config)](#example_Value_config) [Value (ReadMostly)](#example_Value_readMostly) ### Package files doc.go type.go value.go func AddInt32 ------------- ``` func AddInt32(addr *int32, delta int32) (new int32) ``` AddInt32 atomically adds delta to \*addr and returns the new value. Consider using the more ergonomic and less error-prone [Int32.Add](#Int32.Add) instead. func AddInt64 ------------- ``` func AddInt64(addr *int64, delta int64) (new int64) ``` AddInt64 atomically adds delta to \*addr and returns the new value. Consider using the more ergonomic and less error-prone [Int64.Add](#Int64.Add) instead (particularly if you target 32-bit platforms; see the bugs section). 
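The "Consider using ..." notes above recommend the typed values added in Go 1.19 over the package-level functions. As a minimal sketch of the difference (the counter variables here are invented for illustration), both forms below are equivalent atomic increments; the typed form carries its atomicity in the type rather than in how the caller passes a pointer:

```
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

func main() {
	var hits int64         // used with the package-level functions
	var hits2 atomic.Int64 // Go 1.19+ typed form

	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			atomic.AddInt64(&hits, 1) // function form: pass the address explicitly
			hits2.Add(1)              // method form: atomicity is part of the type
		}()
	}
	wg.Wait()
	fmt.Println(atomic.LoadInt64(&hits), hits2.Load()) // 100 100
}
```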
func AddUint32 -------------- ``` func AddUint32(addr *uint32, delta uint32) (new uint32) ``` AddUint32 atomically adds delta to \*addr and returns the new value. To subtract a signed positive constant value c from x, do AddUint32(&x, ^uint32(c-1)). In particular, to decrement x, do AddUint32(&x, ^uint32(0)). Consider using the more ergonomic and less error-prone [Uint32.Add](#Uint32.Add) instead. func AddUint64 -------------- ``` func AddUint64(addr *uint64, delta uint64) (new uint64) ``` AddUint64 atomically adds delta to \*addr and returns the new value. To subtract a signed positive constant value c from x, do AddUint64(&x, ^uint64(c-1)). In particular, to decrement x, do AddUint64(&x, ^uint64(0)). Consider using the more ergonomic and less error-prone [Uint64.Add](#Uint64.Add) instead (particularly if you target 32-bit platforms; see the bugs section). func AddUintptr --------------- ``` func AddUintptr(addr *uintptr, delta uintptr) (new uintptr) ``` AddUintptr atomically adds delta to \*addr and returns the new value. Consider using the more ergonomic and less error-prone [Uintptr.Add](#Uintptr.Add) instead. func CompareAndSwapInt32 ------------------------ ``` func CompareAndSwapInt32(addr *int32, old, new int32) (swapped bool) ``` CompareAndSwapInt32 executes the compare-and-swap operation for an int32 value. Consider using the more ergonomic and less error-prone [Int32.CompareAndSwap](#Int32.CompareAndSwap) instead. func CompareAndSwapInt64 ------------------------ ``` func CompareAndSwapInt64(addr *int64, old, new int64) (swapped bool) ``` CompareAndSwapInt64 executes the compare-and-swap operation for an int64 value. Consider using the more ergonomic and less error-prone [Int64.CompareAndSwap](#Int64.CompareAndSwap) instead (particularly if you target 32-bit platforms; see the bugs section). func CompareAndSwapPointer -------------------------- ``` func CompareAndSwapPointer(addr *unsafe.Pointer, old, new unsafe.Pointer) (swapped bool) ``` CompareAndSwapPointer executes the compare-and-swap operation for an unsafe.Pointer value. Consider using the more ergonomic and less error-prone [Pointer.CompareAndSwap](#Pointer.CompareAndSwap) instead. func CompareAndSwapUint32 ------------------------- ``` func CompareAndSwapUint32(addr *uint32, old, new uint32) (swapped bool) ``` CompareAndSwapUint32 executes the compare-and-swap operation for a uint32 value. Consider using the more ergonomic and less error-prone [Uint32.CompareAndSwap](#Uint32.CompareAndSwap) instead. func CompareAndSwapUint64 ------------------------- ``` func CompareAndSwapUint64(addr *uint64, old, new uint64) (swapped bool) ``` CompareAndSwapUint64 executes the compare-and-swap operation for a uint64 value. Consider using the more ergonomic and less error-prone [Uint64.CompareAndSwap](#Uint64.CompareAndSwap) instead (particularly if you target 32-bit platforms; see the bugs section). func CompareAndSwapUintptr -------------------------- ``` func CompareAndSwapUintptr(addr *uintptr, old, new uintptr) (swapped bool) ``` CompareAndSwapUintptr executes the compare-and-swap operation for a uintptr value. Consider using the more ergonomic and less error-prone [Uintptr.CompareAndSwap](#Uintptr.CompareAndSwap) instead. func LoadInt32 -------------- ``` func LoadInt32(addr *int32) (val int32) ``` LoadInt32 atomically loads \*addr. Consider using the more ergonomic and less error-prone [Int32.Load](#Int32.Load) instead.
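As a worked sketch of the subtraction idiom above and of a typical compare-and-swap retry loop (the variable n is made up for illustration; it is not part of the package):

```
package main

import (
	"fmt"
	"sync/atomic"
)

func main() {
	var n uint32 = 10

	// Subtract c == 3 by adding its two's complement, ^uint32(c-1).
	atomic.AddUint32(&n, ^uint32(3-1)) // n == 7
	// Decrement by one by adding ^uint32(0), the all-ones pattern (-1 mod 2^32).
	atomic.AddUint32(&n, ^uint32(0)) // n == 6

	// A typical compare-and-swap retry loop: atomically double n, retrying if
	// another goroutine changed it between the Load and the CompareAndSwap.
	for {
		old := atomic.LoadUint32(&n)
		if atomic.CompareAndSwapUint32(&n, old, old*2) {
			break
		}
	}
	fmt.Println(atomic.LoadUint32(&n)) // 12
}
```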
func LoadInt64 -------------- ``` func LoadInt64(addr *int64) (val int64) ``` LoadInt64 atomically loads \*addr. Consider using the more ergonomic and less error-prone [Int64.Load](#Int64.Load) instead (particularly if you target 32-bit platforms; see the bugs section). func LoadPointer ---------------- ``` func LoadPointer(addr *unsafe.Pointer) (val unsafe.Pointer) ``` LoadPointer atomically loads \*addr. Consider using the more ergonomic and less error-prone [Pointer.Load](#Pointer.Load) instead. func LoadUint32 --------------- ``` func LoadUint32(addr *uint32) (val uint32) ``` LoadUint32 atomically loads \*addr. Consider using the more ergonomic and less error-prone [Uint32.Load](#Uint32.Load) instead. func LoadUint64 --------------- ``` func LoadUint64(addr *uint64) (val uint64) ``` LoadUint64 atomically loads \*addr. Consider using the more ergonomic and less error-prone [Uint64.Load](#Uint64.Load) instead (particularly if you target 32-bit platforms; see the bugs section). func LoadUintptr ---------------- ``` func LoadUintptr(addr *uintptr) (val uintptr) ``` LoadUintptr atomically loads \*addr. Consider using the more ergonomic and less error-prone [Uintptr.Load](#Uintptr.Load) instead. func StoreInt32 --------------- ``` func StoreInt32(addr *int32, val int32) ``` StoreInt32 atomically stores val into \*addr. Consider using the more ergonomic and less error-prone [Int32.Store](#Int32.Store) instead. func StoreInt64 --------------- ``` func StoreInt64(addr *int64, val int64) ``` StoreInt64 atomically stores val into \*addr. Consider using the more ergonomic and less error-prone [Int64.Store](#Int64.Store) instead (particularly if you target 32-bit platforms; see the bugs section). func StorePointer ----------------- ``` func StorePointer(addr *unsafe.Pointer, val unsafe.Pointer) ``` StorePointer atomically stores val into \*addr. Consider using the more ergonomic and less error-prone [Pointer.Store](#Pointer.Store) instead. func StoreUint32 ---------------- ``` func StoreUint32(addr *uint32, val uint32) ``` StoreUint32 atomically stores val into \*addr. Consider using the more ergonomic and less error-prone [Uint32.Store](#Uint32.Store) instead. func StoreUint64 ---------------- ``` func StoreUint64(addr *uint64, val uint64) ``` StoreUint64 atomically stores val into \*addr. Consider using the more ergonomic and less error-prone [Uint64.Store](#Uint64.Store) instead (particularly if you target 32-bit platforms; see the bugs section). func StoreUintptr ----------------- ``` func StoreUintptr(addr *uintptr, val uintptr) ``` StoreUintptr atomically stores val into \*addr. Consider using the more ergonomic and less error-prone [Uintptr.Store](#Uintptr.Store) instead. func SwapInt32 1.2 ------------------ ``` func SwapInt32(addr *int32, new int32) (old int32) ``` SwapInt32 atomically stores new into \*addr and returns the previous \*addr value. Consider using the more ergonomic and less error-prone [Int32.Swap](#Int32.Swap) instead. func SwapInt64 1.2 ------------------ ``` func SwapInt64(addr *int64, new int64) (old int64) ``` SwapInt64 atomically stores new into \*addr and returns the previous \*addr value. Consider using the more ergonomic and less error-prone [Int64.Swap](#Int64.Swap) instead (particularly if you target 32-bit platforms; see the bugs section). func SwapPointer 1.2 -------------------- ``` func SwapPointer(addr *unsafe.Pointer, new unsafe.Pointer) (old unsafe.Pointer) ``` SwapPointer atomically stores new into \*addr and returns the previous \*addr value. 
Consider using the more ergonomic and less error-prone [Pointer.Swap](#Pointer.Swap) instead. func SwapUint32 1.2 ------------------- ``` func SwapUint32(addr *uint32, new uint32) (old uint32) ``` SwapUint32 atomically stores new into \*addr and returns the previous \*addr value. Consider using the more ergonomic and less error-prone [Uint32.Swap](#Uint32.Swap) instead. func SwapUint64 1.2 ------------------- ``` func SwapUint64(addr *uint64, new uint64) (old uint64) ``` SwapUint64 atomically stores new into \*addr and returns the previous \*addr value. Consider using the more ergonomic and less error-prone [Uint64.Swap](#Uint64.Swap) instead (particularly if you target 32-bit platforms; see the bugs section). func SwapUintptr 1.2 -------------------- ``` func SwapUintptr(addr *uintptr, new uintptr) (old uintptr) ``` SwapUintptr atomically stores new into \*addr and returns the previous \*addr value. Consider using the more ergonomic and less error-prone [Uintptr.Swap](#Uintptr.Swap) instead. type Bool 1.19 -------------- A Bool is an atomic boolean value. The zero value is false. ``` type Bool struct { // contains filtered or unexported fields } ``` ### func (\*Bool) CompareAndSwap 1.19 ``` func (x *Bool) CompareAndSwap(old, new bool) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for the boolean value x. ### func (\*Bool) Load 1.19 ``` func (x *Bool) Load() bool ``` Load atomically loads and returns the value stored in x. ### func (\*Bool) Store 1.19 ``` func (x *Bool) Store(val bool) ``` Store atomically stores val into x. ### func (\*Bool) Swap 1.19 ``` func (x *Bool) Swap(new bool) (old bool) ``` Swap atomically stores new into x and returns the previous value. type Int32 1.19 --------------- An Int32 is an atomic int32. The zero value is zero. ``` type Int32 struct { // contains filtered or unexported fields } ``` ### func (\*Int32) Add 1.19 ``` func (x *Int32) Add(delta int32) (new int32) ``` Add atomically adds delta to x and returns the new value. ### func (\*Int32) CompareAndSwap 1.19 ``` func (x *Int32) CompareAndSwap(old, new int32) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for x. ### func (\*Int32) Load 1.19 ``` func (x *Int32) Load() int32 ``` Load atomically loads and returns the value stored in x. ### func (\*Int32) Store 1.19 ``` func (x *Int32) Store(val int32) ``` Store atomically stores val into x. ### func (\*Int32) Swap 1.19 ``` func (x *Int32) Swap(new int32) (old int32) ``` Swap atomically stores new into x and returns the previous value. type Int64 1.19 --------------- An Int64 is an atomic int64. The zero value is zero. ``` type Int64 struct { // contains filtered or unexported fields } ``` ### func (\*Int64) Add 1.19 ``` func (x *Int64) Add(delta int64) (new int64) ``` Add atomically adds delta to x and returns the new value. ### func (\*Int64) CompareAndSwap 1.19 ``` func (x *Int64) CompareAndSwap(old, new int64) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for x. ### func (\*Int64) Load 1.19 ``` func (x *Int64) Load() int64 ``` Load atomically loads and returns the value stored in x. ### func (\*Int64) Store 1.19 ``` func (x *Int64) Store(val int64) ``` Store atomically stores val into x. ### func (\*Int64) Swap 1.19 ``` func (x *Int64) Swap(new int64) (old int64) ``` Swap atomically stores new into x and returns the previous value. type Pointer ------------ A Pointer is an atomic pointer of type \*T. The zero value is a nil \*T. 
``` type Pointer[T any] struct { // contains filtered or unexported fields } ``` ### func (\*Pointer[T]) CompareAndSwap ``` func (x *Pointer[T]) CompareAndSwap(old, new *T) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for x. ### func (\*Pointer[T]) Load ``` func (x *Pointer[T]) Load() *T ``` Load atomically loads and returns the value stored in x. ### func (\*Pointer[T]) Store ``` func (x *Pointer[T]) Store(val *T) ``` Store atomically stores val into x. ### func (\*Pointer[T]) Swap ``` func (x *Pointer[T]) Swap(new *T) (old *T) ``` Swap atomically stores new into x and returns the previous value. type Uint32 1.19 ---------------- A Uint32 is an atomic uint32. The zero value is zero. ``` type Uint32 struct { // contains filtered or unexported fields } ``` ### func (\*Uint32) Add 1.19 ``` func (x *Uint32) Add(delta uint32) (new uint32) ``` Add atomically adds delta to x and returns the new value. ### func (\*Uint32) CompareAndSwap 1.19 ``` func (x *Uint32) CompareAndSwap(old, new uint32) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for x. ### func (\*Uint32) Load 1.19 ``` func (x *Uint32) Load() uint32 ``` Load atomically loads and returns the value stored in x. ### func (\*Uint32) Store 1.19 ``` func (x *Uint32) Store(val uint32) ``` Store atomically stores val into x. ### func (\*Uint32) Swap 1.19 ``` func (x *Uint32) Swap(new uint32) (old uint32) ``` Swap atomically stores new into x and returns the previous value. type Uint64 1.19 ---------------- A Uint64 is an atomic uint64. The zero value is zero. ``` type Uint64 struct { // contains filtered or unexported fields } ``` ### func (\*Uint64) Add 1.19 ``` func (x *Uint64) Add(delta uint64) (new uint64) ``` Add atomically adds delta to x and returns the new value. ### func (\*Uint64) CompareAndSwap 1.19 ``` func (x *Uint64) CompareAndSwap(old, new uint64) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for x. ### func (\*Uint64) Load 1.19 ``` func (x *Uint64) Load() uint64 ``` Load atomically loads and returns the value stored in x. ### func (\*Uint64) Store 1.19 ``` func (x *Uint64) Store(val uint64) ``` Store atomically stores val into x. ### func (\*Uint64) Swap 1.19 ``` func (x *Uint64) Swap(new uint64) (old uint64) ``` Swap atomically stores new into x and returns the previous value. type Uintptr 1.19 ----------------- A Uintptr is an atomic uintptr. The zero value is zero. ``` type Uintptr struct { // contains filtered or unexported fields } ``` ### func (\*Uintptr) Add 1.19 ``` func (x *Uintptr) Add(delta uintptr) (new uintptr) ``` Add atomically adds delta to x and returns the new value. ### func (\*Uintptr) CompareAndSwap 1.19 ``` func (x *Uintptr) CompareAndSwap(old, new uintptr) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for x. ### func (\*Uintptr) Load 1.19 ``` func (x *Uintptr) Load() uintptr ``` Load atomically loads and returns the value stored in x. ### func (\*Uintptr) Store 1.19 ``` func (x *Uintptr) Store(val uintptr) ``` Store atomically stores val into x. ### func (\*Uintptr) Swap 1.19 ``` func (x *Uintptr) Swap(new uintptr) (old uintptr) ``` Swap atomically stores new into x and returns the previous value. type Value 1.4 -------------- A Value provides an atomic load and store of a consistently typed value. The zero value for a Value returns nil from Load. Once Store has been called, a Value must not be copied. A Value must not be copied after first use.
``` type Value struct { // contains filtered or unexported fields } ``` #### Example (Config) The following example shows how to use Value for periodic program config updates and propagation of the changes to worker goroutines. Code: ``` var config atomic.Value // holds current server configuration // Create initial config value and store into config. config.Store(loadConfig()) go func() { // Reload config every 10 seconds // and update config value with the new version. for { time.Sleep(10 * time.Second) config.Store(loadConfig()) } }() // Create worker goroutines that handle incoming requests // using the latest config value. for i := 0; i < 10; i++ { go func() { for r := range requests() { c := config.Load() // Handle request r using config c. _, _ = r, c } }() } ``` #### Example (ReadMostly) The following example shows how to maintain a scalable frequently read, but infrequently updated data structure using copy-on-write idiom. Code: ``` type Map map[string]string var m atomic.Value m.Store(make(Map)) var mu sync.Mutex // used only by writers // read function can be used to read the data without further synchronization read := func(key string) (val string) { m1 := m.Load().(Map) return m1[key] } // insert function can be used to update the data without further synchronization insert := func(key, val string) { mu.Lock() // synchronize with other potential writers defer mu.Unlock() m1 := m.Load().(Map) // load current value of the data structure m2 := make(Map) // create a new value for k, v := range m1 { m2[k] = v // copy all data from the current object to the new one } m2[key] = val // do the update that we need m.Store(m2) // atomically replace the current object with the new one // At this point all new readers start working with the new version. // The old version will be garbage collected once the existing readers // (if any) are done with it. } _, _ = read, insert ``` ### func (\*Value) CompareAndSwap 1.17 ``` func (v *Value) CompareAndSwap(old, new any) (swapped bool) ``` CompareAndSwap executes the compare-and-swap operation for the Value. All calls to CompareAndSwap for a given Value must use values of the same concrete type. CompareAndSwap of an inconsistent type panics, as does CompareAndSwap(old, nil). ### func (\*Value) Load 1.4 ``` func (v *Value) Load() (val any) ``` Load returns the value set by the most recent Store. It returns nil if there has been no call to Store for this Value. ### func (\*Value) Store 1.4 ``` func (v *Value) Store(val any) ``` Store sets the value of the Value v to val. All calls to Store for a given Value must use values of the same concrete type. Store of an inconsistent type panics, as does Store(nil). ### func (\*Value) Swap 1.17 ``` func (v *Value) Swap(new any) (old any) ``` Swap stores new into Value and returns the previous value. It returns nil if the Value is empty. All calls to Swap for a given Value must use values of the same concrete type. Swap of an inconsistent type panics, as does Swap(nil). Bugs ---- * ☞ On 386, the 64-bit functions use instructions unavailable before the Pentium MMX. On non-Linux ARM, the 64-bit functions use instructions unavailable before the ARMv6k core. On ARM, 386, and 32-bit MIPS, it is the caller's responsibility to arrange for 64-bit alignment of 64-bit words accessed atomically via the primitive atomic functions (types [Int64](#Int64) and [Uint64](#Uint64) are automatically aligned). 
The first word in an allocated struct, array, or slice; in a global variable; or in a local variable (because the subject of all atomic operations will escape to the heap) can be relied upon to be 64-bit aligned.
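A minimal sketch of what this guarantee means in practice on 32-bit targets (the struct and field names below are invented for illustration): either keep the 64-bit word first in the struct so the rule above applies, or use the typed values, which are aligned automatically.

```
package main

import "sync/atomic"

// counters relies on the rule above: hits is the first word of an allocated
// struct, so it is 64-bit aligned even on ARM, 386, and 32-bit MIPS, and the
// primitive 64-bit functions may be used on it.
type counters struct {
	hits  int64 // must stay the first field for that guarantee to hold
	label string
}

// counters2 sidesteps the alignment rules entirely: atomic.Int64 is
// automatically aligned no matter where it appears in the struct.
type counters2 struct {
	label string
	hits  atomic.Int64
}

func main() {
	c := &counters{}
	atomic.AddInt64(&c.hits, 1)

	c2 := &counters2{}
	c2.hits.Add(1)
}
```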
go Package syscall Package syscall ================ * `import "syscall"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Subdirectories](#pkg-subdirectories) Overview -------- Package syscall contains an interface to the low-level operating system primitives. The details vary depending on the underlying system, and by default, godoc will display the syscall documentation for the current system. If you want godoc to display syscall documentation for another system, set $GOOS and $GOARCH to the desired system. For example, if you want to view documentation for freebsd/arm on linux/amd64, set $GOOS to freebsd and $GOARCH to arm. The primary use of syscall is inside other packages that provide a more portable interface to the system, such as "os", "time" and "net". Use those packages rather than this one if you can. For details of the functions and data types in this package consult the manuals for the appropriate operating system. These calls return err == nil to indicate success; otherwise err is an operating system error describing the failure. On most systems, that error has type syscall.Errno. Deprecated: this package is locked down. Callers should use the corresponding package in the golang.org/x/sys repository instead. That is also where updates required by new systems or versions should be applied. See <https://golang.org/s/go1.4-syscall> for more information. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func Access(path string, mode uint32) (err error)](#Access) * [func Acct(path string) (err error)](#Acct) * [func Adjtimex(buf \*Timex) (state int, err error)](#Adjtimex) * [func AttachLsf(fd int, i []SockFilter) error](#AttachLsf) * [func Bind(fd int, sa Sockaddr) (err error)](#Bind) * [func BindToDevice(fd int, device string) (err error)](#BindToDevice) * [func BytePtrFromString(s string) (\*byte, error)](#BytePtrFromString) * [func ByteSliceFromString(s string) ([]byte, error)](#ByteSliceFromString) * [func Chdir(path string) (err error)](#Chdir) * [func Chmod(path string, mode uint32) (err error)](#Chmod) * [func Chown(path string, uid int, gid int) (err error)](#Chown) * [func Chroot(path string) (err error)](#Chroot) * [func Clearenv()](#Clearenv) * [func Close(fd int) (err error)](#Close) * [func CloseOnExec(fd int)](#CloseOnExec) * [func CmsgLen(datalen int) int](#CmsgLen) * [func CmsgSpace(datalen int) int](#CmsgSpace) * [func Connect(fd int, sa Sockaddr) (err error)](#Connect) * [func Creat(path string, mode uint32) (fd int, err error)](#Creat) * [func DetachLsf(fd int) error](#DetachLsf) * [func Dup(oldfd int) (fd int, err error)](#Dup) * [func Dup2(oldfd int, newfd int) (err error)](#Dup2) * [func Dup3(oldfd int, newfd int, flags int) (err error)](#Dup3) * [func Environ() []string](#Environ) * [func EpollCreate(size int) (fd int, err error)](#EpollCreate) * [func EpollCreate1(flag int) (fd int, err error)](#EpollCreate1) * [func EpollCtl(epfd int, op int, fd int, event \*EpollEvent) (err error)](#EpollCtl) * [func EpollWait(epfd int, events []EpollEvent, msec int) (n int, err error)](#EpollWait) * [func Exec(argv0 string, argv []string, envv []string) (err error)](#Exec) * [func Exit(code int)](#Exit) * [func Faccessat(dirfd int, path string, mode uint32, flags int) (err error)](#Faccessat) * [func Fallocate(fd int, mode uint32, off int64, len int64) (err error)](#Fallocate) * [func Fchdir(fd int) (err error)](#Fchdir) * [func Fchmod(fd int, mode uint32) (err error)](#Fchmod) * [func Fchmodat(dirfd int, path string, mode uint32, flags int) 
(err error)](#Fchmodat) * [func Fchown(fd int, uid int, gid int) (err error)](#Fchown) * [func Fchownat(dirfd int, path string, uid int, gid int, flags int) (err error)](#Fchownat) * [func FcntlFlock(fd uintptr, cmd int, lk \*Flock\_t) error](#FcntlFlock) * [func Fdatasync(fd int) (err error)](#Fdatasync) * [func Flock(fd int, how int) (err error)](#Flock) * [func ForkExec(argv0 string, argv []string, attr \*ProcAttr) (pid int, err error)](#ForkExec) * [func Fstat(fd int, stat \*Stat\_t) (err error)](#Fstat) * [func Fstatfs(fd int, buf \*Statfs\_t) (err error)](#Fstatfs) * [func Fsync(fd int) (err error)](#Fsync) * [func Ftruncate(fd int, length int64) (err error)](#Ftruncate) * [func Futimes(fd int, tv []Timeval) (err error)](#Futimes) * [func Futimesat(dirfd int, path string, tv []Timeval) (err error)](#Futimesat) * [func Getcwd(buf []byte) (n int, err error)](#Getcwd) * [func Getdents(fd int, buf []byte) (n int, err error)](#Getdents) * [func Getegid() (egid int)](#Getegid) * [func Getenv(key string) (value string, found bool)](#Getenv) * [func Geteuid() (euid int)](#Geteuid) * [func Getgid() (gid int)](#Getgid) * [func Getgroups() (gids []int, err error)](#Getgroups) * [func Getpagesize() int](#Getpagesize) * [func Getpgid(pid int) (pgid int, err error)](#Getpgid) * [func Getpgrp() (pid int)](#Getpgrp) * [func Getpid() (pid int)](#Getpid) * [func Getppid() (ppid int)](#Getppid) * [func Getpriority(which int, who int) (prio int, err error)](#Getpriority) * [func Getrlimit(resource int, rlim \*Rlimit) (err error)](#Getrlimit) * [func Getrusage(who int, rusage \*Rusage) (err error)](#Getrusage) * [func GetsockoptInet4Addr(fd, level, opt int) (value [4]byte, err error)](#GetsockoptInet4Addr) * [func GetsockoptInt(fd, level, opt int) (value int, err error)](#GetsockoptInt) * [func Gettid() (tid int)](#Gettid) * [func Gettimeofday(tv \*Timeval) (err error)](#Gettimeofday) * [func Getuid() (uid int)](#Getuid) * [func Getwd() (wd string, err error)](#Getwd) * [func Getxattr(path string, attr string, dest []byte) (sz int, err error)](#Getxattr) * [func InotifyAddWatch(fd int, pathname string, mask uint32) (watchdesc int, err error)](#InotifyAddWatch) * [func InotifyInit() (fd int, err error)](#InotifyInit) * [func InotifyInit1(flags int) (fd int, err error)](#InotifyInit1) * [func InotifyRmWatch(fd int, watchdesc uint32) (success int, err error)](#InotifyRmWatch) * [func Ioperm(from int, num int, on int) (err error)](#Ioperm) * [func Iopl(level int) (err error)](#Iopl) * [func Kill(pid int, sig Signal) (err error)](#Kill) * [func Klogctl(typ int, buf []byte) (n int, err error)](#Klogctl) * [func Lchown(path string, uid int, gid int) (err error)](#Lchown) * [func Link(oldpath string, newpath string) (err error)](#Link) * [func Listen(s int, n int) (err error)](#Listen) * [func Listxattr(path string, dest []byte) (sz int, err error)](#Listxattr) * [func LsfSocket(ifindex, proto int) (int, error)](#LsfSocket) * [func Lstat(path string, stat \*Stat\_t) (err error)](#Lstat) * [func Madvise(b []byte, advice int) (err error)](#Madvise) * [func Mkdir(path string, mode uint32) (err error)](#Mkdir) * [func Mkdirat(dirfd int, path string, mode uint32) (err error)](#Mkdirat) * [func Mkfifo(path string, mode uint32) (err error)](#Mkfifo) * [func Mknod(path string, mode uint32, dev int) (err error)](#Mknod) * [func Mknodat(dirfd int, path string, mode uint32, dev int) (err error)](#Mknodat) * [func Mlock(b []byte) (err error)](#Mlock) * [func Mlockall(flags int) (err error)](#Mlockall) * [func Mmap(fd int, 
offset int64, length int, prot int, flags int) (data []byte, err error)](#Mmap) * [func Mount(source string, target string, fstype string, flags uintptr, data string) (err error)](#Mount) * [func Mprotect(b []byte, prot int) (err error)](#Mprotect) * [func Munlock(b []byte) (err error)](#Munlock) * [func Munlockall() (err error)](#Munlockall) * [func Munmap(b []byte) (err error)](#Munmap) * [func Nanosleep(time \*Timespec, leftover \*Timespec) (err error)](#Nanosleep) * [func NetlinkRIB(proto, family int) ([]byte, error)](#NetlinkRIB) * [func Open(path string, mode int, perm uint32) (fd int, err error)](#Open) * [func Openat(dirfd int, path string, flags int, mode uint32) (fd int, err error)](#Openat) * [func ParseDirent(buf []byte, max int, names []string) (consumed int, count int, newnames []string)](#ParseDirent) * [func ParseUnixRights(m \*SocketControlMessage) ([]int, error)](#ParseUnixRights) * [func Pause() (err error)](#Pause) * [func Pipe(p []int) error](#Pipe) * [func Pipe2(p []int, flags int) error](#Pipe2) * [func PivotRoot(newroot string, putold string) (err error)](#PivotRoot) * [func Pread(fd int, p []byte, offset int64) (n int, err error)](#Pread) * [func PtraceAttach(pid int) (err error)](#PtraceAttach) * [func PtraceCont(pid int, signal int) (err error)](#PtraceCont) * [func PtraceDetach(pid int) (err error)](#PtraceDetach) * [func PtraceGetEventMsg(pid int) (msg uint, err error)](#PtraceGetEventMsg) * [func PtraceGetRegs(pid int, regsout \*PtraceRegs) (err error)](#PtraceGetRegs) * [func PtracePeekData(pid int, addr uintptr, out []byte) (count int, err error)](#PtracePeekData) * [func PtracePeekText(pid int, addr uintptr, out []byte) (count int, err error)](#PtracePeekText) * [func PtracePokeData(pid int, addr uintptr, data []byte) (count int, err error)](#PtracePokeData) * [func PtracePokeText(pid int, addr uintptr, data []byte) (count int, err error)](#PtracePokeText) * [func PtraceSetOptions(pid int, options int) (err error)](#PtraceSetOptions) * [func PtraceSetRegs(pid int, regs \*PtraceRegs) (err error)](#PtraceSetRegs) * [func PtraceSingleStep(pid int) (err error)](#PtraceSingleStep) * [func PtraceSyscall(pid int, signal int) (err error)](#PtraceSyscall) * [func Pwrite(fd int, p []byte, offset int64) (n int, err error)](#Pwrite) * [func Read(fd int, p []byte) (n int, err error)](#Read) * [func ReadDirent(fd int, buf []byte) (n int, err error)](#ReadDirent) * [func Readlink(path string, buf []byte) (n int, err error)](#Readlink) * [func Reboot(cmd int) (err error)](#Reboot) * [func Removexattr(path string, attr string) (err error)](#Removexattr) * [func Rename(oldpath string, newpath string) (err error)](#Rename) * [func Renameat(olddirfd int, oldpath string, newdirfd int, newpath string) (err error)](#Renameat) * [func Rmdir(path string) error](#Rmdir) * [func Seek(fd int, offset int64, whence int) (off int64, err error)](#Seek) * [func Select(nfd int, r \*FdSet, w \*FdSet, e \*FdSet, timeout \*Timeval) (n int, err error)](#Select) * [func Sendfile(outfd int, infd int, offset \*int64, count int) (written int, err error)](#Sendfile) * [func Sendmsg(fd int, p, oob []byte, to Sockaddr, flags int) (err error)](#Sendmsg) * [func SendmsgN(fd int, p, oob []byte, to Sockaddr, flags int) (n int, err error)](#SendmsgN) * [func Sendto(fd int, p []byte, flags int, to Sockaddr) (err error)](#Sendto) * [func SetLsfPromisc(name string, m bool) error](#SetLsfPromisc) * [func SetNonblock(fd int, nonblocking bool) (err error)](#SetNonblock) * [func Setdomainname(p []byte) (err 
error)](#Setdomainname) * [func Setegid(egid int) (err error)](#Setegid) * [func Setenv(key, value string) error](#Setenv) * [func Seteuid(euid int) (err error)](#Seteuid) * [func Setfsgid(gid int) (err error)](#Setfsgid) * [func Setfsuid(uid int) (err error)](#Setfsuid) * [func Setgid(gid int) (err error)](#Setgid) * [func Setgroups(gids []int) (err error)](#Setgroups) * [func Sethostname(p []byte) (err error)](#Sethostname) * [func Setpgid(pid int, pgid int) (err error)](#Setpgid) * [func Setpriority(which int, who int, prio int) (err error)](#Setpriority) * [func Setregid(rgid, egid int) (err error)](#Setregid) * [func Setresgid(rgid, egid, sgid int) (err error)](#Setresgid) * [func Setresuid(ruid, euid, suid int) (err error)](#Setresuid) * [func Setreuid(ruid, euid int) (err error)](#Setreuid) * [func Setrlimit(resource int, rlim \*Rlimit) (err error)](#Setrlimit) * [func Setsid() (pid int, err error)](#Setsid) * [func SetsockoptByte(fd, level, opt int, value byte) (err error)](#SetsockoptByte) * [func SetsockoptICMPv6Filter(fd, level, opt int, filter \*ICMPv6Filter) error](#SetsockoptICMPv6Filter) * [func SetsockoptIPMreq(fd, level, opt int, mreq \*IPMreq) (err error)](#SetsockoptIPMreq) * [func SetsockoptIPMreqn(fd, level, opt int, mreq \*IPMreqn) (err error)](#SetsockoptIPMreqn) * [func SetsockoptIPv6Mreq(fd, level, opt int, mreq \*IPv6Mreq) (err error)](#SetsockoptIPv6Mreq) * [func SetsockoptInet4Addr(fd, level, opt int, value [4]byte) (err error)](#SetsockoptInet4Addr) * [func SetsockoptInt(fd, level, opt int, value int) (err error)](#SetsockoptInt) * [func SetsockoptLinger(fd, level, opt int, l \*Linger) (err error)](#SetsockoptLinger) * [func SetsockoptString(fd, level, opt int, s string) (err error)](#SetsockoptString) * [func SetsockoptTimeval(fd, level, opt int, tv \*Timeval) (err error)](#SetsockoptTimeval) * [func Settimeofday(tv \*Timeval) (err error)](#Settimeofday) * [func Setuid(uid int) (err error)](#Setuid) * [func Setxattr(path string, attr string, data []byte, flags int) (err error)](#Setxattr) * [func Shutdown(fd int, how int) (err error)](#Shutdown) * [func SlicePtrFromStrings(ss []string) ([]\*byte, error)](#SlicePtrFromStrings) * [func Socket(domain, typ, proto int) (fd int, err error)](#Socket) * [func Socketpair(domain, typ, proto int) (fd [2]int, err error)](#Socketpair) * [func Splice(rfd int, roff \*int64, wfd int, woff \*int64, len int, flags int) (n int64, err error)](#Splice) * [func StartProcess(argv0 string, argv []string, attr \*ProcAttr) (pid int, handle uintptr, err error)](#StartProcess) * [func Stat(path string, stat \*Stat\_t) (err error)](#Stat) * [func Statfs(path string, buf \*Statfs\_t) (err error)](#Statfs) * [func StringBytePtr(s string) \*byte](#StringBytePtr) * [func StringByteSlice(s string) []byte](#StringByteSlice) * [func StringSlicePtr(ss []string) []\*byte](#StringSlicePtr) * [func Symlink(oldpath string, newpath string) (err error)](#Symlink) * [func Sync()](#Sync) * [func SyncFileRange(fd int, off int64, n int64, flags int) (err error)](#SyncFileRange) * [func Sysinfo(info \*Sysinfo\_t) (err error)](#Sysinfo) * [func Tee(rfd int, wfd int, len int, flags int) (n int64, err error)](#Tee) * [func Tgkill(tgid int, tid int, sig Signal) (err error)](#Tgkill) * [func Times(tms \*Tms) (ticks uintptr, err error)](#Times) * [func TimespecToNsec(ts Timespec) int64](#TimespecToNsec) * [func TimevalToNsec(tv Timeval) int64](#TimevalToNsec) * [func Truncate(path string, length int64) (err error)](#Truncate) * [func Umask(mask int) (oldmask 
int)](#Umask) * [func Uname(buf \*Utsname) (err error)](#Uname) * [func UnixCredentials(ucred \*Ucred) []byte](#UnixCredentials) * [func UnixRights(fds ...int) []byte](#UnixRights) * [func Unlink(path string) error](#Unlink) * [func Unlinkat(dirfd int, path string) error](#Unlinkat) * [func Unmount(target string, flags int) (err error)](#Unmount) * [func Unsetenv(key string) error](#Unsetenv) * [func Unshare(flags int) (err error)](#Unshare) * [func Ustat(dev int, ubuf \*Ustat\_t) (err error)](#Ustat) * [func Utime(path string, buf \*Utimbuf) (err error)](#Utime) * [func Utimes(path string, tv []Timeval) (err error)](#Utimes) * [func UtimesNano(path string, ts []Timespec) (err error)](#UtimesNano) * [func Wait4(pid int, wstatus \*WaitStatus, options int, rusage \*Rusage) (wpid int, err error)](#Wait4) * [func Write(fd int, p []byte) (n int, err error)](#Write) * [type Cmsghdr](#Cmsghdr) * [func (cmsg \*Cmsghdr) SetLen(length int)](#Cmsghdr.SetLen) * [type Conn](#Conn) * [type Credential](#Credential) * [type Dirent](#Dirent) * [type EpollEvent](#EpollEvent) * [type Errno](#Errno) * [func AllThreadsSyscall(trap, a1, a2, a3 uintptr) (r1, r2 uintptr, err Errno)](#AllThreadsSyscall) * [func AllThreadsSyscall6(trap, a1, a2, a3, a4, a5, a6 uintptr) (r1, r2 uintptr, err Errno)](#AllThreadsSyscall6) * [func RawSyscall(trap, a1, a2, a3 uintptr) (r1, r2 uintptr, err Errno)](#RawSyscall) * [func RawSyscall6(trap, a1, a2, a3, a4, a5, a6 uintptr) (r1, r2 uintptr, err Errno)](#RawSyscall6) * [func Syscall(trap, a1, a2, a3 uintptr) (r1, r2 uintptr, err Errno)](#Syscall) * [func Syscall6(trap, a1, a2, a3, a4, a5, a6 uintptr) (r1, r2 uintptr, err Errno)](#Syscall6) * [func (e Errno) Error() string](#Errno.Error) * [func (e Errno) Is(target error) bool](#Errno.Is) * [func (e Errno) Temporary() bool](#Errno.Temporary) * [func (e Errno) Timeout() bool](#Errno.Timeout) * [type FdSet](#FdSet) * [type Flock\_t](#Flock_t) * [type Fsid](#Fsid) * [type ICMPv6Filter](#ICMPv6Filter) * [func GetsockoptICMPv6Filter(fd, level, opt int) (\*ICMPv6Filter, error)](#GetsockoptICMPv6Filter) * [type IPMreq](#IPMreq) * [func GetsockoptIPMreq(fd, level, opt int) (\*IPMreq, error)](#GetsockoptIPMreq) * [type IPMreqn](#IPMreqn) * [func GetsockoptIPMreqn(fd, level, opt int) (\*IPMreqn, error)](#GetsockoptIPMreqn) * [type IPv6MTUInfo](#IPv6MTUInfo) * [func GetsockoptIPv6MTUInfo(fd, level, opt int) (\*IPv6MTUInfo, error)](#GetsockoptIPv6MTUInfo) * [type IPv6Mreq](#IPv6Mreq) * [func GetsockoptIPv6Mreq(fd, level, opt int) (\*IPv6Mreq, error)](#GetsockoptIPv6Mreq) * [type IfAddrmsg](#IfAddrmsg) * [type IfInfomsg](#IfInfomsg) * [type Inet4Pktinfo](#Inet4Pktinfo) * [type Inet6Pktinfo](#Inet6Pktinfo) * [type InotifyEvent](#InotifyEvent) * [type Iovec](#Iovec) * [func (iov \*Iovec) SetLen(length int)](#Iovec.SetLen) * [type Linger](#Linger) * [type Msghdr](#Msghdr) * [func (msghdr \*Msghdr) SetControllen(length int)](#Msghdr.SetControllen) * [type NetlinkMessage](#NetlinkMessage) * [func ParseNetlinkMessage(b []byte) ([]NetlinkMessage, error)](#ParseNetlinkMessage) * [type NetlinkRouteAttr](#NetlinkRouteAttr) * [func ParseNetlinkRouteAttr(m \*NetlinkMessage) ([]NetlinkRouteAttr, error)](#ParseNetlinkRouteAttr) * [type NetlinkRouteRequest](#NetlinkRouteRequest) * [type NlAttr](#NlAttr) * [type NlMsgerr](#NlMsgerr) * [type NlMsghdr](#NlMsghdr) * [type ProcAttr](#ProcAttr) * [type PtraceRegs](#PtraceRegs) * [func (r \*PtraceRegs) PC() uint64](#PtraceRegs.PC) * [func (r \*PtraceRegs) SetPC(pc uint64)](#PtraceRegs.SetPC) * [type 
RawConn](#RawConn) * [type RawSockaddr](#RawSockaddr) * [type RawSockaddrAny](#RawSockaddrAny) * [type RawSockaddrInet4](#RawSockaddrInet4) * [type RawSockaddrInet6](#RawSockaddrInet6) * [type RawSockaddrLinklayer](#RawSockaddrLinklayer) * [type RawSockaddrNetlink](#RawSockaddrNetlink) * [type RawSockaddrUnix](#RawSockaddrUnix) * [type Rlimit](#Rlimit) * [type RtAttr](#RtAttr) * [type RtGenmsg](#RtGenmsg) * [type RtMsg](#RtMsg) * [type RtNexthop](#RtNexthop) * [type Rusage](#Rusage) * [type Signal](#Signal) * [func (s Signal) Signal()](#Signal.Signal) * [func (s Signal) String() string](#Signal.String) * [type SockFilter](#SockFilter) * [func LsfJump(code, k, jt, jf int) \*SockFilter](#LsfJump) * [func LsfStmt(code, k int) \*SockFilter](#LsfStmt) * [type SockFprog](#SockFprog) * [type Sockaddr](#Sockaddr) * [func Accept(fd int) (nfd int, sa Sockaddr, err error)](#Accept) * [func Accept4(fd int, flags int) (nfd int, sa Sockaddr, err error)](#Accept4) * [func Getpeername(fd int) (sa Sockaddr, err error)](#Getpeername) * [func Getsockname(fd int) (sa Sockaddr, err error)](#Getsockname) * [func Recvfrom(fd int, p []byte, flags int) (n int, from Sockaddr, err error)](#Recvfrom) * [func Recvmsg(fd int, p, oob []byte, flags int) (n, oobn int, recvflags int, from Sockaddr, err error)](#Recvmsg) * [type SockaddrInet4](#SockaddrInet4) * [type SockaddrInet6](#SockaddrInet6) * [type SockaddrLinklayer](#SockaddrLinklayer) * [type SockaddrNetlink](#SockaddrNetlink) * [type SockaddrUnix](#SockaddrUnix) * [type SocketControlMessage](#SocketControlMessage) * [func ParseSocketControlMessage(b []byte) ([]SocketControlMessage, error)](#ParseSocketControlMessage) * [type Stat\_t](#Stat_t) * [type Statfs\_t](#Statfs_t) * [type SysProcAttr](#SysProcAttr) * [type SysProcIDMap](#SysProcIDMap) * [type Sysinfo\_t](#Sysinfo_t) * [type TCPInfo](#TCPInfo) * [type Termios](#Termios) * [type Time\_t](#Time_t) * [func Time(t \*Time\_t) (tt Time\_t, err error)](#Time) * [type Timespec](#Timespec) * [func NsecToTimespec(nsec int64) Timespec](#NsecToTimespec) * [func (ts \*Timespec) Nano() int64](#Timespec.Nano) * [func (ts \*Timespec) Unix() (sec int64, nsec int64)](#Timespec.Unix) * [type Timeval](#Timeval) * [func NsecToTimeval(nsec int64) Timeval](#NsecToTimeval) * [func (tv \*Timeval) Nano() int64](#Timeval.Nano) * [func (tv \*Timeval) Unix() (sec int64, nsec int64)](#Timeval.Unix) * [type Timex](#Timex) * [type Tms](#Tms) * [type Ucred](#Ucred) * [func GetsockoptUcred(fd, level, opt int) (\*Ucred, error)](#GetsockoptUcred) * [func ParseUnixCredentials(m \*SocketControlMessage) (\*Ucred, error)](#ParseUnixCredentials) * [type Ustat\_t](#Ustat_t) * [type Utimbuf](#Utimbuf) * [type Utsname](#Utsname) * [type WaitStatus](#WaitStatus) * [func (w WaitStatus) Continued() bool](#WaitStatus.Continued) * [func (w WaitStatus) CoreDump() bool](#WaitStatus.CoreDump) * [func (w WaitStatus) ExitStatus() int](#WaitStatus.ExitStatus) * [func (w WaitStatus) Exited() bool](#WaitStatus.Exited) * [func (w WaitStatus) Signal() Signal](#WaitStatus.Signal) * [func (w WaitStatus) Signaled() bool](#WaitStatus.Signaled) * [func (w WaitStatus) StopSignal() Signal](#WaitStatus.StopSignal) * [func (w WaitStatus) Stopped() bool](#WaitStatus.Stopped) * [func (w WaitStatus) TrapCause() int](#WaitStatus.TrapCause) ### Package files asan0.go dirent.go endian\_little.go env\_unix.go exec\_linux.go exec\_unix.go flock.go lsf\_linux.go msan0.go net.go netlink\_linux.go setuidgid\_linux.go sockcmsg\_linux.go sockcmsg\_unix.go sockcmsg\_unix\_other.go 
syscall.go syscall\_linux.go syscall\_linux\_accept4.go syscall\_linux\_amd64.go syscall\_unix.go time\_nofake.go timestruct.go zerrors\_linux\_amd64.go zsyscall\_linux\_amd64.go zsysnum\_linux\_amd64.go ztypes\_linux\_amd64.go Constants --------- Linux unshare/clone/clone2/clone3 flags, architecture-independent, copied from linux/sched.h. ``` const ( CLONE_VM = 0x00000100 // set if VM shared between processes CLONE_FS = 0x00000200 // set if fs info shared between processes CLONE_FILES = 0x00000400 // set if open files shared between processes CLONE_SIGHAND = 0x00000800 // set if signal handlers and blocked signals shared CLONE_PIDFD = 0x00001000 // set if a pidfd should be placed in parent CLONE_PTRACE = 0x00002000 // set if we want to let tracing continue on the child too CLONE_VFORK = 0x00004000 // set if the parent wants the child to wake it up on mm_release CLONE_PARENT = 0x00008000 // set if we want to have the same parent as the cloner CLONE_THREAD = 0x00010000 // Same thread group? CLONE_NEWNS = 0x00020000 // New mount namespace group CLONE_SYSVSEM = 0x00040000 // share system V SEM_UNDO semantics CLONE_SETTLS = 0x00080000 // create a new TLS for the child CLONE_PARENT_SETTID = 0x00100000 // set the TID in the parent CLONE_CHILD_CLEARTID = 0x00200000 // clear the TID in the child CLONE_DETACHED = 0x00400000 // Unused, ignored CLONE_UNTRACED = 0x00800000 // set if the tracing process can't force CLONE_PTRACE on this clone CLONE_CHILD_SETTID = 0x01000000 // set the TID in the child CLONE_NEWCGROUP = 0x02000000 // New cgroup namespace CLONE_NEWUTS = 0x04000000 // New utsname namespace CLONE_NEWIPC = 0x08000000 // New ipc namespace CLONE_NEWUSER = 0x10000000 // New user namespace CLONE_NEWPID = 0x20000000 // New pid namespace CLONE_NEWNET = 0x40000000 // New network namespace CLONE_IO = 0x80000000 // Clone io context CLONE_CLEAR_SIGHAND = 0x100000000 // Clear any signal handler and reset to SIG_DFL. CLONE_INTO_CGROUP = 0x200000000 // Clone into a specific cgroup given the right permissions. 
CLONE_NEWTIME = 0x00000080 // New time namespace ) ``` ``` const ( AF_ALG = 0x26 AF_APPLETALK = 0x5 AF_ASH = 0x12 AF_ATMPVC = 0x8 AF_ATMSVC = 0x14 AF_AX25 = 0x3 AF_BLUETOOTH = 0x1f AF_BRIDGE = 0x7 AF_CAIF = 0x25 AF_CAN = 0x1d AF_DECnet = 0xc AF_ECONET = 0x13 AF_FILE = 0x1 AF_IEEE802154 = 0x24 AF_INET = 0x2 AF_INET6 = 0xa AF_IPX = 0x4 AF_IRDA = 0x17 AF_ISDN = 0x22 AF_IUCV = 0x20 AF_KEY = 0xf AF_LLC = 0x1a AF_LOCAL = 0x1 AF_MAX = 0x27 AF_NETBEUI = 0xd AF_NETLINK = 0x10 AF_NETROM = 0x6 AF_PACKET = 0x11 AF_PHONET = 0x23 AF_PPPOX = 0x18 AF_RDS = 0x15 AF_ROSE = 0xb AF_ROUTE = 0x10 AF_RXRPC = 0x21 AF_SECURITY = 0xe AF_SNA = 0x16 AF_TIPC = 0x1e AF_UNIX = 0x1 AF_UNSPEC = 0x0 AF_WANPIPE = 0x19 AF_X25 = 0x9 ARPHRD_ADAPT = 0x108 ARPHRD_APPLETLK = 0x8 ARPHRD_ARCNET = 0x7 ARPHRD_ASH = 0x30d ARPHRD_ATM = 0x13 ARPHRD_AX25 = 0x3 ARPHRD_BIF = 0x307 ARPHRD_CHAOS = 0x5 ARPHRD_CISCO = 0x201 ARPHRD_CSLIP = 0x101 ARPHRD_CSLIP6 = 0x103 ARPHRD_DDCMP = 0x205 ARPHRD_DLCI = 0xf ARPHRD_ECONET = 0x30e ARPHRD_EETHER = 0x2 ARPHRD_ETHER = 0x1 ARPHRD_EUI64 = 0x1b ARPHRD_FCAL = 0x311 ARPHRD_FCFABRIC = 0x313 ARPHRD_FCPL = 0x312 ARPHRD_FCPP = 0x310 ARPHRD_FDDI = 0x306 ARPHRD_FRAD = 0x302 ARPHRD_HDLC = 0x201 ARPHRD_HIPPI = 0x30c ARPHRD_HWX25 = 0x110 ARPHRD_IEEE1394 = 0x18 ARPHRD_IEEE802 = 0x6 ARPHRD_IEEE80211 = 0x321 ARPHRD_IEEE80211_PRISM = 0x322 ARPHRD_IEEE80211_RADIOTAP = 0x323 ARPHRD_IEEE802154 = 0x324 ARPHRD_IEEE802154_PHY = 0x325 ARPHRD_IEEE802_TR = 0x320 ARPHRD_INFINIBAND = 0x20 ARPHRD_IPDDP = 0x309 ARPHRD_IPGRE = 0x30a ARPHRD_IRDA = 0x30f ARPHRD_LAPB = 0x204 ARPHRD_LOCALTLK = 0x305 ARPHRD_LOOPBACK = 0x304 ARPHRD_METRICOM = 0x17 ARPHRD_NETROM = 0x0 ARPHRD_NONE = 0xfffe ARPHRD_PIMREG = 0x30b ARPHRD_PPP = 0x200 ARPHRD_PRONET = 0x4 ARPHRD_RAWHDLC = 0x206 ARPHRD_ROSE = 0x10e ARPHRD_RSRVD = 0x104 ARPHRD_SIT = 0x308 ARPHRD_SKIP = 0x303 ARPHRD_SLIP = 0x100 ARPHRD_SLIP6 = 0x102 ARPHRD_TUNNEL = 0x300 ARPHRD_TUNNEL6 = 0x301 ARPHRD_VOID = 0xffff ARPHRD_X25 = 0x10f BPF_A = 0x10 BPF_ABS = 0x20 BPF_ADD = 0x0 BPF_ALU = 0x4 BPF_AND = 0x50 BPF_B = 0x10 BPF_DIV = 0x30 BPF_H = 0x8 BPF_IMM = 0x0 BPF_IND = 0x40 BPF_JA = 0x0 BPF_JEQ = 0x10 BPF_JGE = 0x30 BPF_JGT = 0x20 BPF_JMP = 0x5 BPF_JSET = 0x40 BPF_K = 0x0 BPF_LD = 0x0 BPF_LDX = 0x1 BPF_LEN = 0x80 BPF_LSH = 0x60 BPF_MAJOR_VERSION = 0x1 BPF_MAXINSNS = 0x1000 BPF_MEM = 0x60 BPF_MEMWORDS = 0x10 BPF_MINOR_VERSION = 0x1 BPF_MISC = 0x7 BPF_MSH = 0xa0 BPF_MUL = 0x20 BPF_NEG = 0x80 BPF_OR = 0x40 BPF_RET = 0x6 BPF_RSH = 0x70 BPF_ST = 0x2 BPF_STX = 0x3 BPF_SUB = 0x10 BPF_TAX = 0x0 BPF_TXA = 0x80 BPF_W = 0x0 BPF_X = 0x8 DT_BLK = 0x6 DT_CHR = 0x2 DT_DIR = 0x4 DT_FIFO = 0x1 DT_LNK = 0xa DT_REG = 0x8 DT_SOCK = 0xc DT_UNKNOWN = 0x0 DT_WHT = 0xe EPOLLERR = 0x8 EPOLLET = -0x80000000 EPOLLHUP = 0x10 EPOLLIN = 0x1 EPOLLMSG = 0x400 EPOLLONESHOT = 0x40000000 EPOLLOUT = 0x4 EPOLLPRI = 0x2 EPOLLRDBAND = 0x80 EPOLLRDHUP = 0x2000 EPOLLRDNORM = 0x40 EPOLLWRBAND = 0x200 EPOLLWRNORM = 0x100 EPOLL_CLOEXEC = 0x80000 EPOLL_CTL_ADD = 0x1 EPOLL_CTL_DEL = 0x2 EPOLL_CTL_MOD = 0x3 EPOLL_NONBLOCK = 0x800 ETH_P_1588 = 0x88f7 ETH_P_8021Q = 0x8100 ETH_P_802_2 = 0x4 ETH_P_802_3 = 0x1 ETH_P_AARP = 0x80f3 ETH_P_ALL = 0x3 ETH_P_AOE = 0x88a2 ETH_P_ARCNET = 0x1a ETH_P_ARP = 0x806 ETH_P_ATALK = 0x809b ETH_P_ATMFATE = 0x8884 ETH_P_ATMMPOA = 0x884c ETH_P_AX25 = 0x2 ETH_P_BPQ = 0x8ff ETH_P_CAIF = 0xf7 ETH_P_CAN = 0xc ETH_P_CONTROL = 0x16 ETH_P_CUST = 0x6006 ETH_P_DDCMP = 0x6 ETH_P_DEC = 0x6000 ETH_P_DIAG = 0x6005 ETH_P_DNA_DL = 0x6001 ETH_P_DNA_RC = 0x6002 ETH_P_DNA_RT = 0x6003 ETH_P_DSA = 0x1b ETH_P_ECONET = 0x18 ETH_P_EDSA = 0xdada 
ETH_P_FCOE = 0x8906 ETH_P_FIP = 0x8914 ETH_P_HDLC = 0x19 ETH_P_IEEE802154 = 0xf6 ETH_P_IEEEPUP = 0xa00 ETH_P_IEEEPUPAT = 0xa01 ETH_P_IP = 0x800 ETH_P_IPV6 = 0x86dd ETH_P_IPX = 0x8137 ETH_P_IRDA = 0x17 ETH_P_LAT = 0x6004 ETH_P_LINK_CTL = 0x886c ETH_P_LOCALTALK = 0x9 ETH_P_LOOP = 0x60 ETH_P_MOBITEX = 0x15 ETH_P_MPLS_MC = 0x8848 ETH_P_MPLS_UC = 0x8847 ETH_P_PAE = 0x888e ETH_P_PAUSE = 0x8808 ETH_P_PHONET = 0xf5 ETH_P_PPPTALK = 0x10 ETH_P_PPP_DISC = 0x8863 ETH_P_PPP_MP = 0x8 ETH_P_PPP_SES = 0x8864 ETH_P_PUP = 0x200 ETH_P_PUPAT = 0x201 ETH_P_RARP = 0x8035 ETH_P_SCA = 0x6007 ETH_P_SLOW = 0x8809 ETH_P_SNAP = 0x5 ETH_P_TEB = 0x6558 ETH_P_TIPC = 0x88ca ETH_P_TRAILER = 0x1c ETH_P_TR_802_2 = 0x11 ETH_P_WAN_PPP = 0x7 ETH_P_WCCP = 0x883e ETH_P_X25 = 0x805 FD_CLOEXEC = 0x1 FD_SETSIZE = 0x400 F_DUPFD = 0x0 F_DUPFD_CLOEXEC = 0x406 F_EXLCK = 0x4 F_GETFD = 0x1 F_GETFL = 0x3 F_GETLEASE = 0x401 F_GETLK = 0x5 F_GETLK64 = 0x5 F_GETOWN = 0x9 F_GETOWN_EX = 0x10 F_GETPIPE_SZ = 0x408 F_GETSIG = 0xb F_LOCK = 0x1 F_NOTIFY = 0x402 F_OK = 0x0 F_RDLCK = 0x0 F_SETFD = 0x2 F_SETFL = 0x4 F_SETLEASE = 0x400 F_SETLK = 0x6 F_SETLK64 = 0x6 F_SETLKW = 0x7 F_SETLKW64 = 0x7 F_SETOWN = 0x8 F_SETOWN_EX = 0xf F_SETPIPE_SZ = 0x407 F_SETSIG = 0xa F_SHLCK = 0x8 F_TEST = 0x3 F_TLOCK = 0x2 F_ULOCK = 0x0 F_UNLCK = 0x2 F_WRLCK = 0x1 ICMPV6_FILTER = 0x1 IFA_F_DADFAILED = 0x8 IFA_F_DEPRECATED = 0x20 IFA_F_HOMEADDRESS = 0x10 IFA_F_NODAD = 0x2 IFA_F_OPTIMISTIC = 0x4 IFA_F_PERMANENT = 0x80 IFA_F_SECONDARY = 0x1 IFA_F_TEMPORARY = 0x1 IFA_F_TENTATIVE = 0x40 IFA_MAX = 0x7 IFF_ALLMULTI = 0x200 IFF_AUTOMEDIA = 0x4000 IFF_BROADCAST = 0x2 IFF_DEBUG = 0x4 IFF_DYNAMIC = 0x8000 IFF_LOOPBACK = 0x8 IFF_MASTER = 0x400 IFF_MULTICAST = 0x1000 IFF_NOARP = 0x80 IFF_NOTRAILERS = 0x20 IFF_NO_PI = 0x1000 IFF_ONE_QUEUE = 0x2000 IFF_POINTOPOINT = 0x10 IFF_PORTSEL = 0x2000 IFF_PROMISC = 0x100 IFF_RUNNING = 0x40 IFF_SLAVE = 0x800 IFF_TAP = 0x2 IFF_TUN = 0x1 IFF_TUN_EXCL = 0x8000 IFF_UP = 0x1 IFF_VNET_HDR = 0x4000 IFNAMSIZ = 0x10 IN_ACCESS = 0x1 IN_ALL_EVENTS = 0xfff IN_ATTRIB = 0x4 IN_CLASSA_HOST = 0xffffff IN_CLASSA_MAX = 0x80 IN_CLASSA_NET = 0xff000000 IN_CLASSA_NSHIFT = 0x18 IN_CLASSB_HOST = 0xffff IN_CLASSB_MAX = 0x10000 IN_CLASSB_NET = 0xffff0000 IN_CLASSB_NSHIFT = 0x10 IN_CLASSC_HOST = 0xff IN_CLASSC_NET = 0xffffff00 IN_CLASSC_NSHIFT = 0x8 IN_CLOEXEC = 0x80000 IN_CLOSE = 0x18 IN_CLOSE_NOWRITE = 0x10 IN_CLOSE_WRITE = 0x8 IN_CREATE = 0x100 IN_DELETE = 0x200 IN_DELETE_SELF = 0x400 IN_DONT_FOLLOW = 0x2000000 IN_EXCL_UNLINK = 0x4000000 IN_IGNORED = 0x8000 IN_ISDIR = 0x40000000 IN_LOOPBACKNET = 0x7f IN_MASK_ADD = 0x20000000 IN_MODIFY = 0x2 IN_MOVE = 0xc0 IN_MOVED_FROM = 0x40 IN_MOVED_TO = 0x80 IN_MOVE_SELF = 0x800 IN_NONBLOCK = 0x800 IN_ONESHOT = 0x80000000 IN_ONLYDIR = 0x1000000 IN_OPEN = 0x20 IN_Q_OVERFLOW = 0x4000 IN_UNMOUNT = 0x2000 IPPROTO_AH = 0x33 IPPROTO_COMP = 0x6c IPPROTO_DCCP = 0x21 IPPROTO_DSTOPTS = 0x3c IPPROTO_EGP = 0x8 IPPROTO_ENCAP = 0x62 IPPROTO_ESP = 0x32 IPPROTO_FRAGMENT = 0x2c IPPROTO_GRE = 0x2f IPPROTO_HOPOPTS = 0x0 IPPROTO_ICMP = 0x1 IPPROTO_ICMPV6 = 0x3a IPPROTO_IDP = 0x16 IPPROTO_IGMP = 0x2 IPPROTO_IP = 0x0 IPPROTO_IPIP = 0x4 IPPROTO_IPV6 = 0x29 IPPROTO_MTP = 0x5c IPPROTO_NONE = 0x3b IPPROTO_PIM = 0x67 IPPROTO_PUP = 0xc IPPROTO_RAW = 0xff IPPROTO_ROUTING = 0x2b IPPROTO_RSVP = 0x2e IPPROTO_SCTP = 0x84 IPPROTO_TCP = 0x6 IPPROTO_TP = 0x1d IPPROTO_UDP = 0x11 IPPROTO_UDPLITE = 0x88 IPV6_2292DSTOPTS = 0x4 IPV6_2292HOPLIMIT = 0x8 IPV6_2292HOPOPTS = 0x3 IPV6_2292PKTINFO = 0x2 IPV6_2292PKTOPTIONS = 0x6 IPV6_2292RTHDR = 0x5 IPV6_ADDRFORM = 0x1 
IPV6_ADD_MEMBERSHIP = 0x14 IPV6_AUTHHDR = 0xa IPV6_CHECKSUM = 0x7 IPV6_DROP_MEMBERSHIP = 0x15 IPV6_DSTOPTS = 0x3b IPV6_HOPLIMIT = 0x34 IPV6_HOPOPTS = 0x36 IPV6_IPSEC_POLICY = 0x22 IPV6_JOIN_ANYCAST = 0x1b IPV6_JOIN_GROUP = 0x14 IPV6_LEAVE_ANYCAST = 0x1c IPV6_LEAVE_GROUP = 0x15 IPV6_MTU = 0x18 IPV6_MTU_DISCOVER = 0x17 IPV6_MULTICAST_HOPS = 0x12 IPV6_MULTICAST_IF = 0x11 IPV6_MULTICAST_LOOP = 0x13 IPV6_NEXTHOP = 0x9 IPV6_PKTINFO = 0x32 IPV6_PMTUDISC_DO = 0x2 IPV6_PMTUDISC_DONT = 0x0 IPV6_PMTUDISC_PROBE = 0x3 IPV6_PMTUDISC_WANT = 0x1 IPV6_RECVDSTOPTS = 0x3a IPV6_RECVERR = 0x19 IPV6_RECVHOPLIMIT = 0x33 IPV6_RECVHOPOPTS = 0x35 IPV6_RECVPKTINFO = 0x31 IPV6_RECVRTHDR = 0x38 IPV6_RECVTCLASS = 0x42 IPV6_ROUTER_ALERT = 0x16 IPV6_RTHDR = 0x39 IPV6_RTHDRDSTOPTS = 0x37 IPV6_RTHDR_LOOSE = 0x0 IPV6_RTHDR_STRICT = 0x1 IPV6_RTHDR_TYPE_0 = 0x0 IPV6_RXDSTOPTS = 0x3b IPV6_RXHOPOPTS = 0x36 IPV6_TCLASS = 0x43 IPV6_UNICAST_HOPS = 0x10 IPV6_V6ONLY = 0x1a IPV6_XFRM_POLICY = 0x23 IP_ADD_MEMBERSHIP = 0x23 IP_ADD_SOURCE_MEMBERSHIP = 0x27 IP_BLOCK_SOURCE = 0x26 IP_DEFAULT_MULTICAST_LOOP = 0x1 IP_DEFAULT_MULTICAST_TTL = 0x1 IP_DF = 0x4000 IP_DROP_MEMBERSHIP = 0x24 IP_DROP_SOURCE_MEMBERSHIP = 0x28 IP_FREEBIND = 0xf IP_HDRINCL = 0x3 IP_IPSEC_POLICY = 0x10 IP_MAXPACKET = 0xffff IP_MAX_MEMBERSHIPS = 0x14 IP_MF = 0x2000 IP_MINTTL = 0x15 IP_MSFILTER = 0x29 IP_MSS = 0x240 IP_MTU = 0xe IP_MTU_DISCOVER = 0xa IP_MULTICAST_IF = 0x20 IP_MULTICAST_LOOP = 0x22 IP_MULTICAST_TTL = 0x21 IP_OFFMASK = 0x1fff IP_OPTIONS = 0x4 IP_ORIGDSTADDR = 0x14 IP_PASSSEC = 0x12 IP_PKTINFO = 0x8 IP_PKTOPTIONS = 0x9 IP_PMTUDISC = 0xa IP_PMTUDISC_DO = 0x2 IP_PMTUDISC_DONT = 0x0 IP_PMTUDISC_PROBE = 0x3 IP_PMTUDISC_WANT = 0x1 IP_RECVERR = 0xb IP_RECVOPTS = 0x6 IP_RECVORIGDSTADDR = 0x14 IP_RECVRETOPTS = 0x7 IP_RECVTOS = 0xd IP_RECVTTL = 0xc IP_RETOPTS = 0x7 IP_RF = 0x8000 IP_ROUTER_ALERT = 0x5 IP_TOS = 0x1 IP_TRANSPARENT = 0x13 IP_TTL = 0x2 IP_UNBLOCK_SOURCE = 0x25 IP_XFRM_POLICY = 0x11 LINUX_REBOOT_CMD_CAD_OFF = 0x0 LINUX_REBOOT_CMD_CAD_ON = 0x89abcdef LINUX_REBOOT_CMD_HALT = 0xcdef0123 LINUX_REBOOT_CMD_KEXEC = 0x45584543 LINUX_REBOOT_CMD_POWER_OFF = 0x4321fedc LINUX_REBOOT_CMD_RESTART = 0x1234567 LINUX_REBOOT_CMD_RESTART2 = 0xa1b2c3d4 LINUX_REBOOT_CMD_SW_SUSPEND = 0xd000fce2 LINUX_REBOOT_MAGIC1 = 0xfee1dead LINUX_REBOOT_MAGIC2 = 0x28121969 LOCK_EX = 0x2 LOCK_NB = 0x4 LOCK_SH = 0x1 LOCK_UN = 0x8 MADV_DOFORK = 0xb MADV_DONTFORK = 0xa MADV_DONTNEED = 0x4 MADV_HUGEPAGE = 0xe MADV_HWPOISON = 0x64 MADV_MERGEABLE = 0xc MADV_NOHUGEPAGE = 0xf MADV_NORMAL = 0x0 MADV_RANDOM = 0x1 MADV_REMOVE = 0x9 MADV_SEQUENTIAL = 0x2 MADV_UNMERGEABLE = 0xd MADV_WILLNEED = 0x3 MAP_32BIT = 0x40 MAP_ANON = 0x20 MAP_ANONYMOUS = 0x20 MAP_DENYWRITE = 0x800 MAP_EXECUTABLE = 0x1000 MAP_FILE = 0x0 MAP_FIXED = 0x10 MAP_GROWSDOWN = 0x100 MAP_HUGETLB = 0x40000 MAP_LOCKED = 0x2000 MAP_NONBLOCK = 0x10000 MAP_NORESERVE = 0x4000 MAP_POPULATE = 0x8000 MAP_PRIVATE = 0x2 MAP_SHARED = 0x1 MAP_STACK = 0x20000 MAP_TYPE = 0xf MCL_CURRENT = 0x1 MCL_FUTURE = 0x2 MNT_DETACH = 0x2 MNT_EXPIRE = 0x4 MNT_FORCE = 0x1 MSG_CMSG_CLOEXEC = 0x40000000 MSG_CONFIRM = 0x800 MSG_CTRUNC = 0x8 MSG_DONTROUTE = 0x4 MSG_DONTWAIT = 0x40 MSG_EOR = 0x80 MSG_ERRQUEUE = 0x2000 MSG_FASTOPEN = 0x20000000 MSG_FIN = 0x200 MSG_MORE = 0x8000 MSG_NOSIGNAL = 0x4000 MSG_OOB = 0x1 MSG_PEEK = 0x2 MSG_PROXY = 0x10 MSG_RST = 0x1000 MSG_SYN = 0x400 MSG_TRUNC = 0x20 MSG_TRYHARD = 0x4 MSG_WAITALL = 0x100 MSG_WAITFORONE = 0x10000 MS_ACTIVE = 0x40000000 MS_ASYNC = 0x1 MS_BIND = 0x1000 MS_DIRSYNC = 0x80 MS_INVALIDATE = 0x2 MS_I_VERSION = 0x800000 
MS_KERNMOUNT = 0x400000 MS_MANDLOCK = 0x40 MS_MGC_MSK = 0xffff0000 MS_MGC_VAL = 0xc0ed0000 MS_MOVE = 0x2000 MS_NOATIME = 0x400 MS_NODEV = 0x4 MS_NODIRATIME = 0x800 MS_NOEXEC = 0x8 MS_NOSUID = 0x2 MS_NOUSER = -0x80000000 MS_POSIXACL = 0x10000 MS_PRIVATE = 0x40000 MS_RDONLY = 0x1 MS_REC = 0x4000 MS_RELATIME = 0x200000 MS_REMOUNT = 0x20 MS_RMT_MASK = 0x800051 MS_SHARED = 0x100000 MS_SILENT = 0x8000 MS_SLAVE = 0x80000 MS_STRICTATIME = 0x1000000 MS_SYNC = 0x4 MS_SYNCHRONOUS = 0x10 MS_UNBINDABLE = 0x20000 NAME_MAX = 0xff NETLINK_ADD_MEMBERSHIP = 0x1 NETLINK_AUDIT = 0x9 NETLINK_BROADCAST_ERROR = 0x4 NETLINK_CONNECTOR = 0xb NETLINK_DNRTMSG = 0xe NETLINK_DROP_MEMBERSHIP = 0x2 NETLINK_ECRYPTFS = 0x13 NETLINK_FIB_LOOKUP = 0xa NETLINK_FIREWALL = 0x3 NETLINK_GENERIC = 0x10 NETLINK_INET_DIAG = 0x4 NETLINK_IP6_FW = 0xd NETLINK_ISCSI = 0x8 NETLINK_KOBJECT_UEVENT = 0xf NETLINK_NETFILTER = 0xc NETLINK_NFLOG = 0x5 NETLINK_NO_ENOBUFS = 0x5 NETLINK_PKTINFO = 0x3 NETLINK_ROUTE = 0x0 NETLINK_SCSITRANSPORT = 0x12 NETLINK_SELINUX = 0x7 NETLINK_UNUSED = 0x1 NETLINK_USERSOCK = 0x2 NETLINK_XFRM = 0x6 NLA_ALIGNTO = 0x4 NLA_F_NESTED = 0x8000 NLA_F_NET_BYTEORDER = 0x4000 NLA_HDRLEN = 0x4 NLMSG_ALIGNTO = 0x4 NLMSG_DONE = 0x3 NLMSG_ERROR = 0x2 NLMSG_HDRLEN = 0x10 NLMSG_MIN_TYPE = 0x10 NLMSG_NOOP = 0x1 NLMSG_OVERRUN = 0x4 NLM_F_ACK = 0x4 NLM_F_APPEND = 0x800 NLM_F_ATOMIC = 0x400 NLM_F_CREATE = 0x400 NLM_F_DUMP = 0x300 NLM_F_ECHO = 0x8 NLM_F_EXCL = 0x200 NLM_F_MATCH = 0x200 NLM_F_MULTI = 0x2 NLM_F_REPLACE = 0x100 NLM_F_REQUEST = 0x1 NLM_F_ROOT = 0x100 O_ACCMODE = 0x3 O_APPEND = 0x400 O_ASYNC = 0x2000 O_CLOEXEC = 0x80000 O_CREAT = 0x40 O_DIRECT = 0x4000 O_DIRECTORY = 0x10000 O_DSYNC = 0x1000 O_EXCL = 0x80 O_FSYNC = 0x101000 O_LARGEFILE = 0x0 O_NDELAY = 0x800 O_NOATIME = 0x40000 O_NOCTTY = 0x100 O_NOFOLLOW = 0x20000 O_NONBLOCK = 0x800 O_RDONLY = 0x0 O_RDWR = 0x2 O_RSYNC = 0x101000 O_SYNC = 0x101000 O_TRUNC = 0x200 O_WRONLY = 0x1 PACKET_ADD_MEMBERSHIP = 0x1 PACKET_BROADCAST = 0x1 PACKET_DROP_MEMBERSHIP = 0x2 PACKET_FASTROUTE = 0x6 PACKET_HOST = 0x0 PACKET_LOOPBACK = 0x5 PACKET_MR_ALLMULTI = 0x2 PACKET_MR_MULTICAST = 0x0 PACKET_MR_PROMISC = 0x1 PACKET_MULTICAST = 0x2 PACKET_OTHERHOST = 0x3 PACKET_OUTGOING = 0x4 PACKET_RECV_OUTPUT = 0x3 PACKET_RX_RING = 0x5 PACKET_STATISTICS = 0x6 PRIO_PGRP = 0x1 PRIO_PROCESS = 0x0 PRIO_USER = 0x2 PROT_EXEC = 0x4 PROT_GROWSDOWN = 0x1000000 PROT_GROWSUP = 0x2000000 PROT_NONE = 0x0 PROT_READ = 0x1 PROT_WRITE = 0x2 PR_CAPBSET_DROP = 0x18 PR_CAPBSET_READ = 0x17 PR_ENDIAN_BIG = 0x0 PR_ENDIAN_LITTLE = 0x1 PR_ENDIAN_PPC_LITTLE = 0x2 PR_FPEMU_NOPRINT = 0x1 PR_FPEMU_SIGFPE = 0x2 PR_FP_EXC_ASYNC = 0x2 PR_FP_EXC_DISABLED = 0x0 PR_FP_EXC_DIV = 0x10000 PR_FP_EXC_INV = 0x100000 PR_FP_EXC_NONRECOV = 0x1 PR_FP_EXC_OVF = 0x20000 PR_FP_EXC_PRECISE = 0x3 PR_FP_EXC_RES = 0x80000 PR_FP_EXC_SW_ENABLE = 0x80 PR_FP_EXC_UND = 0x40000 PR_GET_DUMPABLE = 0x3 PR_GET_ENDIAN = 0x13 PR_GET_FPEMU = 0x9 PR_GET_FPEXC = 0xb PR_GET_KEEPCAPS = 0x7 PR_GET_NAME = 0x10 PR_GET_PDEATHSIG = 0x2 PR_GET_SECCOMP = 0x15 PR_GET_SECUREBITS = 0x1b PR_GET_TIMERSLACK = 0x1e PR_GET_TIMING = 0xd PR_GET_TSC = 0x19 PR_GET_UNALIGN = 0x5 PR_MCE_KILL = 0x21 PR_MCE_KILL_CLEAR = 0x0 PR_MCE_KILL_DEFAULT = 0x2 PR_MCE_KILL_EARLY = 0x1 PR_MCE_KILL_GET = 0x22 PR_MCE_KILL_LATE = 0x0 PR_MCE_KILL_SET = 0x1 PR_SET_DUMPABLE = 0x4 PR_SET_ENDIAN = 0x14 PR_SET_FPEMU = 0xa PR_SET_FPEXC = 0xc PR_SET_KEEPCAPS = 0x8 PR_SET_NAME = 0xf PR_SET_PDEATHSIG = 0x1 PR_SET_PTRACER = 0x59616d61 PR_SET_SECCOMP = 0x16 PR_SET_SECUREBITS = 0x1c PR_SET_TIMERSLACK = 0x1d PR_SET_TIMING = 
0xe PR_SET_TSC = 0x1a PR_SET_UNALIGN = 0x6 PR_TASK_PERF_EVENTS_DISABLE = 0x1f PR_TASK_PERF_EVENTS_ENABLE = 0x20 PR_TIMING_STATISTICAL = 0x0 PR_TIMING_TIMESTAMP = 0x1 PR_TSC_ENABLE = 0x1 PR_TSC_SIGSEGV = 0x2 PR_UNALIGN_NOPRINT = 0x1 PR_UNALIGN_SIGBUS = 0x2 PTRACE_ARCH_PRCTL = 0x1e PTRACE_ATTACH = 0x10 PTRACE_CONT = 0x7 PTRACE_DETACH = 0x11 PTRACE_EVENT_CLONE = 0x3 PTRACE_EVENT_EXEC = 0x4 PTRACE_EVENT_EXIT = 0x6 PTRACE_EVENT_FORK = 0x1 PTRACE_EVENT_VFORK = 0x2 PTRACE_EVENT_VFORK_DONE = 0x5 PTRACE_GETEVENTMSG = 0x4201 PTRACE_GETFPREGS = 0xe PTRACE_GETFPXREGS = 0x12 PTRACE_GETREGS = 0xc PTRACE_GETREGSET = 0x4204 PTRACE_GETSIGINFO = 0x4202 PTRACE_GET_THREAD_AREA = 0x19 PTRACE_KILL = 0x8 PTRACE_OLDSETOPTIONS = 0x15 PTRACE_O_MASK = 0x7f PTRACE_O_TRACECLONE = 0x8 PTRACE_O_TRACEEXEC = 0x10 PTRACE_O_TRACEEXIT = 0x40 PTRACE_O_TRACEFORK = 0x2 PTRACE_O_TRACESYSGOOD = 0x1 PTRACE_O_TRACEVFORK = 0x4 PTRACE_O_TRACEVFORKDONE = 0x20 PTRACE_PEEKDATA = 0x2 PTRACE_PEEKTEXT = 0x1 PTRACE_PEEKUSR = 0x3 PTRACE_POKEDATA = 0x5 PTRACE_POKETEXT = 0x4 PTRACE_POKEUSR = 0x6 PTRACE_SETFPREGS = 0xf PTRACE_SETFPXREGS = 0x13 PTRACE_SETOPTIONS = 0x4200 PTRACE_SETREGS = 0xd PTRACE_SETREGSET = 0x4205 PTRACE_SETSIGINFO = 0x4203 PTRACE_SET_THREAD_AREA = 0x1a PTRACE_SINGLEBLOCK = 0x21 PTRACE_SINGLESTEP = 0x9 PTRACE_SYSCALL = 0x18 PTRACE_SYSEMU = 0x1f PTRACE_SYSEMU_SINGLESTEP = 0x20 PTRACE_TRACEME = 0x0 RLIMIT_AS = 0x9 RLIMIT_CORE = 0x4 RLIMIT_CPU = 0x0 RLIMIT_DATA = 0x2 RLIMIT_FSIZE = 0x1 RLIMIT_NOFILE = 0x7 RLIMIT_STACK = 0x3 RLIM_INFINITY = -0x1 RTAX_ADVMSS = 0x8 RTAX_CWND = 0x7 RTAX_FEATURES = 0xc RTAX_FEATURE_ALLFRAG = 0x8 RTAX_FEATURE_ECN = 0x1 RTAX_FEATURE_SACK = 0x2 RTAX_FEATURE_TIMESTAMP = 0x4 RTAX_HOPLIMIT = 0xa RTAX_INITCWND = 0xb RTAX_INITRWND = 0xe RTAX_LOCK = 0x1 RTAX_MAX = 0xe RTAX_MTU = 0x2 RTAX_REORDERING = 0x9 RTAX_RTO_MIN = 0xd RTAX_RTT = 0x4 RTAX_RTTVAR = 0x5 RTAX_SSTHRESH = 0x6 RTAX_UNSPEC = 0x0 RTAX_WINDOW = 0x3 RTA_ALIGNTO = 0x4 RTA_MAX = 0x10 RTCF_DIRECTSRC = 0x4000000 RTCF_DOREDIRECT = 0x1000000 RTCF_LOG = 0x2000000 RTCF_MASQ = 0x400000 RTCF_NAT = 0x800000 RTCF_VALVE = 0x200000 RTF_ADDRCLASSMASK = 0xf8000000 RTF_ADDRCONF = 0x40000 RTF_ALLONLINK = 0x20000 RTF_BROADCAST = 0x10000000 RTF_CACHE = 0x1000000 RTF_DEFAULT = 0x10000 RTF_DYNAMIC = 0x10 RTF_FLOW = 0x2000000 RTF_GATEWAY = 0x2 RTF_HOST = 0x4 RTF_INTERFACE = 0x40000000 RTF_IRTT = 0x100 RTF_LINKRT = 0x100000 RTF_LOCAL = 0x80000000 RTF_MODIFIED = 0x20 RTF_MSS = 0x40 RTF_MTU = 0x40 RTF_MULTICAST = 0x20000000 RTF_NAT = 0x8000000 RTF_NOFORWARD = 0x1000 RTF_NONEXTHOP = 0x200000 RTF_NOPMTUDISC = 0x4000 RTF_POLICY = 0x4000000 RTF_REINSTATE = 0x8 RTF_REJECT = 0x200 RTF_STATIC = 0x400 RTF_THROW = 0x2000 RTF_UP = 0x1 RTF_WINDOW = 0x80 RTF_XRESOLVE = 0x800 RTM_BASE = 0x10 RTM_DELACTION = 0x31 RTM_DELADDR = 0x15 RTM_DELADDRLABEL = 0x49 RTM_DELLINK = 0x11 RTM_DELNEIGH = 0x1d RTM_DELQDISC = 0x25 RTM_DELROUTE = 0x19 RTM_DELRULE = 0x21 RTM_DELTCLASS = 0x29 RTM_DELTFILTER = 0x2d RTM_F_CLONED = 0x200 RTM_F_EQUALIZE = 0x400 RTM_F_NOTIFY = 0x100 RTM_F_PREFIX = 0x800 RTM_GETACTION = 0x32 RTM_GETADDR = 0x16 RTM_GETADDRLABEL = 0x4a RTM_GETANYCAST = 0x3e RTM_GETDCB = 0x4e RTM_GETLINK = 0x12 RTM_GETMULTICAST = 0x3a RTM_GETNEIGH = 0x1e RTM_GETNEIGHTBL = 0x42 RTM_GETQDISC = 0x26 RTM_GETROUTE = 0x1a RTM_GETRULE = 0x22 RTM_GETTCLASS = 0x2a RTM_GETTFILTER = 0x2e RTM_MAX = 0x4f RTM_NEWACTION = 0x30 RTM_NEWADDR = 0x14 RTM_NEWADDRLABEL = 0x48 RTM_NEWLINK = 0x10 RTM_NEWNDUSEROPT = 0x44 RTM_NEWNEIGH = 0x1c RTM_NEWNEIGHTBL = 0x40 RTM_NEWPREFIX = 0x34 RTM_NEWQDISC = 0x24 RTM_NEWROUTE = 0x18 
RTM_NEWRULE = 0x20 RTM_NEWTCLASS = 0x28 RTM_NEWTFILTER = 0x2c RTM_NR_FAMILIES = 0x10 RTM_NR_MSGTYPES = 0x40 RTM_SETDCB = 0x4f RTM_SETLINK = 0x13 RTM_SETNEIGHTBL = 0x43 RTNH_ALIGNTO = 0x4 RTNH_F_DEAD = 0x1 RTNH_F_ONLINK = 0x4 RTNH_F_PERVASIVE = 0x2 RTN_MAX = 0xb RTPROT_BIRD = 0xc RTPROT_BOOT = 0x3 RTPROT_DHCP = 0x10 RTPROT_DNROUTED = 0xd RTPROT_GATED = 0x8 RTPROT_KERNEL = 0x2 RTPROT_MRT = 0xa RTPROT_NTK = 0xf RTPROT_RA = 0x9 RTPROT_REDIRECT = 0x1 RTPROT_STATIC = 0x4 RTPROT_UNSPEC = 0x0 RTPROT_XORP = 0xe RTPROT_ZEBRA = 0xb RT_CLASS_DEFAULT = 0xfd RT_CLASS_LOCAL = 0xff RT_CLASS_MAIN = 0xfe RT_CLASS_MAX = 0xff RT_CLASS_UNSPEC = 0x0 RUSAGE_CHILDREN = -0x1 RUSAGE_SELF = 0x0 RUSAGE_THREAD = 0x1 SCM_CREDENTIALS = 0x2 SCM_RIGHTS = 0x1 SCM_TIMESTAMP = 0x1d SCM_TIMESTAMPING = 0x25 SCM_TIMESTAMPNS = 0x23 SHUT_RD = 0x0 SHUT_RDWR = 0x2 SHUT_WR = 0x1 SIOCADDDLCI = 0x8980 SIOCADDMULTI = 0x8931 SIOCADDRT = 0x890b SIOCATMARK = 0x8905 SIOCDARP = 0x8953 SIOCDELDLCI = 0x8981 SIOCDELMULTI = 0x8932 SIOCDELRT = 0x890c SIOCDEVPRIVATE = 0x89f0 SIOCDIFADDR = 0x8936 SIOCDRARP = 0x8960 SIOCGARP = 0x8954 SIOCGIFADDR = 0x8915 SIOCGIFBR = 0x8940 SIOCGIFBRDADDR = 0x8919 SIOCGIFCONF = 0x8912 SIOCGIFCOUNT = 0x8938 SIOCGIFDSTADDR = 0x8917 SIOCGIFENCAP = 0x8925 SIOCGIFFLAGS = 0x8913 SIOCGIFHWADDR = 0x8927 SIOCGIFINDEX = 0x8933 SIOCGIFMAP = 0x8970 SIOCGIFMEM = 0x891f SIOCGIFMETRIC = 0x891d SIOCGIFMTU = 0x8921 SIOCGIFNAME = 0x8910 SIOCGIFNETMASK = 0x891b SIOCGIFPFLAGS = 0x8935 SIOCGIFSLAVE = 0x8929 SIOCGIFTXQLEN = 0x8942 SIOCGPGRP = 0x8904 SIOCGRARP = 0x8961 SIOCGSTAMP = 0x8906 SIOCGSTAMPNS = 0x8907 SIOCPROTOPRIVATE = 0x89e0 SIOCRTMSG = 0x890d SIOCSARP = 0x8955 SIOCSIFADDR = 0x8916 SIOCSIFBR = 0x8941 SIOCSIFBRDADDR = 0x891a SIOCSIFDSTADDR = 0x8918 SIOCSIFENCAP = 0x8926 SIOCSIFFLAGS = 0x8914 SIOCSIFHWADDR = 0x8924 SIOCSIFHWBROADCAST = 0x8937 SIOCSIFLINK = 0x8911 SIOCSIFMAP = 0x8971 SIOCSIFMEM = 0x8920 SIOCSIFMETRIC = 0x891e SIOCSIFMTU = 0x8922 SIOCSIFNAME = 0x8923 SIOCSIFNETMASK = 0x891c SIOCSIFPFLAGS = 0x8934 SIOCSIFSLAVE = 0x8930 SIOCSIFTXQLEN = 0x8943 SIOCSPGRP = 0x8902 SIOCSRARP = 0x8962 SOCK_CLOEXEC = 0x80000 SOCK_DCCP = 0x6 SOCK_DGRAM = 0x2 SOCK_NONBLOCK = 0x800 SOCK_PACKET = 0xa SOCK_RAW = 0x3 SOCK_RDM = 0x4 SOCK_SEQPACKET = 0x5 SOCK_STREAM = 0x1 SOL_AAL = 0x109 SOL_ATM = 0x108 SOL_DECNET = 0x105 SOL_ICMPV6 = 0x3a SOL_IP = 0x0 SOL_IPV6 = 0x29 SOL_IRDA = 0x10a SOL_PACKET = 0x107 SOL_RAW = 0xff SOL_SOCKET = 0x1 SOL_TCP = 0x6 SOL_X25 = 0x106 SOMAXCONN = 0x80 SO_ACCEPTCONN = 0x1e SO_ATTACH_FILTER = 0x1a SO_BINDTODEVICE = 0x19 SO_BROADCAST = 0x6 SO_BSDCOMPAT = 0xe SO_DEBUG = 0x1 SO_DETACH_FILTER = 0x1b SO_DOMAIN = 0x27 SO_DONTROUTE = 0x5 SO_ERROR = 0x4 SO_KEEPALIVE = 0x9 SO_LINGER = 0xd SO_MARK = 0x24 SO_NO_CHECK = 0xb SO_OOBINLINE = 0xa SO_PASSCRED = 0x10 SO_PASSSEC = 0x22 SO_PEERCRED = 0x11 SO_PEERNAME = 0x1c SO_PEERSEC = 0x1f SO_PRIORITY = 0xc SO_PROTOCOL = 0x26 SO_RCVBUF = 0x8 SO_RCVBUFFORCE = 0x21 SO_RCVLOWAT = 0x12 SO_RCVTIMEO = 0x14 SO_REUSEADDR = 0x2 SO_RXQ_OVFL = 0x28 SO_SECURITY_AUTHENTICATION = 0x16 SO_SECURITY_ENCRYPTION_NETWORK = 0x18 SO_SECURITY_ENCRYPTION_TRANSPORT = 0x17 SO_SNDBUF = 0x7 SO_SNDBUFFORCE = 0x20 SO_SNDLOWAT = 0x13 SO_SNDTIMEO = 0x15 SO_TIMESTAMP = 0x1d SO_TIMESTAMPING = 0x25 SO_TIMESTAMPNS = 0x23 SO_TYPE = 0x3 S_BLKSIZE = 0x200 S_IEXEC = 0x40 S_IFBLK = 0x6000 S_IFCHR = 0x2000 S_IFDIR = 0x4000 S_IFIFO = 0x1000 S_IFLNK = 0xa000 S_IFMT = 0xf000 S_IFREG = 0x8000 S_IFSOCK = 0xc000 S_IREAD = 0x100 S_IRGRP = 0x20 S_IROTH = 0x4 S_IRUSR = 0x100 S_IRWXG = 0x38 S_IRWXO = 0x7 S_IRWXU = 0x1c0 S_ISGID = 0x400 
S_ISUID = 0x800 S_ISVTX = 0x200 S_IWGRP = 0x10 S_IWOTH = 0x2 S_IWRITE = 0x80 S_IWUSR = 0x80 S_IXGRP = 0x8 S_IXOTH = 0x1 S_IXUSR = 0x40 TCIFLUSH = 0x0 TCIOFLUSH = 0x2 TCOFLUSH = 0x1 TCP_CONGESTION = 0xd TCP_CORK = 0x3 TCP_DEFER_ACCEPT = 0x9 TCP_INFO = 0xb TCP_KEEPCNT = 0x6 TCP_KEEPIDLE = 0x4 TCP_KEEPINTVL = 0x5 TCP_LINGER2 = 0x8 TCP_MAXSEG = 0x2 TCP_MAXWIN = 0xffff TCP_MAX_WINSHIFT = 0xe TCP_MD5SIG = 0xe TCP_MD5SIG_MAXKEYLEN = 0x50 TCP_MSS = 0x200 TCP_NODELAY = 0x1 TCP_QUICKACK = 0xc TCP_SYNCNT = 0x7 TCP_WINDOW_CLAMP = 0xa TIOCCBRK = 0x5428 TIOCCONS = 0x541d TIOCEXCL = 0x540c TIOCGDEV = 0x80045432 TIOCGETD = 0x5424 TIOCGICOUNT = 0x545d TIOCGLCKTRMIOS = 0x5456 TIOCGPGRP = 0x540f TIOCGPTN = 0x80045430 TIOCGRS485 = 0x542e TIOCGSERIAL = 0x541e TIOCGSID = 0x5429 TIOCGSOFTCAR = 0x5419 TIOCGWINSZ = 0x5413 TIOCINQ = 0x541b TIOCLINUX = 0x541c TIOCMBIC = 0x5417 TIOCMBIS = 0x5416 TIOCMGET = 0x5415 TIOCMIWAIT = 0x545c TIOCMSET = 0x5418 TIOCM_CAR = 0x40 TIOCM_CD = 0x40 TIOCM_CTS = 0x20 TIOCM_DSR = 0x100 TIOCM_DTR = 0x2 TIOCM_LE = 0x1 TIOCM_RI = 0x80 TIOCM_RNG = 0x80 TIOCM_RTS = 0x4 TIOCM_SR = 0x10 TIOCM_ST = 0x8 TIOCNOTTY = 0x5422 TIOCNXCL = 0x540d TIOCOUTQ = 0x5411 TIOCPKT = 0x5420 TIOCPKT_DATA = 0x0 TIOCPKT_DOSTOP = 0x20 TIOCPKT_FLUSHREAD = 0x1 TIOCPKT_FLUSHWRITE = 0x2 TIOCPKT_IOCTL = 0x40 TIOCPKT_NOSTOP = 0x10 TIOCPKT_START = 0x8 TIOCPKT_STOP = 0x4 TIOCSBRK = 0x5427 TIOCSCTTY = 0x540e TIOCSERCONFIG = 0x5453 TIOCSERGETLSR = 0x5459 TIOCSERGETMULTI = 0x545a TIOCSERGSTRUCT = 0x5458 TIOCSERGWILD = 0x5454 TIOCSERSETMULTI = 0x545b TIOCSERSWILD = 0x5455 TIOCSER_TEMT = 0x1 TIOCSETD = 0x5423 TIOCSIG = 0x40045436 TIOCSLCKTRMIOS = 0x5457 TIOCSPGRP = 0x5410 TIOCSPTLCK = 0x40045431 TIOCSRS485 = 0x542f TIOCSSERIAL = 0x541f TIOCSSOFTCAR = 0x541a TIOCSTI = 0x5412 TIOCSWINSZ = 0x5414 TUNATTACHFILTER = 0x401054d5 TUNDETACHFILTER = 0x401054d6 TUNGETFEATURES = 0x800454cf TUNGETIFF = 0x800454d2 TUNGETSNDBUF = 0x800454d3 TUNGETVNETHDRSZ = 0x800454d7 TUNSETDEBUG = 0x400454c9 TUNSETGROUP = 0x400454ce TUNSETIFF = 0x400454ca TUNSETLINK = 0x400454cd TUNSETNOCSUM = 0x400454c8 TUNSETOFFLOAD = 0x400454d0 TUNSETOWNER = 0x400454cc TUNSETPERSIST = 0x400454cb TUNSETSNDBUF = 0x400454d4 TUNSETTXFILTER = 0x400454d1 TUNSETVNETHDRSZ = 0x400454d8 WALL = 0x40000000 WCLONE = 0x80000000 WCONTINUED = 0x8 WEXITED = 0x4 WNOHANG = 0x1 WNOTHREAD = 0x20000000 WNOWAIT = 0x1000000 WORDSIZE = 0x40 WSTOPPED = 0x2 WUNTRACED = 0x2 ) ``` Errors ``` const ( E2BIG = Errno(0x7) EACCES = Errno(0xd) EADDRINUSE = Errno(0x62) EADDRNOTAVAIL = Errno(0x63) EADV = Errno(0x44) EAFNOSUPPORT = Errno(0x61) EAGAIN = Errno(0xb) EALREADY = Errno(0x72) EBADE = Errno(0x34) EBADF = Errno(0x9) EBADFD = Errno(0x4d) EBADMSG = Errno(0x4a) EBADR = Errno(0x35) EBADRQC = Errno(0x38) EBADSLT = Errno(0x39) EBFONT = Errno(0x3b) EBUSY = Errno(0x10) ECANCELED = Errno(0x7d) ECHILD = Errno(0xa) ECHRNG = Errno(0x2c) ECOMM = Errno(0x46) ECONNABORTED = Errno(0x67) ECONNREFUSED = Errno(0x6f) ECONNRESET = Errno(0x68) EDEADLK = Errno(0x23) EDEADLOCK = Errno(0x23) EDESTADDRREQ = Errno(0x59) EDOM = Errno(0x21) EDOTDOT = Errno(0x49) EDQUOT = Errno(0x7a) EEXIST = Errno(0x11) EFAULT = Errno(0xe) EFBIG = Errno(0x1b) EHOSTDOWN = Errno(0x70) EHOSTUNREACH = Errno(0x71) EIDRM = Errno(0x2b) EILSEQ = Errno(0x54) EINPROGRESS = Errno(0x73) EINTR = Errno(0x4) EINVAL = Errno(0x16) EIO = Errno(0x5) EISCONN = Errno(0x6a) EISDIR = Errno(0x15) EISNAM = Errno(0x78) EKEYEXPIRED = Errno(0x7f) EKEYREJECTED = Errno(0x81) EKEYREVOKED = Errno(0x80) EL2HLT = Errno(0x33) EL2NSYNC = Errno(0x2d) EL3HLT = Errno(0x2e) EL3RST = 
Errno(0x2f) ELIBACC = Errno(0x4f) ELIBBAD = Errno(0x50) ELIBEXEC = Errno(0x53) ELIBMAX = Errno(0x52) ELIBSCN = Errno(0x51) ELNRNG = Errno(0x30) ELOOP = Errno(0x28) EMEDIUMTYPE = Errno(0x7c) EMFILE = Errno(0x18) EMLINK = Errno(0x1f) EMSGSIZE = Errno(0x5a) EMULTIHOP = Errno(0x48) ENAMETOOLONG = Errno(0x24) ENAVAIL = Errno(0x77) ENETDOWN = Errno(0x64) ENETRESET = Errno(0x66) ENETUNREACH = Errno(0x65) ENFILE = Errno(0x17) ENOANO = Errno(0x37) ENOBUFS = Errno(0x69) ENOCSI = Errno(0x32) ENODATA = Errno(0x3d) ENODEV = Errno(0x13) ENOENT = Errno(0x2) ENOEXEC = Errno(0x8) ENOKEY = Errno(0x7e) ENOLCK = Errno(0x25) ENOLINK = Errno(0x43) ENOMEDIUM = Errno(0x7b) ENOMEM = Errno(0xc) ENOMSG = Errno(0x2a) ENONET = Errno(0x40) ENOPKG = Errno(0x41) ENOPROTOOPT = Errno(0x5c) ENOSPC = Errno(0x1c) ENOSR = Errno(0x3f) ENOSTR = Errno(0x3c) ENOSYS = Errno(0x26) ENOTBLK = Errno(0xf) ENOTCONN = Errno(0x6b) ENOTDIR = Errno(0x14) ENOTEMPTY = Errno(0x27) ENOTNAM = Errno(0x76) ENOTRECOVERABLE = Errno(0x83) ENOTSOCK = Errno(0x58) ENOTSUP = Errno(0x5f) ENOTTY = Errno(0x19) ENOTUNIQ = Errno(0x4c) ENXIO = Errno(0x6) EOPNOTSUPP = Errno(0x5f) EOVERFLOW = Errno(0x4b) EOWNERDEAD = Errno(0x82) EPERM = Errno(0x1) EPFNOSUPPORT = Errno(0x60) EPIPE = Errno(0x20) EPROTO = Errno(0x47) EPROTONOSUPPORT = Errno(0x5d) EPROTOTYPE = Errno(0x5b) ERANGE = Errno(0x22) EREMCHG = Errno(0x4e) EREMOTE = Errno(0x42) EREMOTEIO = Errno(0x79) ERESTART = Errno(0x55) ERFKILL = Errno(0x84) EROFS = Errno(0x1e) ESHUTDOWN = Errno(0x6c) ESOCKTNOSUPPORT = Errno(0x5e) ESPIPE = Errno(0x1d) ESRCH = Errno(0x3) ESRMNT = Errno(0x45) ESTALE = Errno(0x74) ESTRPIPE = Errno(0x56) ETIME = Errno(0x3e) ETIMEDOUT = Errno(0x6e) ETOOMANYREFS = Errno(0x6d) ETXTBSY = Errno(0x1a) EUCLEAN = Errno(0x75) EUNATCH = Errno(0x31) EUSERS = Errno(0x57) EWOULDBLOCK = Errno(0xb) EXDEV = Errno(0x12) EXFULL = Errno(0x36) ) ``` Signals ``` const ( SIGABRT = Signal(0x6) SIGALRM = Signal(0xe) SIGBUS = Signal(0x7) SIGCHLD = Signal(0x11) SIGCLD = Signal(0x11) SIGCONT = Signal(0x12) SIGFPE = Signal(0x8) SIGHUP = Signal(0x1) SIGILL = Signal(0x4) SIGINT = Signal(0x2) SIGIO = Signal(0x1d) SIGIOT = Signal(0x6) SIGKILL = Signal(0x9) SIGPIPE = Signal(0xd) SIGPOLL = Signal(0x1d) SIGPROF = Signal(0x1b) SIGPWR = Signal(0x1e) SIGQUIT = Signal(0x3) SIGSEGV = Signal(0xb) SIGSTKFLT = Signal(0x10) SIGSTOP = Signal(0x13) SIGSYS = Signal(0x1f) SIGTERM = Signal(0xf) SIGTRAP = Signal(0x5) SIGTSTP = Signal(0x14) SIGTTIN = Signal(0x15) SIGTTOU = Signal(0x16) SIGUNUSED = Signal(0x1f) SIGURG = Signal(0x17) SIGUSR1 = Signal(0xa) SIGUSR2 = Signal(0xc) SIGVTALRM = Signal(0x1a) SIGWINCH = Signal(0x1c) SIGXCPU = Signal(0x18) SIGXFSZ = Signal(0x19) ) ``` ``` const ( SYS_READ = 0 SYS_WRITE = 1 SYS_OPEN = 2 SYS_CLOSE = 3 SYS_STAT = 4 SYS_FSTAT = 5 SYS_LSTAT = 6 SYS_POLL = 7 SYS_LSEEK = 8 SYS_MMAP = 9 SYS_MPROTECT = 10 SYS_MUNMAP = 11 SYS_BRK = 12 SYS_RT_SIGACTION = 13 SYS_RT_SIGPROCMASK = 14 SYS_RT_SIGRETURN = 15 SYS_IOCTL = 16 SYS_PREAD64 = 17 SYS_PWRITE64 = 18 SYS_READV = 19 SYS_WRITEV = 20 SYS_ACCESS = 21 SYS_PIPE = 22 SYS_SELECT = 23 SYS_SCHED_YIELD = 24 SYS_MREMAP = 25 SYS_MSYNC = 26 SYS_MINCORE = 27 SYS_MADVISE = 28 SYS_SHMGET = 29 SYS_SHMAT = 30 SYS_SHMCTL = 31 SYS_DUP = 32 SYS_DUP2 = 33 SYS_PAUSE = 34 SYS_NANOSLEEP = 35 SYS_GETITIMER = 36 SYS_ALARM = 37 SYS_SETITIMER = 38 SYS_GETPID = 39 SYS_SENDFILE = 40 SYS_SOCKET = 41 SYS_CONNECT = 42 SYS_ACCEPT = 43 SYS_SENDTO = 44 SYS_RECVFROM = 45 SYS_SENDMSG = 46 SYS_RECVMSG = 47 SYS_SHUTDOWN = 48 SYS_BIND = 49 SYS_LISTEN = 50 SYS_GETSOCKNAME = 51 SYS_GETPEERNAME = 52 
SYS_SOCKETPAIR = 53 SYS_SETSOCKOPT = 54 SYS_GETSOCKOPT = 55 SYS_CLONE = 56 SYS_FORK = 57 SYS_VFORK = 58 SYS_EXECVE = 59 SYS_EXIT = 60 SYS_WAIT4 = 61 SYS_KILL = 62 SYS_UNAME = 63 SYS_SEMGET = 64 SYS_SEMOP = 65 SYS_SEMCTL = 66 SYS_SHMDT = 67 SYS_MSGGET = 68 SYS_MSGSND = 69 SYS_MSGRCV = 70 SYS_MSGCTL = 71 SYS_FCNTL = 72 SYS_FLOCK = 73 SYS_FSYNC = 74 SYS_FDATASYNC = 75 SYS_TRUNCATE = 76 SYS_FTRUNCATE = 77 SYS_GETDENTS = 78 SYS_GETCWD = 79 SYS_CHDIR = 80 SYS_FCHDIR = 81 SYS_RENAME = 82 SYS_MKDIR = 83 SYS_RMDIR = 84 SYS_CREAT = 85 SYS_LINK = 86 SYS_UNLINK = 87 SYS_SYMLINK = 88 SYS_READLINK = 89 SYS_CHMOD = 90 SYS_FCHMOD = 91 SYS_CHOWN = 92 SYS_FCHOWN = 93 SYS_LCHOWN = 94 SYS_UMASK = 95 SYS_GETTIMEOFDAY = 96 SYS_GETRLIMIT = 97 SYS_GETRUSAGE = 98 SYS_SYSINFO = 99 SYS_TIMES = 100 SYS_PTRACE = 101 SYS_GETUID = 102 SYS_SYSLOG = 103 SYS_GETGID = 104 SYS_SETUID = 105 SYS_SETGID = 106 SYS_GETEUID = 107 SYS_GETEGID = 108 SYS_SETPGID = 109 SYS_GETPPID = 110 SYS_GETPGRP = 111 SYS_SETSID = 112 SYS_SETREUID = 113 SYS_SETREGID = 114 SYS_GETGROUPS = 115 SYS_SETGROUPS = 116 SYS_SETRESUID = 117 SYS_GETRESUID = 118 SYS_SETRESGID = 119 SYS_GETRESGID = 120 SYS_GETPGID = 121 SYS_SETFSUID = 122 SYS_SETFSGID = 123 SYS_GETSID = 124 SYS_CAPGET = 125 SYS_CAPSET = 126 SYS_RT_SIGPENDING = 127 SYS_RT_SIGTIMEDWAIT = 128 SYS_RT_SIGQUEUEINFO = 129 SYS_RT_SIGSUSPEND = 130 SYS_SIGALTSTACK = 131 SYS_UTIME = 132 SYS_MKNOD = 133 SYS_USELIB = 134 SYS_PERSONALITY = 135 SYS_USTAT = 136 SYS_STATFS = 137 SYS_FSTATFS = 138 SYS_SYSFS = 139 SYS_GETPRIORITY = 140 SYS_SETPRIORITY = 141 SYS_SCHED_SETPARAM = 142 SYS_SCHED_GETPARAM = 143 SYS_SCHED_SETSCHEDULER = 144 SYS_SCHED_GETSCHEDULER = 145 SYS_SCHED_GET_PRIORITY_MAX = 146 SYS_SCHED_GET_PRIORITY_MIN = 147 SYS_SCHED_RR_GET_INTERVAL = 148 SYS_MLOCK = 149 SYS_MUNLOCK = 150 SYS_MLOCKALL = 151 SYS_MUNLOCKALL = 152 SYS_VHANGUP = 153 SYS_MODIFY_LDT = 154 SYS_PIVOT_ROOT = 155 SYS__SYSCTL = 156 SYS_PRCTL = 157 SYS_ARCH_PRCTL = 158 SYS_ADJTIMEX = 159 SYS_SETRLIMIT = 160 SYS_CHROOT = 161 SYS_SYNC = 162 SYS_ACCT = 163 SYS_SETTIMEOFDAY = 164 SYS_MOUNT = 165 SYS_UMOUNT2 = 166 SYS_SWAPON = 167 SYS_SWAPOFF = 168 SYS_REBOOT = 169 SYS_SETHOSTNAME = 170 SYS_SETDOMAINNAME = 171 SYS_IOPL = 172 SYS_IOPERM = 173 SYS_CREATE_MODULE = 174 SYS_INIT_MODULE = 175 SYS_DELETE_MODULE = 176 SYS_GET_KERNEL_SYMS = 177 SYS_QUERY_MODULE = 178 SYS_QUOTACTL = 179 SYS_NFSSERVCTL = 180 SYS_GETPMSG = 181 SYS_PUTPMSG = 182 SYS_AFS_SYSCALL = 183 SYS_TUXCALL = 184 SYS_SECURITY = 185 SYS_GETTID = 186 SYS_READAHEAD = 187 SYS_SETXATTR = 188 SYS_LSETXATTR = 189 SYS_FSETXATTR = 190 SYS_GETXATTR = 191 SYS_LGETXATTR = 192 SYS_FGETXATTR = 193 SYS_LISTXATTR = 194 SYS_LLISTXATTR = 195 SYS_FLISTXATTR = 196 SYS_REMOVEXATTR = 197 SYS_LREMOVEXATTR = 198 SYS_FREMOVEXATTR = 199 SYS_TKILL = 200 SYS_TIME = 201 SYS_FUTEX = 202 SYS_SCHED_SETAFFINITY = 203 SYS_SCHED_GETAFFINITY = 204 SYS_SET_THREAD_AREA = 205 SYS_IO_SETUP = 206 SYS_IO_DESTROY = 207 SYS_IO_GETEVENTS = 208 SYS_IO_SUBMIT = 209 SYS_IO_CANCEL = 210 SYS_GET_THREAD_AREA = 211 SYS_LOOKUP_DCOOKIE = 212 SYS_EPOLL_CREATE = 213 SYS_EPOLL_CTL_OLD = 214 SYS_EPOLL_WAIT_OLD = 215 SYS_REMAP_FILE_PAGES = 216 SYS_GETDENTS64 = 217 SYS_SET_TID_ADDRESS = 218 SYS_RESTART_SYSCALL = 219 SYS_SEMTIMEDOP = 220 SYS_FADVISE64 = 221 SYS_TIMER_CREATE = 222 SYS_TIMER_SETTIME = 223 SYS_TIMER_GETTIME = 224 SYS_TIMER_GETOVERRUN = 225 SYS_TIMER_DELETE = 226 SYS_CLOCK_SETTIME = 227 SYS_CLOCK_GETTIME = 228 SYS_CLOCK_GETRES = 229 SYS_CLOCK_NANOSLEEP = 230 SYS_EXIT_GROUP = 231 SYS_EPOLL_WAIT = 232 SYS_EPOLL_CTL = 233 SYS_TGKILL 
= 234 SYS_UTIMES = 235 SYS_VSERVER = 236 SYS_MBIND = 237 SYS_SET_MEMPOLICY = 238 SYS_GET_MEMPOLICY = 239 SYS_MQ_OPEN = 240 SYS_MQ_UNLINK = 241 SYS_MQ_TIMEDSEND = 242 SYS_MQ_TIMEDRECEIVE = 243 SYS_MQ_NOTIFY = 244 SYS_MQ_GETSETATTR = 245 SYS_KEXEC_LOAD = 246 SYS_WAITID = 247 SYS_ADD_KEY = 248 SYS_REQUEST_KEY = 249 SYS_KEYCTL = 250 SYS_IOPRIO_SET = 251 SYS_IOPRIO_GET = 252 SYS_INOTIFY_INIT = 253 SYS_INOTIFY_ADD_WATCH = 254 SYS_INOTIFY_RM_WATCH = 255 SYS_MIGRATE_PAGES = 256 SYS_OPENAT = 257 SYS_MKDIRAT = 258 SYS_MKNODAT = 259 SYS_FCHOWNAT = 260 SYS_FUTIMESAT = 261 SYS_NEWFSTATAT = 262 SYS_UNLINKAT = 263 SYS_RENAMEAT = 264 SYS_LINKAT = 265 SYS_SYMLINKAT = 266 SYS_READLINKAT = 267 SYS_FCHMODAT = 268 SYS_FACCESSAT = 269 SYS_PSELECT6 = 270 SYS_PPOLL = 271 SYS_UNSHARE = 272 SYS_SET_ROBUST_LIST = 273 SYS_GET_ROBUST_LIST = 274 SYS_SPLICE = 275 SYS_TEE = 276 SYS_SYNC_FILE_RANGE = 277 SYS_VMSPLICE = 278 SYS_MOVE_PAGES = 279 SYS_UTIMENSAT = 280 SYS_EPOLL_PWAIT = 281 SYS_SIGNALFD = 282 SYS_TIMERFD_CREATE = 283 SYS_EVENTFD = 284 SYS_FALLOCATE = 285 SYS_TIMERFD_SETTIME = 286 SYS_TIMERFD_GETTIME = 287 SYS_ACCEPT4 = 288 SYS_SIGNALFD4 = 289 SYS_EVENTFD2 = 290 SYS_EPOLL_CREATE1 = 291 SYS_DUP3 = 292 SYS_PIPE2 = 293 SYS_INOTIFY_INIT1 = 294 SYS_PREADV = 295 SYS_PWRITEV = 296 SYS_RT_TGSIGQUEUEINFO = 297 SYS_PERF_EVENT_OPEN = 298 SYS_RECVMMSG = 299 SYS_FANOTIFY_INIT = 300 SYS_FANOTIFY_MARK = 301 SYS_PRLIMIT64 = 302 ) ``` ``` const ( SizeofSockaddrInet4 = 0x10 SizeofSockaddrInet6 = 0x1c SizeofSockaddrAny = 0x70 SizeofSockaddrUnix = 0x6e SizeofSockaddrLinklayer = 0x14 SizeofSockaddrNetlink = 0xc SizeofLinger = 0x8 SizeofIPMreq = 0x8 SizeofIPMreqn = 0xc SizeofIPv6Mreq = 0x14 SizeofMsghdr = 0x38 SizeofCmsghdr = 0x10 SizeofInet4Pktinfo = 0xc SizeofInet6Pktinfo = 0x14 SizeofIPv6MTUInfo = 0x20 SizeofICMPv6Filter = 0x20 SizeofUcred = 0xc SizeofTCPInfo = 0x68 ) ``` ``` const ( IFA_UNSPEC = 0x0 IFA_ADDRESS = 0x1 IFA_LOCAL = 0x2 IFA_LABEL = 0x3 IFA_BROADCAST = 0x4 IFA_ANYCAST = 0x5 IFA_CACHEINFO = 0x6 IFA_MULTICAST = 0x7 IFLA_UNSPEC = 0x0 IFLA_ADDRESS = 0x1 IFLA_BROADCAST = 0x2 IFLA_IFNAME = 0x3 IFLA_MTU = 0x4 IFLA_LINK = 0x5 IFLA_QDISC = 0x6 IFLA_STATS = 0x7 IFLA_COST = 0x8 IFLA_PRIORITY = 0x9 IFLA_MASTER = 0xa IFLA_WIRELESS = 0xb IFLA_PROTINFO = 0xc IFLA_TXQLEN = 0xd IFLA_MAP = 0xe IFLA_WEIGHT = 0xf IFLA_OPERSTATE = 0x10 IFLA_LINKMODE = 0x11 IFLA_LINKINFO = 0x12 IFLA_NET_NS_PID = 0x13 IFLA_IFALIAS = 0x14 IFLA_MAX = 0x1d RT_SCOPE_UNIVERSE = 0x0 RT_SCOPE_SITE = 0xc8 RT_SCOPE_LINK = 0xfd RT_SCOPE_HOST = 0xfe RT_SCOPE_NOWHERE = 0xff RT_TABLE_UNSPEC = 0x0 RT_TABLE_COMPAT = 0xfc RT_TABLE_DEFAULT = 0xfd RT_TABLE_MAIN = 0xfe RT_TABLE_LOCAL = 0xff RT_TABLE_MAX = 0xffffffff RTA_UNSPEC = 0x0 RTA_DST = 0x1 RTA_SRC = 0x2 RTA_IIF = 0x3 RTA_OIF = 0x4 RTA_GATEWAY = 0x5 RTA_PRIORITY = 0x6 RTA_PREFSRC = 0x7 RTA_METRICS = 0x8 RTA_MULTIPATH = 0x9 RTA_FLOW = 0xb RTA_CACHEINFO = 0xc RTA_TABLE = 0xf RTN_UNSPEC = 0x0 RTN_UNICAST = 0x1 RTN_LOCAL = 0x2 RTN_BROADCAST = 0x3 RTN_ANYCAST = 0x4 RTN_MULTICAST = 0x5 RTN_BLACKHOLE = 0x6 RTN_UNREACHABLE = 0x7 RTN_PROHIBIT = 0x8 RTN_THROW = 0x9 RTN_NAT = 0xa RTN_XRESOLVE = 0xb RTNLGRP_NONE = 0x0 RTNLGRP_LINK = 0x1 RTNLGRP_NOTIFY = 0x2 RTNLGRP_NEIGH = 0x3 RTNLGRP_TC = 0x4 RTNLGRP_IPV4_IFADDR = 0x5 RTNLGRP_IPV4_MROUTE = 0x6 RTNLGRP_IPV4_ROUTE = 0x7 RTNLGRP_IPV4_RULE = 0x8 RTNLGRP_IPV6_IFADDR = 0x9 RTNLGRP_IPV6_MROUTE = 0xa RTNLGRP_IPV6_ROUTE = 0xb RTNLGRP_IPV6_IFINFO = 0xc RTNLGRP_IPV6_PREFIX = 0x12 RTNLGRP_IPV6_RULE = 0x13 RTNLGRP_ND_USEROPT = 0x14 SizeofNlMsghdr = 0x10 SizeofNlMsgerr = 0x14 SizeofRtGenmsg = 
0x1 SizeofNlAttr = 0x4 SizeofRtAttr = 0x4 SizeofIfInfomsg = 0x10 SizeofIfAddrmsg = 0x8 SizeofRtMsg = 0xc SizeofRtNexthop = 0x8 ) ``` ``` const ( SizeofSockFilter = 0x8 SizeofSockFprog = 0x10 ) ``` ``` const ( VINTR = 0x0 VQUIT = 0x1 VERASE = 0x2 VKILL = 0x3 VEOF = 0x4 VTIME = 0x5 VMIN = 0x6 VSWTC = 0x7 VSTART = 0x8 VSTOP = 0x9 VSUSP = 0xa VEOL = 0xb VREPRINT = 0xc VDISCARD = 0xd VWERASE = 0xe VLNEXT = 0xf VEOL2 = 0x10 IGNBRK = 0x1 BRKINT = 0x2 IGNPAR = 0x4 PARMRK = 0x8 INPCK = 0x10 ISTRIP = 0x20 INLCR = 0x40 IGNCR = 0x80 ICRNL = 0x100 IUCLC = 0x200 IXON = 0x400 IXANY = 0x800 IXOFF = 0x1000 IMAXBEL = 0x2000 IUTF8 = 0x4000 OPOST = 0x1 OLCUC = 0x2 ONLCR = 0x4 OCRNL = 0x8 ONOCR = 0x10 ONLRET = 0x20 OFILL = 0x40 OFDEL = 0x80 B0 = 0x0 B50 = 0x1 B75 = 0x2 B110 = 0x3 B134 = 0x4 B150 = 0x5 B200 = 0x6 B300 = 0x7 B600 = 0x8 B1200 = 0x9 B1800 = 0xa B2400 = 0xb B4800 = 0xc B9600 = 0xd B19200 = 0xe B38400 = 0xf CSIZE = 0x30 CS5 = 0x0 CS6 = 0x10 CS7 = 0x20 CS8 = 0x30 CSTOPB = 0x40 CREAD = 0x80 PARENB = 0x100 PARODD = 0x200 HUPCL = 0x400 CLOCAL = 0x800 B57600 = 0x1001 B115200 = 0x1002 B230400 = 0x1003 B460800 = 0x1004 B500000 = 0x1005 B576000 = 0x1006 B921600 = 0x1007 B1000000 = 0x1008 B1152000 = 0x1009 B1500000 = 0x100a B2000000 = 0x100b B2500000 = 0x100c B3000000 = 0x100d B3500000 = 0x100e B4000000 = 0x100f ISIG = 0x1 ICANON = 0x2 XCASE = 0x4 ECHO = 0x8 ECHOE = 0x10 ECHOK = 0x20 ECHONL = 0x40 NOFLSH = 0x80 TOSTOP = 0x100 ECHOCTL = 0x200 ECHOPRT = 0x400 ECHOKE = 0x800 FLUSHO = 0x1000 PENDIN = 0x4000 IEXTEN = 0x8000 TCGETS = 0x5401 TCSETS = 0x5402 ) ``` ``` const ImplementsGetwd = true ``` ``` const ( PathMax = 0x1000 ) ``` ``` const SizeofInotifyEvent = 0x10 ``` Variables --------- ``` var ( Stdin = 0 Stdout = 1 Stderr = 2 ) ``` ``` var ForkLock sync.RWMutex ``` For testing: clients can set this flag to force creation of IPv6 sockets to return EAFNOSUPPORT. ``` var SocketDisableIPv6 bool ``` func Access ----------- ``` func Access(path string, mode uint32) (err error) ``` func Acct --------- ``` func Acct(path string) (err error) ``` func Adjtimex ------------- ``` func Adjtimex(buf *Timex) (state int, err error) ``` func AttachLsf -------------- ``` func AttachLsf(fd int, i []SockFilter) error ``` Deprecated: Use golang.org/x/net/bpf instead. func Bind --------- ``` func Bind(fd int, sa Sockaddr) (err error) ``` func BindToDevice ----------------- ``` func BindToDevice(fd int, device string) (err error) ``` BindToDevice binds the socket associated with fd to device. func BytePtrFromString 1.1 -------------------------- ``` func BytePtrFromString(s string) (*byte, error) ``` BytePtrFromString returns a pointer to a NUL-terminated array of bytes containing the text of s. If s contains a NUL byte at any location, it returns (nil, EINVAL). func ByteSliceFromString 1.1 ---------------------------- ``` func ByteSliceFromString(s string) ([]byte, error) ``` ByteSliceFromString returns a NUL-terminated slice of bytes containing the text of s. If s contains a NUL byte at any location, it returns (nil, EINVAL). 
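As a small illustration of the two helpers above, the following sketch (assuming a Linux system; expected output shown in comments) converts a Go string into the NUL-terminated form expected by raw system calls and shows the EINVAL case:

```
package main

import (
	"fmt"
	"syscall"
)

func main() {
	// A well-formed string becomes a NUL-terminated byte slice.
	b, err := syscall.ByteSliceFromString("/tmp")
	fmt.Println(b, err) // [47 116 109 112 0] <nil>

	// Strings that already contain a NUL byte are rejected with EINVAL.
	_, err = syscall.ByteSliceFromString("bad\x00string")
	fmt.Println(err == syscall.EINVAL) // true
}
```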
func Chdir ---------- ``` func Chdir(path string) (err error) ``` func Chmod ---------- ``` func Chmod(path string, mode uint32) (err error) ``` func Chown ---------- ``` func Chown(path string, uid int, gid int) (err error) ``` func Chroot ----------- ``` func Chroot(path string) (err error) ``` func Clearenv ------------- ``` func Clearenv() ``` func Close ---------- ``` func Close(fd int) (err error) ``` func CloseOnExec ---------------- ``` func CloseOnExec(fd int) ``` func CmsgLen ------------ ``` func CmsgLen(datalen int) int ``` CmsgLen returns the value to store in the Len field of the Cmsghdr structure, taking into account any necessary alignment. func CmsgSpace -------------- ``` func CmsgSpace(datalen int) int ``` CmsgSpace returns the number of bytes an ancillary element with payload of the passed data length occupies. func Connect ------------ ``` func Connect(fd int, sa Sockaddr) (err error) ``` func Creat ---------- ``` func Creat(path string, mode uint32) (fd int, err error) ``` func DetachLsf -------------- ``` func DetachLsf(fd int) error ``` Deprecated: Use golang.org/x/net/bpf instead. func Dup -------- ``` func Dup(oldfd int) (fd int, err error) ``` func Dup2 --------- ``` func Dup2(oldfd int, newfd int) (err error) ``` func Dup3 --------- ``` func Dup3(oldfd int, newfd int, flags int) (err error) ``` func Environ ------------ ``` func Environ() []string ``` func EpollCreate ---------------- ``` func EpollCreate(size int) (fd int, err error) ``` func EpollCreate1 ----------------- ``` func EpollCreate1(flag int) (fd int, err error) ``` func EpollCtl ------------- ``` func EpollCtl(epfd int, op int, fd int, event *EpollEvent) (err error) ``` func EpollWait -------------- ``` func EpollWait(epfd int, events []EpollEvent, msec int) (n int, err error) ``` func Exec --------- ``` func Exec(argv0 string, argv []string, envv []string) (err error) ``` Exec invokes the execve(2) system call. func Exit --------- ``` func Exit(code int) ``` func Faccessat -------------- ``` func Faccessat(dirfd int, path string, mode uint32, flags int) (err error) ``` func Fallocate -------------- ``` func Fallocate(fd int, mode uint32, off int64, len int64) (err error) ``` func Fchdir ----------- ``` func Fchdir(fd int) (err error) ``` func Fchmod ----------- ``` func Fchmod(fd int, mode uint32) (err error) ``` func Fchmodat ------------- ``` func Fchmodat(dirfd int, path string, mode uint32, flags int) (err error) ``` func Fchown ----------- ``` func Fchown(fd int, uid int, gid int) (err error) ``` func Fchownat ------------- ``` func Fchownat(dirfd int, path string, uid int, gid int, flags int) (err error) ``` func FcntlFlock --------------- ``` func FcntlFlock(fd uintptr, cmd int, lk *Flock_t) error ``` FcntlFlock performs a fcntl syscall for the F\_GETLK, F\_SETLK or F\_SETLKW command. func Fdatasync -------------- ``` func Fdatasync(fd int) (err error) ``` func Flock ---------- ``` func Flock(fd int, how int) (err error) ``` func ForkExec ------------- ``` func ForkExec(argv0 string, argv []string, attr *ProcAttr) (pid int, err error) ``` Combination of fork and exec, careful to be thread safe. 
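Flock and the LOCK_* constants listed earlier on this page can be combined for simple advisory file locking. A minimal sketch, assuming Linux; the lock-file path is only illustrative:

```
package main

import (
	"fmt"
	"os"
	"syscall"
)

func main() {
	f, err := os.OpenFile("/tmp/example.lock", os.O_CREATE|os.O_RDWR, 0644)
	if err != nil {
		panic(err)
	}
	defer f.Close()
	fd := int(f.Fd())

	// Take an exclusive advisory lock without blocking; an error here
	// (typically EWOULDBLOCK) means another process already holds it.
	if err := syscall.Flock(fd, syscall.LOCK_EX|syscall.LOCK_NB); err != nil {
		fmt.Println("could not lock:", err)
		return
	}
	fmt.Println("lock acquired")

	// Release the lock explicitly (it is also released when fd is closed).
	if err := syscall.Flock(fd, syscall.LOCK_UN); err != nil {
		fmt.Println("unlock failed:", err)
	}
}
```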
func Fstat ---------- ``` func Fstat(fd int, stat *Stat_t) (err error) ``` func Fstatfs ------------ ``` func Fstatfs(fd int, buf *Statfs_t) (err error) ``` func Fsync ---------- ``` func Fsync(fd int) (err error) ``` func Ftruncate -------------- ``` func Ftruncate(fd int, length int64) (err error) ``` func Futimes ------------ ``` func Futimes(fd int, tv []Timeval) (err error) ``` func Futimesat -------------- ``` func Futimesat(dirfd int, path string, tv []Timeval) (err error) ``` func Getcwd ----------- ``` func Getcwd(buf []byte) (n int, err error) ``` func Getdents ------------- ``` func Getdents(fd int, buf []byte) (n int, err error) ``` func Getegid ------------ ``` func Getegid() (egid int) ``` func Getenv ----------- ``` func Getenv(key string) (value string, found bool) ``` func Geteuid ------------ ``` func Geteuid() (euid int) ``` func Getgid ----------- ``` func Getgid() (gid int) ``` func Getgroups -------------- ``` func Getgroups() (gids []int, err error) ``` func Getpagesize ---------------- ``` func Getpagesize() int ``` func Getpgid ------------ ``` func Getpgid(pid int) (pgid int, err error) ``` func Getpgrp ------------ ``` func Getpgrp() (pid int) ``` func Getpid ----------- ``` func Getpid() (pid int) ``` func Getppid ------------ ``` func Getppid() (ppid int) ``` func Getpriority ---------------- ``` func Getpriority(which int, who int) (prio int, err error) ``` func Getrlimit -------------- ``` func Getrlimit(resource int, rlim *Rlimit) (err error) ``` func Getrusage -------------- ``` func Getrusage(who int, rusage *Rusage) (err error) ``` func GetsockoptInet4Addr ------------------------ ``` func GetsockoptInet4Addr(fd, level, opt int) (value [4]byte, err error) ``` func GetsockoptInt ------------------ ``` func GetsockoptInt(fd, level, opt int) (value int, err error) ``` func Gettid ----------- ``` func Gettid() (tid int) ``` func Gettimeofday ----------------- ``` func Gettimeofday(tv *Timeval) (err error) ``` func Getuid ----------- ``` func Getuid() (uid int) ``` func Getwd ---------- ``` func Getwd() (wd string, err error) ``` func Getxattr ------------- ``` func Getxattr(path string, attr string, dest []byte) (sz int, err error) ``` func InotifyAddWatch -------------------- ``` func InotifyAddWatch(fd int, pathname string, mask uint32) (watchdesc int, err error) ``` func InotifyInit ---------------- ``` func InotifyInit() (fd int, err error) ``` func InotifyInit1 ----------------- ``` func InotifyInit1(flags int) (fd int, err error) ``` func InotifyRmWatch ------------------- ``` func InotifyRmWatch(fd int, watchdesc uint32) (success int, err error) ``` func Ioperm ----------- ``` func Ioperm(from int, num int, on int) (err error) ``` func Iopl --------- ``` func Iopl(level int) (err error) ``` func Kill --------- ``` func Kill(pid int, sig Signal) (err error) ``` func Klogctl ------------ ``` func Klogctl(typ int, buf []byte) (n int, err error) ``` func Lchown ----------- ``` func Lchown(path string, uid int, gid int) (err error) ``` func Link --------- ``` func Link(oldpath string, newpath string) (err error) ``` func Listen ----------- ``` func Listen(s int, n int) (err error) ``` func Listxattr -------------- ``` func Listxattr(path string, dest []byte) (sz int, err error) ``` func LsfSocket -------------- ``` func LsfSocket(ifindex, proto int) (int, error) ``` Deprecated: Use golang.org/x/net/bpf instead. 
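The inotify functions above are normally used together: create an inotify instance, add a watch, then read and decode events, each of which starts with a fixed-size InotifyEvent header of SizeofInotifyEvent bytes followed by Len bytes of NUL-padded name. A minimal sketch, assuming Linux; the watched path and event mask are illustrative:

```
package main

import (
	"fmt"
	"strings"
	"syscall"
	"unsafe"
)

func main() {
	fd, err := syscall.InotifyInit()
	if err != nil {
		panic(err)
	}
	defer syscall.Close(fd)

	// Watch /tmp for newly created files.
	wd, err := syscall.InotifyAddWatch(fd, "/tmp", syscall.IN_CREATE)
	if err != nil {
		panic(err)
	}
	defer syscall.InotifyRmWatch(fd, uint32(wd))

	buf := make([]byte, 4096)
	n, err := syscall.Read(fd, buf) // blocks until at least one event arrives
	if err != nil {
		panic(err)
	}

	// Walk the buffer: each record is an InotifyEvent header plus Len name bytes.
	for off := 0; off < n; {
		ev := (*syscall.InotifyEvent)(unsafe.Pointer(&buf[off]))
		start := off + syscall.SizeofInotifyEvent
		name := strings.TrimRight(string(buf[start:start+int(ev.Len)]), "\x00")
		fmt.Printf("wd=%d mask=%#x name=%q\n", ev.Wd, ev.Mask, name)
		off = start + int(ev.Len)
	}
}
```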
func Lstat ---------- ``` func Lstat(path string, stat *Stat_t) (err error) ``` func Madvise ------------ ``` func Madvise(b []byte, advice int) (err error) ``` func Mkdir ---------- ``` func Mkdir(path string, mode uint32) (err error) ``` func Mkdirat ------------ ``` func Mkdirat(dirfd int, path string, mode uint32) (err error) ``` func Mkfifo ----------- ``` func Mkfifo(path string, mode uint32) (err error) ``` func Mknod ---------- ``` func Mknod(path string, mode uint32, dev int) (err error) ``` func Mknodat ------------ ``` func Mknodat(dirfd int, path string, mode uint32, dev int) (err error) ``` func Mlock ---------- ``` func Mlock(b []byte) (err error) ``` func Mlockall ------------- ``` func Mlockall(flags int) (err error) ``` func Mmap --------- ``` func Mmap(fd int, offset int64, length int, prot int, flags int) (data []byte, err error) ``` func Mount ---------- ``` func Mount(source string, target string, fstype string, flags uintptr, data string) (err error) ``` func Mprotect ------------- ``` func Mprotect(b []byte, prot int) (err error) ``` func Munlock ------------ ``` func Munlock(b []byte) (err error) ``` func Munlockall --------------- ``` func Munlockall() (err error) ``` func Munmap ----------- ``` func Munmap(b []byte) (err error) ``` func Nanosleep -------------- ``` func Nanosleep(time *Timespec, leftover *Timespec) (err error) ``` func NetlinkRIB --------------- ``` func NetlinkRIB(proto, family int) ([]byte, error) ``` NetlinkRIB returns the routing information base, also known as the RIB, which consists of network facility information, states and parameters. func Open --------- ``` func Open(path string, mode int, perm uint32) (fd int, err error) ``` func Openat ----------- ``` func Openat(dirfd int, path string, flags int, mode uint32) (fd int, err error) ``` func ParseDirent ---------------- ``` func ParseDirent(buf []byte, max int, names []string) (consumed int, count int, newnames []string) ``` ParseDirent parses up to max directory entries in buf, appending the names to names. It returns the number of bytes consumed from buf, the number of entries added to names, and the new names slice. func ParseUnixRights -------------------- ``` func ParseUnixRights(m *SocketControlMessage) ([]int, error) ``` ParseUnixRights decodes a socket control message that contains an integer array of open file descriptors from another process.
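ParseDirent is typically paired with Getdents (documented earlier): read raw directory entries into a buffer, then decode the names. A minimal sketch, assuming Linux and an illustrative directory:

```
package main

import (
	"fmt"
	"syscall"
)

func main() {
	fd, err := syscall.Open("/tmp", syscall.O_RDONLY|syscall.O_DIRECTORY, 0)
	if err != nil {
		panic(err)
	}
	defer syscall.Close(fd)

	buf := make([]byte, 4096)
	var names []string
	for {
		n, err := syscall.Getdents(fd, buf)
		if err != nil {
			panic(err)
		}
		if n == 0 { // no more entries
			break
		}
		// Decode every entry in the filled part of the buffer;
		// the large max effectively means "no limit".
		_, _, names = syscall.ParseDirent(buf[:n], 1<<30, names)
	}
	fmt.Println(names)
}
```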
func Pause ---------- ``` func Pause() (err error) ``` func Pipe --------- ``` func Pipe(p []int) error ``` func Pipe2 ---------- ``` func Pipe2(p []int, flags int) error ``` func PivotRoot -------------- ``` func PivotRoot(newroot string, putold string) (err error) ``` func Pread ---------- ``` func Pread(fd int, p []byte, offset int64) (n int, err error) ``` func PtraceAttach ----------------- ``` func PtraceAttach(pid int) (err error) ``` func PtraceCont --------------- ``` func PtraceCont(pid int, signal int) (err error) ``` func PtraceDetach ----------------- ``` func PtraceDetach(pid int) (err error) ``` func PtraceGetEventMsg ---------------------- ``` func PtraceGetEventMsg(pid int) (msg uint, err error) ``` func PtraceGetRegs ------------------ ``` func PtraceGetRegs(pid int, regsout *PtraceRegs) (err error) ``` func PtracePeekData ------------------- ``` func PtracePeekData(pid int, addr uintptr, out []byte) (count int, err error) ``` func PtracePeekText ------------------- ``` func PtracePeekText(pid int, addr uintptr, out []byte) (count int, err error) ``` func PtracePokeData ------------------- ``` func PtracePokeData(pid int, addr uintptr, data []byte) (count int, err error) ``` func PtracePokeText ------------------- ``` func PtracePokeText(pid int, addr uintptr, data []byte) (count int, err error) ``` func PtraceSetOptions --------------------- ``` func PtraceSetOptions(pid int, options int) (err error) ``` func PtraceSetRegs ------------------ ``` func PtraceSetRegs(pid int, regs *PtraceRegs) (err error) ``` func PtraceSingleStep --------------------- ``` func PtraceSingleStep(pid int) (err error) ``` func PtraceSyscall ------------------ ``` func PtraceSyscall(pid int, signal int) (err error) ``` func Pwrite ----------- ``` func Pwrite(fd int, p []byte, offset int64) (n int, err error) ``` func Read --------- ``` func Read(fd int, p []byte) (n int, err error) ``` func ReadDirent --------------- ``` func ReadDirent(fd int, buf []byte) (n int, err error) ``` func Readlink ------------- ``` func Readlink(path string, buf []byte) (n int, err error) ``` func Reboot ----------- ``` func Reboot(cmd int) (err error) ``` func Removexattr ---------------- ``` func Removexattr(path string, attr string) (err error) ``` func Rename ----------- ``` func Rename(oldpath string, newpath string) (err error) ``` func Renameat ------------- ``` func Renameat(olddirfd int, oldpath string, newdirfd int, newpath string) (err error) ``` func Rmdir ---------- ``` func Rmdir(path string) error ``` func Seek --------- ``` func Seek(fd int, offset int64, whence int) (off int64, err error) ``` func Select ----------- ``` func Select(nfd int, r *FdSet, w *FdSet, e *FdSet, timeout *Timeval) (n int, err error) ``` func Sendfile ------------- ``` func Sendfile(outfd int, infd int, offset *int64, count int) (written int, err error) ``` func Sendmsg ------------ ``` func Sendmsg(fd int, p, oob []byte, to Sockaddr, flags int) (err error) ``` func SendmsgN ------------- ``` func SendmsgN(fd int, p, oob []byte, to Sockaddr, flags int) (n int, err error) ``` func Sendto ----------- ``` func Sendto(fd int, p []byte, flags int, to Sockaddr) (err error) ``` func SetLsfPromisc ------------------ ``` func SetLsfPromisc(name string, m bool) error ``` Deprecated: Use golang.org/x/net/bpf instead. 
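A minimal sketch of Pipe together with Read and Write from this page, assuming Linux; p[0] is the read end and p[1] the write end:

```
package main

import (
	"fmt"
	"syscall"
)

func main() {
	var p [2]int
	if err := syscall.Pipe(p[:]); err != nil {
		panic(err)
	}
	defer syscall.Close(p[0])
	defer syscall.Close(p[1])

	// Write into one end of the pipe...
	if _, err := syscall.Write(p[1], []byte("hello")); err != nil {
		panic(err)
	}

	// ...and read it back from the other end.
	buf := make([]byte, 16)
	n, err := syscall.Read(p[0], buf)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s\n", buf[:n]) // hello
}
```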
func SetNonblock ---------------- ``` func SetNonblock(fd int, nonblocking bool) (err error) ``` func Setdomainname ------------------ ``` func Setdomainname(p []byte) (err error) ``` func Setegid ------------ ``` func Setegid(egid int) (err error) ``` func Setenv ----------- ``` func Setenv(key, value string) error ``` func Seteuid ------------ ``` func Seteuid(euid int) (err error) ``` func Setfsgid ------------- ``` func Setfsgid(gid int) (err error) ``` func Setfsuid ------------- ``` func Setfsuid(uid int) (err error) ``` func Setgid ----------- ``` func Setgid(gid int) (err error) ``` func Setgroups -------------- ``` func Setgroups(gids []int) (err error) ``` func Sethostname ---------------- ``` func Sethostname(p []byte) (err error) ``` func Setpgid ------------ ``` func Setpgid(pid int, pgid int) (err error) ``` func Setpriority ---------------- ``` func Setpriority(which int, who int, prio int) (err error) ``` func Setregid ------------- ``` func Setregid(rgid, egid int) (err error) ``` func Setresgid -------------- ``` func Setresgid(rgid, egid, sgid int) (err error) ``` func Setresuid -------------- ``` func Setresuid(ruid, euid, suid int) (err error) ``` func Setreuid ------------- ``` func Setreuid(ruid, euid int) (err error) ``` func Setrlimit -------------- ``` func Setrlimit(resource int, rlim *Rlimit) (err error) ``` func Setsid ----------- ``` func Setsid() (pid int, err error) ``` func SetsockoptByte ------------------- ``` func SetsockoptByte(fd, level, opt int, value byte) (err error) ``` func SetsockoptICMPv6Filter --------------------------- ``` func SetsockoptICMPv6Filter(fd, level, opt int, filter *ICMPv6Filter) error ``` func SetsockoptIPMreq --------------------- ``` func SetsockoptIPMreq(fd, level, opt int, mreq *IPMreq) (err error) ``` func SetsockoptIPMreqn ---------------------- ``` func SetsockoptIPMreqn(fd, level, opt int, mreq *IPMreqn) (err error) ``` func SetsockoptIPv6Mreq ----------------------- ``` func SetsockoptIPv6Mreq(fd, level, opt int, mreq *IPv6Mreq) (err error) ``` func SetsockoptInet4Addr ------------------------ ``` func SetsockoptInet4Addr(fd, level, opt int, value [4]byte) (err error) ``` func SetsockoptInt ------------------ ``` func SetsockoptInt(fd, level, opt int, value int) (err error) ``` func SetsockoptLinger --------------------- ``` func SetsockoptLinger(fd, level, opt int, l *Linger) (err error) ``` func SetsockoptString --------------------- ``` func SetsockoptString(fd, level, opt int, s string) (err error) ``` func SetsockoptTimeval ---------------------- ``` func SetsockoptTimeval(fd, level, opt int, tv *Timeval) (err error) ``` func Settimeofday ----------------- ``` func Settimeofday(tv *Timeval) (err error) ``` func Setuid ----------- ``` func Setuid(uid int) (err error) ``` func Setxattr ------------- ``` func Setxattr(path string, attr string, data []byte, flags int) (err error) ``` func Shutdown ------------- ``` func Shutdown(fd int, how int) (err error) ``` func SlicePtrFromStrings ------------------------ ``` func SlicePtrFromStrings(ss []string) ([]*byte, error) ``` SlicePtrFromStrings converts a slice of strings to a slice of pointers to NUL-terminated byte arrays. If any string contains a NUL byte, it returns (nil, EINVAL). 
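The Setsockopt* helpers above each wrap setsockopt(2) for one value type. The sketch below, assuming Linux, enables SO_REUSEADDR with SetsockoptInt and sets a receive timeout with SetsockoptTimeval on a socket created with Socket (documented below):

```
package main

import (
	"fmt"
	"syscall"
)

func main() {
	fd, err := syscall.Socket(syscall.AF_INET, syscall.SOCK_STREAM, 0)
	if err != nil {
		panic(err)
	}
	defer syscall.Close(fd)

	// Allow the listening address to be reused immediately after close.
	if err := syscall.SetsockoptInt(fd, syscall.SOL_SOCKET, syscall.SO_REUSEADDR, 1); err != nil {
		panic(err)
	}

	// Give blocking reads on this socket a 2-second timeout.
	tv := syscall.Timeval{Sec: 2, Usec: 0}
	if err := syscall.SetsockoptTimeval(fd, syscall.SOL_SOCKET, syscall.SO_RCVTIMEO, &tv); err != nil {
		panic(err)
	}

	// Read one option back with the matching Getsockopt helper.
	v, _ := syscall.GetsockoptInt(fd, syscall.SOL_SOCKET, syscall.SO_REUSEADDR)
	fmt.Println("SO_REUSEADDR =", v) // 1
}
```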
func Socket ----------- ``` func Socket(domain, typ, proto int) (fd int, err error) ``` func Socketpair --------------- ``` func Socketpair(domain, typ, proto int) (fd [2]int, err error) ``` func Splice ----------- ``` func Splice(rfd int, roff *int64, wfd int, woff *int64, len int, flags int) (n int64, err error) ``` func StartProcess ----------------- ``` func StartProcess(argv0 string, argv []string, attr *ProcAttr) (pid int, handle uintptr, err error) ``` StartProcess wraps ForkExec for package os. func Stat --------- ``` func Stat(path string, stat *Stat_t) (err error) ``` func Statfs ----------- ``` func Statfs(path string, buf *Statfs_t) (err error) ``` func StringBytePtr ------------------ ``` func StringBytePtr(s string) *byte ``` StringBytePtr returns a pointer to a NUL-terminated array of bytes. If s contains a NUL byte this function panics instead of returning an error. Deprecated: Use BytePtrFromString instead. func StringByteSlice -------------------- ``` func StringByteSlice(s string) []byte ``` StringByteSlice converts a string to a NUL-terminated []byte. If s contains a NUL byte this function panics instead of returning an error. Deprecated: Use ByteSliceFromString instead. func StringSlicePtr ------------------- ``` func StringSlicePtr(ss []string) []*byte ``` StringSlicePtr converts a slice of strings to a slice of pointers to NUL-terminated byte arrays. If any string contains a NUL byte this function panics instead of returning an error. Deprecated: Use SlicePtrFromStrings instead. func Symlink ------------ ``` func Symlink(oldpath string, newpath string) (err error) ``` func Sync --------- ``` func Sync() ``` func SyncFileRange ------------------ ``` func SyncFileRange(fd int, off int64, n int64, flags int) (err error) ``` func Sysinfo ------------ ``` func Sysinfo(info *Sysinfo_t) (err error) ``` func Tee -------- ``` func Tee(rfd int, wfd int, len int, flags int) (n int64, err error) ``` func Tgkill ----------- ``` func Tgkill(tgid int, tid int, sig Signal) (err error) ``` func Times ---------- ``` func Times(tms *Tms) (ticks uintptr, err error) ``` func TimespecToNsec 1.1 ----------------------- ``` func TimespecToNsec(ts Timespec) int64 ``` TimespecToNsec returns the time stored in ts as nanoseconds. func TimevalToNsec ------------------ ``` func TimevalToNsec(tv Timeval) int64 ``` TimevalToNsec returns the time stored in tv as nanoseconds. func Truncate ------------- ``` func Truncate(path string, length int64) (err error) ``` func Umask ---------- ``` func Umask(mask int) (oldmask int) ``` func Uname ---------- ``` func Uname(buf *Utsname) (err error) ``` func UnixCredentials -------------------- ``` func UnixCredentials(ucred *Ucred) []byte ``` UnixCredentials encodes credentials into a socket control message for sending to another process. This can be used for authentication. func UnixRights --------------- ``` func UnixRights(fds ...int) []byte ``` UnixRights encodes a set of open file descriptors into a socket control message for sending to another process.
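UnixRights is usually combined with Socketpair, Sendmsg, Recvmsg, ParseSocketControlMessage and ParseUnixRights (all on this page) to pass an open file descriptor to another process. A condensed single-process sketch, assuming Linux; the file opened here is only illustrative:

```
package main

import (
	"fmt"
	"os"
	"syscall"
)

func main() {
	// A connected pair of Unix sockets; normally the two ends would live
	// in different processes.
	pair, err := syscall.Socketpair(syscall.AF_UNIX, syscall.SOCK_STREAM, 0)
	if err != nil {
		panic(err)
	}
	defer syscall.Close(pair[0])
	defer syscall.Close(pair[1])

	f, err := os.Open("/etc/hostname") // the descriptor we want to hand over
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// Encode the descriptor as SCM_RIGHTS ancillary data and send it.
	oob := syscall.UnixRights(int(f.Fd()))
	if err := syscall.Sendmsg(pair[0], []byte("fd"), oob, nil, 0); err != nil {
		panic(err)
	}

	// Receive the message plus ancillary data on the other end.
	buf := make([]byte, 16)
	oobBuf := make([]byte, syscall.CmsgSpace(4))
	_, oobn, _, _, err := syscall.Recvmsg(pair[1], buf, oobBuf, 0)
	if err != nil {
		panic(err)
	}

	msgs, err := syscall.ParseSocketControlMessage(oobBuf[:oobn])
	if err != nil {
		panic(err)
	}
	fds, err := syscall.ParseUnixRights(&msgs[0])
	if err != nil {
		panic(err)
	}
	fmt.Println("received descriptor:", fds[0])
	syscall.Close(fds[0])
}
```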
func Unlink ----------- ``` func Unlink(path string) error ``` func Unlinkat ------------- ``` func Unlinkat(dirfd int, path string) error ``` func Unmount ------------ ``` func Unmount(target string, flags int) (err error) ``` func Unsetenv 1.4 ----------------- ``` func Unsetenv(key string) error ``` func Unshare ------------ ``` func Unshare(flags int) (err error) ``` func Ustat ---------- ``` func Ustat(dev int, ubuf *Ustat_t) (err error) ``` func Utime ---------- ``` func Utime(path string, buf *Utimbuf) (err error) ``` func Utimes ----------- ``` func Utimes(path string, tv []Timeval) (err error) ``` func UtimesNano 1.1 ------------------- ``` func UtimesNano(path string, ts []Timespec) (err error) ``` func Wait4 ---------- ``` func Wait4(pid int, wstatus *WaitStatus, options int, rusage *Rusage) (wpid int, err error) ``` func Write ---------- ``` func Write(fd int, p []byte) (n int, err error) ``` type Cmsghdr ------------ ``` type Cmsghdr struct { Len uint64 Level int32 Type int32 } ``` ### func (\*Cmsghdr) SetLen ``` func (cmsg *Cmsghdr) SetLen(length int) ``` type Conn 1.9 ------------- Conn is implemented by some types in the net and os packages to provide access to the underlying file descriptor or handle. ``` type Conn interface { // SyscallConn returns a raw network connection. SyscallConn() (RawConn, error) } ``` type Credential --------------- Credential holds user and group identities to be assumed by a child process started by StartProcess. ``` type Credential struct { Uid uint32 // User ID. Gid uint32 // Group ID. Groups []uint32 // Supplementary group IDs. NoSetGroups bool // If true, don't set supplementary groups } ``` type Dirent ----------- ``` type Dirent struct { Ino uint64 Off int64 Reclen uint16 Type uint8 Name [256]int8 Pad_cgo_0 [5]byte } ``` type EpollEvent --------------- ``` type EpollEvent struct { Events uint32 Fd int32 Pad int32 } ``` type Errno ---------- An Errno is an unsigned number describing an error condition. It implements the error interface. The zero Errno is by convention a non-error, so code to convert from Errno to error should use: ``` err = nil if errno != 0 { err = errno } ``` Errno values can be tested against error values from the os package using errors.Is. For example: ``` _, _, err := syscall.Syscall(...) if errors.Is(err, fs.ErrNotExist) ... ``` ``` type Errno uintptr ``` ### func AllThreadsSyscall ``` func AllThreadsSyscall(trap, a1, a2, a3 uintptr) (r1, r2 uintptr, err Errno) ``` AllThreadsSyscall performs a syscall on each OS thread of the Go runtime. It first invokes the syscall on one thread. Should that invocation fail, it returns immediately with the error status. Otherwise, it invokes the syscall on all of the remaining threads in parallel. It will terminate the program if it observes any invoked syscall's return value differs from that of the first invocation. AllThreadsSyscall is intended for emulating simultaneous process-wide state changes that require consistently modifying per-thread state of the Go runtime. AllThreadsSyscall is unaware of any threads that are launched explicitly by cgo linked code, so the function always returns ENOTSUP in binaries that use cgo. ### func AllThreadsSyscall6 ``` func AllThreadsSyscall6(trap, a1, a2, a3, a4, a5, a6 uintptr) (r1, r2 uintptr, err Errno) ``` AllThreadsSyscall6 is like AllThreadsSyscall, but extended to six arguments. 
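A runnable sketch of the Errno-to-error conversion described above, issuing getpid(2) through Syscall (listed below) with the SYS_GETPID number from the system call table earlier on this page; assumes linux/amd64:

```
package main

import (
	"fmt"
	"syscall"
)

// getpid shows the conventional conversion: a zero Errno means success and
// must not be returned as a non-nil error.
func getpid() (int, error) {
	r1, _, errno := syscall.Syscall(syscall.SYS_GETPID, 0, 0, 0)
	var err error
	if errno != 0 {
		err = errno
	}
	return int(r1), err
}

func main() {
	pid, err := getpid()
	fmt.Println(pid, err)
}
```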
### func RawSyscall ``` func RawSyscall(trap, a1, a2, a3 uintptr) (r1, r2 uintptr, err Errno) ``` ### func RawSyscall6 ``` func RawSyscall6(trap, a1, a2, a3, a4, a5, a6 uintptr) (r1, r2 uintptr, err Errno) ``` ### func Syscall ``` func Syscall(trap, a1, a2, a3 uintptr) (r1, r2 uintptr, err Errno) ``` ### func Syscall6 ``` func Syscall6(trap, a1, a2, a3, a4, a5, a6 uintptr) (r1, r2 uintptr, err Errno) ``` ### func (Errno) Error ``` func (e Errno) Error() string ``` ### func (Errno) Is 1.13 ``` func (e Errno) Is(target error) bool ``` ### func (Errno) Temporary ``` func (e Errno) Temporary() bool ``` ### func (Errno) Timeout ``` func (e Errno) Timeout() bool ``` type FdSet ---------- ``` type FdSet struct { Bits [16]int64 } ``` type Flock\_t ------------- ``` type Flock_t struct { Type int16 Whence int16 Pad_cgo_0 [4]byte Start int64 Len int64 Pid int32 Pad_cgo_1 [4]byte } ``` type Fsid --------- ``` type Fsid struct { X__val [2]int32 } ``` type ICMPv6Filter ----------------- ``` type ICMPv6Filter struct { Data [8]uint32 } ``` ### func GetsockoptICMPv6Filter ``` func GetsockoptICMPv6Filter(fd, level, opt int) (*ICMPv6Filter, error) ``` type IPMreq ----------- ``` type IPMreq struct { Multiaddr [4]byte /* in_addr */ Interface [4]byte /* in_addr */ } ``` ### func GetsockoptIPMreq ``` func GetsockoptIPMreq(fd, level, opt int) (*IPMreq, error) ``` type IPMreqn ------------ ``` type IPMreqn struct { Multiaddr [4]byte /* in_addr */ Address [4]byte /* in_addr */ Ifindex int32 } ``` ### func GetsockoptIPMreqn ``` func GetsockoptIPMreqn(fd, level, opt int) (*IPMreqn, error) ``` type IPv6MTUInfo ---------------- ``` type IPv6MTUInfo struct { Addr RawSockaddrInet6 Mtu uint32 } ``` ### func GetsockoptIPv6MTUInfo ``` func GetsockoptIPv6MTUInfo(fd, level, opt int) (*IPv6MTUInfo, error) ``` type IPv6Mreq ------------- ``` type IPv6Mreq struct { Multiaddr [16]byte /* in6_addr */ Interface uint32 } ``` ### func GetsockoptIPv6Mreq ``` func GetsockoptIPv6Mreq(fd, level, opt int) (*IPv6Mreq, error) ``` type IfAddrmsg -------------- ``` type IfAddrmsg struct { Family uint8 Prefixlen uint8 Flags uint8 Scope uint8 Index uint32 } ``` type IfInfomsg -------------- ``` type IfInfomsg struct { Family uint8 X__ifi_pad uint8 Type uint16 Index int32 Flags uint32 Change uint32 } ``` type Inet4Pktinfo ----------------- ``` type Inet4Pktinfo struct { Ifindex int32 Spec_dst [4]byte /* in_addr */ Addr [4]byte /* in_addr */ } ``` type Inet6Pktinfo ----------------- ``` type Inet6Pktinfo struct { Addr [16]byte /* in6_addr */ Ifindex uint32 } ``` type InotifyEvent ----------------- ``` type InotifyEvent struct { Wd int32 Mask uint32 Cookie uint32 Len uint32 Name [0]uint8 } ``` type Iovec ---------- ``` type Iovec struct { Base *byte Len uint64 } ``` ### func (\*Iovec) SetLen ``` func (iov *Iovec) SetLen(length int) ``` type Linger ----------- ``` type Linger struct { Onoff int32 Linger int32 } ``` type Msghdr ----------- ``` type Msghdr struct { Name *byte Namelen uint32 Pad_cgo_0 [4]byte Iov *Iovec Iovlen uint64 Control *byte Controllen uint64 Flags int32 Pad_cgo_1 [4]byte } ``` ### func (\*Msghdr) SetControllen ``` func (msghdr *Msghdr) SetControllen(length int) ``` type NetlinkMessage ------------------- NetlinkMessage represents a netlink message. 
``` type NetlinkMessage struct { Header NlMsghdr Data []byte } ``` ### func ParseNetlinkMessage ``` func ParseNetlinkMessage(b []byte) ([]NetlinkMessage, error) ``` ParseNetlinkMessage parses b as an array of netlink messages and returns the slice containing the NetlinkMessage structures. type NetlinkRouteAttr --------------------- NetlinkRouteAttr represents a netlink route attribute. ``` type NetlinkRouteAttr struct { Attr RtAttr Value []byte } ``` ### func ParseNetlinkRouteAttr ``` func ParseNetlinkRouteAttr(m *NetlinkMessage) ([]NetlinkRouteAttr, error) ``` ParseNetlinkRouteAttr parses m's payload as an array of netlink route attributes and returns the slice containing the NetlinkRouteAttr structures. type NetlinkRouteRequest ------------------------ NetlinkRouteRequest represents a request message to receive routing and link states from the kernel. ``` type NetlinkRouteRequest struct { Header NlMsghdr Data RtGenmsg } ``` type NlAttr ----------- ``` type NlAttr struct { Len uint16 Type uint16 } ``` type NlMsgerr ------------- ``` type NlMsgerr struct { Error int32 Msg NlMsghdr } ``` type NlMsghdr ------------- ``` type NlMsghdr struct { Len uint32 Type uint16 Flags uint16 Seq uint32 Pid uint32 } ``` type ProcAttr ------------- ProcAttr holds attributes that will be applied to a new process started by StartProcess. ``` type ProcAttr struct { Dir string // Current working directory. Env []string // Environment. Files []uintptr // File descriptors. Sys *SysProcAttr } ``` type PtraceRegs --------------- ``` type PtraceRegs struct { R15 uint64 R14 uint64 R13 uint64 R12 uint64 Rbp uint64 Rbx uint64 R11 uint64 R10 uint64 R9 uint64 R8 uint64 Rax uint64 Rcx uint64 Rdx uint64 Rsi uint64 Rdi uint64 Orig_rax uint64 Rip uint64 Cs uint64 Eflags uint64 Rsp uint64 Ss uint64 Fs_base uint64 Gs_base uint64 Ds uint64 Es uint64 Fs uint64 Gs uint64 } ``` ### func (\*PtraceRegs) PC ``` func (r *PtraceRegs) PC() uint64 ``` ### func (\*PtraceRegs) SetPC ``` func (r *PtraceRegs) SetPC(pc uint64) ``` type RawConn 1.9 ---------------- A RawConn is a raw network connection. ``` type RawConn interface { // Control invokes f on the underlying connection's file // descriptor or handle. // The file descriptor fd is guaranteed to remain valid while // f executes but not after f returns. Control(f func(fd uintptr)) error // Read invokes f on the underlying connection's file // descriptor or handle; f is expected to try to read from the // file descriptor. // If f returns true, Read returns. Otherwise Read blocks // waiting for the connection to be ready for reading and // tries again repeatedly. // The file descriptor is guaranteed to remain valid while f // executes but not after f returns. Read(f func(fd uintptr) (done bool)) error // Write is like Read but for writing. 
Write(f func(fd uintptr) (done bool)) error } ``` type RawSockaddr ---------------- ``` type RawSockaddr struct { Family uint16 Data [14]int8 } ``` type RawSockaddrAny ------------------- ``` type RawSockaddrAny struct { Addr RawSockaddr Pad [96]int8 } ``` type RawSockaddrInet4 --------------------- ``` type RawSockaddrInet4 struct { Family uint16 Port uint16 Addr [4]byte /* in_addr */ Zero [8]uint8 } ``` type RawSockaddrInet6 1.1 ------------------------- ``` type RawSockaddrInet6 struct { Family uint16 Port uint16 Flowinfo uint32 Addr [16]byte /* in6_addr */ Scope_id uint32 } ``` type RawSockaddrLinklayer ------------------------- ``` type RawSockaddrLinklayer struct { Family uint16 Protocol uint16 Ifindex int32 Hatype uint16 Pkttype uint8 Halen uint8 Addr [8]uint8 } ``` type RawSockaddrNetlink ----------------------- ``` type RawSockaddrNetlink struct { Family uint16 Pad uint16 Pid uint32 Groups uint32 } ``` type RawSockaddrUnix 1.12 ------------------------- ``` type RawSockaddrUnix struct { Family uint16 Path [108]int8 } ``` type Rlimit ----------- ``` type Rlimit struct { Cur uint64 Max uint64 } ``` type RtAttr ----------- ``` type RtAttr struct { Len uint16 Type uint16 } ``` type RtGenmsg ------------- ``` type RtGenmsg struct { Family uint8 } ``` type RtMsg ---------- ``` type RtMsg struct { Family uint8 Dst_len uint8 Src_len uint8 Tos uint8 Table uint8 Protocol uint8 Scope uint8 Type uint8 Flags uint32 } ``` type RtNexthop -------------- ``` type RtNexthop struct { Len uint16 Flags uint8 Hops uint8 Ifindex int32 } ``` type Rusage ----------- ``` type Rusage struct { Utime Timeval Stime Timeval Maxrss int64 Ixrss int64 Idrss int64 Isrss int64 Minflt int64 Majflt int64 Nswap int64 Inblock int64 Oublock int64 Msgsnd int64 Msgrcv int64 Nsignals int64 Nvcsw int64 Nivcsw int64 } ``` type Signal ----------- A Signal is a number describing a process signal. It implements the os.Signal interface. ``` type Signal int ``` ### func (Signal) Signal ``` func (s Signal) Signal() ``` ### func (Signal) String ``` func (s Signal) String() string ``` type SockFilter --------------- ``` type SockFilter struct { Code uint16 Jt uint8 Jf uint8 K uint32 } ``` ### func LsfJump ``` func LsfJump(code, k, jt, jf int) *SockFilter ``` Deprecated: Use golang.org/x/net/bpf instead. ### func LsfStmt ``` func LsfStmt(code, k int) *SockFilter ``` Deprecated: Use golang.org/x/net/bpf instead. 
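The netlink helpers documented above (NetlinkRIB, ParseNetlinkMessage, ParseNetlinkRouteAttr) are normally chained. The sketch below, assuming Linux, requests an RTM_GETLINK dump and prints each interface name carried in an IFLA_IFNAME attribute:

```
package main

import (
	"fmt"
	"syscall"
)

func main() {
	// Ask the kernel for a dump of all links (network interfaces).
	rib, err := syscall.NetlinkRIB(syscall.RTM_GETLINK, syscall.AF_UNSPEC)
	if err != nil {
		panic(err)
	}

	msgs, err := syscall.ParseNetlinkMessage(rib)
	if err != nil {
		panic(err)
	}

	for _, m := range msgs {
		switch m.Header.Type {
		case syscall.NLMSG_DONE: // end of the multipart dump
			return
		case syscall.RTM_NEWLINK: // one interface per message
			attrs, err := syscall.ParseNetlinkRouteAttr(&m)
			if err != nil {
				panic(err)
			}
			for _, a := range attrs {
				if a.Attr.Type == syscall.IFLA_IFNAME {
					// The attribute value is a NUL-terminated interface name.
					fmt.Println(string(a.Value[:len(a.Value)-1]))
				}
			}
		}
	}
}
```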
type SockFprog -------------- ``` type SockFprog struct { Len uint16 Pad_cgo_0 [6]byte Filter *SockFilter } ``` type Sockaddr ------------- ``` type Sockaddr interface { // contains filtered or unexported methods } ``` ### func Accept ``` func Accept(fd int) (nfd int, sa Sockaddr, err error) ``` ### func Accept4 ``` func Accept4(fd int, flags int) (nfd int, sa Sockaddr, err error) ``` ### func Getpeername ``` func Getpeername(fd int) (sa Sockaddr, err error) ``` ### func Getsockname ``` func Getsockname(fd int) (sa Sockaddr, err error) ``` ### func Recvfrom ``` func Recvfrom(fd int, p []byte, flags int) (n int, from Sockaddr, err error) ``` ### func Recvmsg ``` func Recvmsg(fd int, p, oob []byte, flags int) (n, oobn int, recvflags int, from Sockaddr, err error) ``` type SockaddrInet4 ------------------ ``` type SockaddrInet4 struct { Port int Addr [4]byte // contains filtered or unexported fields } ``` type SockaddrInet6 ------------------ ``` type SockaddrInet6 struct { Port int ZoneId uint32 Addr [16]byte // contains filtered or unexported fields } ``` type SockaddrLinklayer ---------------------- ``` type SockaddrLinklayer struct { Protocol uint16 Ifindex int Hatype uint16 Pkttype uint8 Halen uint8 Addr [8]byte // contains filtered or unexported fields } ``` type SockaddrNetlink -------------------- ``` type SockaddrNetlink struct { Family uint16 Pad uint16 Pid uint32 Groups uint32 // contains filtered or unexported fields } ``` type SockaddrUnix ----------------- ``` type SockaddrUnix struct { Name string // contains filtered or unexported fields } ``` type SocketControlMessage ------------------------- SocketControlMessage represents a socket control message. ``` type SocketControlMessage struct { Header Cmsghdr Data []byte } ``` ### func ParseSocketControlMessage ``` func ParseSocketControlMessage(b []byte) ([]SocketControlMessage, error) ``` ParseSocketControlMessage parses b as an array of socket control messages. type Stat\_t ------------ ``` type Stat_t struct { Dev uint64 Ino uint64 Nlink uint64 Mode uint32 Uid uint32 Gid uint32 X__pad0 int32 Rdev uint64 Size int64 Blksize int64 Blocks int64 Atim Timespec Mtim Timespec Ctim Timespec X__unused [3]int64 } ``` type Statfs\_t -------------- ``` type Statfs_t struct { Type int64 Bsize int64 Blocks uint64 Bfree uint64 Bavail uint64 Files uint64 Ffree uint64 Fsid Fsid Namelen int64 Frsize int64 Flags int64 Spare [4]int64 } ``` type SysProcAttr ---------------- ``` type SysProcAttr struct { Chroot string // Chroot. Credential *Credential // Credential. // Ptrace tells the child to call ptrace(PTRACE_TRACEME). // Call runtime.LockOSThread before starting a process with this set, // and don't call UnlockOSThread until done with PtraceSyscall calls. Ptrace bool Setsid bool // Create session. // Setpgid sets the process group ID of the child to Pgid, // or, if Pgid == 0, to the new child's process ID. Setpgid bool // Setctty sets the controlling terminal of the child to // file descriptor Ctty. Ctty must be a descriptor number // in the child process: an index into ProcAttr.Files. // This is only meaningful if Setsid is true. Setctty bool Noctty bool // Detach fd 0 from controlling terminal Ctty int // Controlling TTY fd // Foreground places the child process group in the foreground. // This implies Setpgid. The Ctty field must be set to // the descriptor of the controlling TTY. // Unlike Setctty, in this case Ctty must be a descriptor // number in the parent process. Foreground bool Pgid int // Child's process group ID if Setpgid. 
// Pdeathsig, if non-zero, is a signal that the kernel will send to // the child process when the creating thread dies. Note that the signal // is sent on thread termination, which may happen before process termination. // There are more details at https://go.dev/issue/27505. Pdeathsig Signal Cloneflags uintptr // Flags for clone calls (Linux only) Unshareflags uintptr // Flags for unshare calls (Linux only) UidMappings []SysProcIDMap // User ID mappings for user namespaces. GidMappings []SysProcIDMap // Group ID mappings for user namespaces. // GidMappingsEnableSetgroups enabling setgroups syscall. // If false, then setgroups syscall will be disabled for the child process. // This parameter is no-op if GidMappings == nil. Otherwise for unprivileged // users this should be set to false for mappings work. GidMappingsEnableSetgroups bool AmbientCaps []uintptr // Ambient capabilities (Linux only) UseCgroupFD bool // Whether to make use of the CgroupFD field. CgroupFD int // File descriptor of a cgroup to put the new process into. } ``` type SysProcIDMap ----------------- SysProcIDMap holds Container ID to Host ID mappings used for User Namespaces in Linux. See user\_namespaces(7). ``` type SysProcIDMap struct { ContainerID int // Container ID. HostID int // Host ID. Size int // Size. } ``` type Sysinfo\_t --------------- ``` type Sysinfo_t struct { Uptime int64 Loads [3]uint64 Totalram uint64 Freeram uint64 Sharedram uint64 Bufferram uint64 Totalswap uint64 Freeswap uint64 Procs uint16 Pad uint16 Pad_cgo_0 [4]byte Totalhigh uint64 Freehigh uint64 Unit uint32 X_f [0]byte Pad_cgo_1 [4]byte } ``` type TCPInfo ------------ ``` type TCPInfo struct { State uint8 Ca_state uint8 Retransmits uint8 Probes uint8 Backoff uint8 Options uint8 Pad_cgo_0 [2]byte Rto uint32 Ato uint32 Snd_mss uint32 Rcv_mss uint32 Unacked uint32 Sacked uint32 Lost uint32 Retrans uint32 Fackets uint32 Last_data_sent uint32 Last_ack_sent uint32 Last_data_recv uint32 Last_ack_recv uint32 Pmtu uint32 Rcv_ssthresh uint32 Rtt uint32 Rttvar uint32 Snd_ssthresh uint32 Snd_cwnd uint32 Advmss uint32 Reordering uint32 Rcv_rtt uint32 Rcv_space uint32 Total_retrans uint32 } ``` type Termios ------------ ``` type Termios struct { Iflag uint32 Oflag uint32 Cflag uint32 Lflag uint32 Line uint8 Cc [32]uint8 Pad_cgo_0 [3]byte Ispeed uint32 Ospeed uint32 } ``` type Time\_t ------------ ``` type Time_t int64 ``` ### func Time ``` func Time(t *Time_t) (tt Time_t, err error) ``` type Timespec ------------- ``` type Timespec struct { Sec int64 Nsec int64 } ``` ### func NsecToTimespec 1.1 ``` func NsecToTimespec(nsec int64) Timespec ``` NsecToTimespec converts a number of nanoseconds into a Timespec. ### func (\*Timespec) Nano ``` func (ts *Timespec) Nano() int64 ``` Nano returns the time stored in ts as nanoseconds. ### func (\*Timespec) Unix ``` func (ts *Timespec) Unix() (sec int64, nsec int64) ``` Unix returns the time stored in ts as seconds plus nanoseconds. type Timeval ------------ ``` type Timeval struct { Sec int64 Usec int64 } ``` ### func NsecToTimeval ``` func NsecToTimeval(nsec int64) Timeval ``` NsecToTimeval converts a number of nanoseconds into a Timeval. ### func (\*Timeval) Nano ``` func (tv *Timeval) Nano() int64 ``` Nano returns the time stored in tv as nanoseconds. ### func (\*Timeval) Unix ``` func (tv *Timeval) Unix() (sec int64, nsec int64) ``` Unix returns the time stored in tv as seconds plus nanoseconds. 
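To make the Timespec and Timeval helpers above concrete, here is a small sketch (not from the original documentation; the printed values assume the linux/amd64 field layout shown above):

```
package main

import (
	"fmt"
	"syscall"
	"time"
)

func main() {
	nsec := (1500 * time.Millisecond).Nanoseconds()

	ts := syscall.NsecToTimespec(nsec)
	sec, rem := ts.Unix()
	fmt.Println(sec, rem, ts.Nano()) // 1 500000000 1500000000

	tv := syscall.NsecToTimeval(nsec)
	fmt.Println(tv.Sec, tv.Usec, tv.Nano()) // 1 500000 1500000000
}
```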
type Timex ---------- ``` type Timex struct { Modes uint32 Pad_cgo_0 [4]byte Offset int64 Freq int64 Maxerror int64 Esterror int64 Status int32 Pad_cgo_1 [4]byte Constant int64 Precision int64 Tolerance int64 Time Timeval Tick int64 Ppsfreq int64 Jitter int64 Shift int32 Pad_cgo_2 [4]byte Stabil int64 Jitcnt int64 Calcnt int64 Errcnt int64 Stbcnt int64 Tai int32 Pad_cgo_3 [44]byte } ``` type Tms -------- ``` type Tms struct { Utime int64 Stime int64 Cutime int64 Cstime int64 } ``` type Ucred ---------- ``` type Ucred struct { Pid int32 Uid uint32 Gid uint32 } ``` ### func GetsockoptUcred ``` func GetsockoptUcred(fd, level, opt int) (*Ucred, error) ``` ### func ParseUnixCredentials ``` func ParseUnixCredentials(m *SocketControlMessage) (*Ucred, error) ``` ParseUnixCredentials decodes a socket control message that contains credentials in a Ucred structure. To receive such a message, the SO\_PASSCRED option must be enabled on the socket. type Ustat\_t ------------- ``` type Ustat_t struct { Tfree int32 Pad_cgo_0 [4]byte Tinode uint64 Fname [6]int8 Fpack [6]int8 Pad_cgo_1 [4]byte } ``` type Utimbuf ------------ ``` type Utimbuf struct { Actime int64 Modtime int64 } ``` type Utsname ------------ ``` type Utsname struct { Sysname [65]int8 Nodename [65]int8 Release [65]int8 Version [65]int8 Machine [65]int8 Domainname [65]int8 } ``` type WaitStatus --------------- ``` type WaitStatus uint32 ``` ### func (WaitStatus) Continued ``` func (w WaitStatus) Continued() bool ``` ### func (WaitStatus) CoreDump ``` func (w WaitStatus) CoreDump() bool ``` ### func (WaitStatus) ExitStatus ``` func (w WaitStatus) ExitStatus() int ``` ### func (WaitStatus) Exited ``` func (w WaitStatus) Exited() bool ``` ### func (WaitStatus) Signal ``` func (w WaitStatus) Signal() Signal ``` ### func (WaitStatus) Signaled ``` func (w WaitStatus) Signaled() bool ``` ### func (WaitStatus) StopSignal ``` func (w WaitStatus) StopSignal() Signal ``` ### func (WaitStatus) Stopped ``` func (w WaitStatus) Stopped() bool ``` ### func (WaitStatus) TrapCause ``` func (w WaitStatus) TrapCause() int ``` Subdirectories -------------- | Name | Synopsis | | --- | --- | | [..](../index) | | [js](js/index) | Package js gives access to the WebAssembly host environment when using the js/wasm architecture. |
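Referring back to the WaitStatus methods documented above, the following sketch (not part of the original documentation; Linux-specific, and /bin/true is an assumed path) starts a child with ForkExec and interprets the status returned by Wait4:

```
package main

import (
	"fmt"
	"syscall"
)

func main() {
	pid, err := syscall.ForkExec("/bin/true", []string{"true"}, &syscall.ProcAttr{
		Files: []uintptr{0, 1, 2}, // inherit stdin, stdout, stderr
	})
	if err != nil {
		fmt.Println("ForkExec:", err)
		return
	}

	var ws syscall.WaitStatus
	if _, err := syscall.Wait4(pid, &ws, 0, nil); err != nil {
		fmt.Println("Wait4:", err)
		return
	}
	switch {
	case ws.Exited():
		fmt.Println("exit status", ws.ExitStatus())
	case ws.Signaled():
		fmt.Println("terminated by", ws.Signal())
	}
}
```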
go Package js Package js =========== * `import "syscall/js"` * [Overview](#pkg-overview) * [Index](#pkg-index) * [Examples](#pkg-examples) Overview -------- Package js gives access to the WebAssembly host environment when using the js/wasm architecture. Its API is based on JavaScript semantics. This package is EXPERIMENTAL. Its current scope is only to allow tests to run, but not yet to provide a comprehensive API for users. It is exempt from the Go compatibility promise. Index ----- * [func CopyBytesToGo(dst []byte, src Value) int](#CopyBytesToGo) * [func CopyBytesToJS(dst Value, src []byte) int](#CopyBytesToJS) * [type Error](#Error) * [func (e Error) Error() string](#Error.Error) * [type Func](#Func) * [func FuncOf(fn func(this Value, args []Value) any) Func](#FuncOf) * [func (c Func) Release()](#Func.Release) * [type Type](#Type) * [func (t Type) String() string](#Type.String) * [type Value](#Value) * [func Global() Value](#Global) * [func Null() Value](#Null) * [func Undefined() Value](#Undefined) * [func ValueOf(x any) Value](#ValueOf) * [func (v Value) Bool() bool](#Value.Bool) * [func (v Value) Call(m string, args ...any) Value](#Value.Call) * [func (v Value) Delete(p string)](#Value.Delete) * [func (v Value) Equal(w Value) bool](#Value.Equal) * [func (v Value) Float() float64](#Value.Float) * [func (v Value) Get(p string) Value](#Value.Get) * [func (v Value) Index(i int) Value](#Value.Index) * [func (v Value) InstanceOf(t Value) bool](#Value.InstanceOf) * [func (v Value) Int() int](#Value.Int) * [func (v Value) Invoke(args ...any) Value](#Value.Invoke) * [func (v Value) IsNaN() bool](#Value.IsNaN) * [func (v Value) IsNull() bool](#Value.IsNull) * [func (v Value) IsUndefined() bool](#Value.IsUndefined) * [func (v Value) Length() int](#Value.Length) * [func (v Value) New(args ...any) Value](#Value.New) * [func (v Value) Set(p string, x any)](#Value.Set) * [func (v Value) SetIndex(i int, x any)](#Value.SetIndex) * [func (v Value) String() string](#Value.String) * [func (v Value) Truthy() bool](#Value.Truthy) * [func (v Value) Type() Type](#Value.Type) * [type ValueError](#ValueError) * [func (e \*ValueError) Error() string](#ValueError.Error) ### Examples [FuncOf](#example_FuncOf) ### Package files func.go js.go func CopyBytesToGo ------------------ ``` func CopyBytesToGo(dst []byte, src Value) int ``` CopyBytesToGo copies bytes from src to dst. It panics if src is not an Uint8Array or Uint8ClampedArray. It returns the number of bytes copied, which will be the minimum of the lengths of src and dst. func CopyBytesToJS ------------------ ``` func CopyBytesToJS(dst Value, src []byte) int ``` CopyBytesToJS copies bytes from src to dst. It panics if dst is not an Uint8Array or Uint8ClampedArray. It returns the number of bytes copied, which will be the minimum of the lengths of src and dst. type Error ---------- Error wraps a JavaScript error. ``` type Error struct { // Value is the underlying JavaScript error value. Value } ``` ### func (Error) Error ``` func (e Error) Error() string ``` Error implements the error interface. type Func --------- Func is a wrapped Go function to be called by JavaScript. ``` type Func struct { Value // the JavaScript function that invokes the Go function // contains filtered or unexported fields } ``` ### func FuncOf ``` func FuncOf(fn func(this Value, args []Value) any) Func ``` FuncOf returns a function to be used by JavaScript. The Go function fn is called with the value of JavaScript's "this" keyword and the arguments of the invocation. 
The return value of the invocation is the result of the Go function mapped back to JavaScript according to ValueOf. Invoking the wrapped Go function from JavaScript will pause the event loop and spawn a new goroutine. Other wrapped functions which are triggered during a call from Go to JavaScript get executed on the same goroutine. As a consequence, if one wrapped function blocks, JavaScript's event loop is blocked until that function returns. Hence, calling any async JavaScript API, which requires the event loop, like fetch (http.Client), will cause an immediate deadlock. Therefore a blocking function should explicitly start a new goroutine. Func.Release must be called to free up resources when the function will not be invoked any more. #### Example Code: ``` var cb js.Func cb = js.FuncOf(func(this js.Value, args []js.Value) any { fmt.Println("button clicked") cb.Release() // release the function if the button will not be clicked again return nil }) js.Global().Get("document").Call("getElementById", "myButton").Call("addEventListener", "click", cb) ``` ### func (Func) Release ``` func (c Func) Release() ``` Release frees up resources allocated for the function. The function must not be invoked after calling Release. It is allowed to call Release while the function is still running. type Type --------- Type represents the JavaScript type of a Value. ``` type Type int ``` ``` const ( TypeUndefined Type = iota TypeNull TypeBoolean TypeNumber TypeString TypeSymbol TypeObject TypeFunction ) ``` ### func (Type) String ``` func (t Type) String() string ``` type Value ---------- Value represents a JavaScript value. The zero value is the JavaScript value "undefined". Values can be checked for equality with the Equal method. ``` type Value struct { // contains filtered or unexported fields } ``` ### func Global ``` func Global() Value ``` Global returns the JavaScript global object, usually "window" or "global". ### func Null ``` func Null() Value ``` Null returns the JavaScript value "null". ### func Undefined ``` func Undefined() Value ``` Undefined returns the JavaScript value "undefined". ### func ValueOf ``` func ValueOf(x any) Value ``` ValueOf returns x as a JavaScript value: ``` | Go | JavaScript | | ---------------------- | ---------------------- | | js.Value | [its value] | | js.Func | function | | nil | null | | bool | boolean | | integers and floats | number | | string | string | | []interface{} | new array | | map[string]interface{} | new object | ``` Panics if x is not one of the expected types. ### func (Value) Bool ``` func (v Value) Bool() bool ``` Bool returns the value v as a bool. It panics if v is not a JavaScript boolean. ### func (Value) Call ``` func (v Value) Call(m string, args ...any) Value ``` Call does a JavaScript call to the method m of value v with the given arguments. It panics if v has no method m. The arguments get mapped to JavaScript values according to the ValueOf function. ### func (Value) Delete ``` func (v Value) Delete(p string) ``` Delete deletes the JavaScript property p of value v. It panics if v is not a JavaScript object. ### func (Value) Equal ``` func (v Value) Equal(w Value) bool ``` Equal reports whether v and w are equal according to JavaScript's === operator. ### func (Value) Float ``` func (v Value) Float() float64 ``` Float returns the value v as a float64. It panics if v is not a JavaScript number. ### func (Value) Get ``` func (v Value) Get(p string) Value ``` Get returns the JavaScript property p of value v. 
It panics if v is not a JavaScript object. ### func (Value) Index ``` func (v Value) Index(i int) Value ``` Index returns JavaScript index i of value v. It panics if v is not a JavaScript object. ### func (Value) InstanceOf ``` func (v Value) InstanceOf(t Value) bool ``` InstanceOf reports whether v is an instance of type t according to JavaScript's instanceof operator. ### func (Value) Int ``` func (v Value) Int() int ``` Int returns the value v truncated to an int. It panics if v is not a JavaScript number. ### func (Value) Invoke ``` func (v Value) Invoke(args ...any) Value ``` Invoke does a JavaScript call of the value v with the given arguments. It panics if v is not a JavaScript function. The arguments get mapped to JavaScript values according to the ValueOf function. ### func (Value) IsNaN ``` func (v Value) IsNaN() bool ``` IsNaN reports whether v is the JavaScript value "NaN". ### func (Value) IsNull ``` func (v Value) IsNull() bool ``` IsNull reports whether v is the JavaScript value "null". ### func (Value) IsUndefined ``` func (v Value) IsUndefined() bool ``` IsUndefined reports whether v is the JavaScript value "undefined". ### func (Value) Length ``` func (v Value) Length() int ``` Length returns the JavaScript property "length" of v. It panics if v is not a JavaScript object. ### func (Value) New ``` func (v Value) New(args ...any) Value ``` New uses JavaScript's "new" operator with value v as constructor and the given arguments. It panics if v is not a JavaScript function. The arguments get mapped to JavaScript values according to the ValueOf function. ### func (Value) Set ``` func (v Value) Set(p string, x any) ``` Set sets the JavaScript property p of value v to ValueOf(x). It panics if v is not a JavaScript object. ### func (Value) SetIndex ``` func (v Value) SetIndex(i int, x any) ``` SetIndex sets the JavaScript index i of value v to ValueOf(x). It panics if v is not a JavaScript object. ### func (Value) String ``` func (v Value) String() string ``` String returns the value v as a string. String is a special case because of Go's String method convention. Unlike the other getters, it does not panic if v's Type is not TypeString. Instead, it returns a string of the form "<T>" or "<T: V>" where T is v's type and V is a string representation of v's value. ### func (Value) Truthy ``` func (v Value) Truthy() bool ``` Truthy returns the JavaScript "truthiness" of the value v. In JavaScript, false, 0, "", null, undefined, and NaN are "falsy", and everything else is "truthy". See <https://developer.mozilla.org/en-US/docs/Glossary/Truthy>. ### func (Value) Type ``` func (v Value) Type() Type ``` Type returns the JavaScript type of the value v. It is similar to JavaScript's typeof operator, except that it returns TypeNull instead of TypeObject for null. type ValueError --------------- A ValueError occurs when a Value method is invoked on a Value that does not support it. Such cases are documented in the description of each method. ``` type ValueError struct { Method string Type Type } ``` ### func (\*ValueError) Error ``` func (e *ValueError) Error() string ``` go Package expvar Package expvar =============== * `import "expvar"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package expvar provides a standardized interface to public variables, such as operation counters in servers. It exposes these variables via HTTP at /debug/vars in JSON format. Operations to set or modify these public variables are atomic. 
In addition to adding the HTTP handler, this package registers the following variables: ``` cmdline os.Args memstats runtime.Memstats ``` The package is sometimes only imported for the side effect of registering its HTTP handler and the above variables. To use it this way, link this package into your program: ``` import _ "expvar" ``` Index ----- * [func Do(f func(KeyValue))](#Do) * [func Handler() http.Handler](#Handler) * [func Publish(name string, v Var)](#Publish) * [type Float](#Float) * [func NewFloat(name string) \*Float](#NewFloat) * [func (v \*Float) Add(delta float64)](#Float.Add) * [func (v \*Float) Set(value float64)](#Float.Set) * [func (v \*Float) String() string](#Float.String) * [func (v \*Float) Value() float64](#Float.Value) * [type Func](#Func) * [func (f Func) String() string](#Func.String) * [func (f Func) Value() any](#Func.Value) * [type Int](#Int) * [func NewInt(name string) \*Int](#NewInt) * [func (v \*Int) Add(delta int64)](#Int.Add) * [func (v \*Int) Set(value int64)](#Int.Set) * [func (v \*Int) String() string](#Int.String) * [func (v \*Int) Value() int64](#Int.Value) * [type KeyValue](#KeyValue) * [type Map](#Map) * [func NewMap(name string) \*Map](#NewMap) * [func (v \*Map) Add(key string, delta int64)](#Map.Add) * [func (v \*Map) AddFloat(key string, delta float64)](#Map.AddFloat) * [func (v \*Map) Delete(key string)](#Map.Delete) * [func (v \*Map) Do(f func(KeyValue))](#Map.Do) * [func (v \*Map) Get(key string) Var](#Map.Get) * [func (v \*Map) Init() \*Map](#Map.Init) * [func (v \*Map) Set(key string, av Var)](#Map.Set) * [func (v \*Map) String() string](#Map.String) * [type String](#String) * [func NewString(name string) \*String](#NewString) * [func (v \*String) Set(value string)](#String.Set) * [func (v \*String) String() string](#String.String) * [func (v \*String) Value() string](#String.Value) * [type Var](#Var) * [func Get(name string) Var](#Get) ### Package files expvar.go func Do ------- ``` func Do(f func(KeyValue)) ``` Do calls f for each exported variable. The global variable map is locked during the iteration, but existing entries may be concurrently updated. func Handler 1.8 ---------------- ``` func Handler() http.Handler ``` Handler returns the expvar HTTP Handler. This is only needed to install the handler in a non-standard location. func Publish ------------ ``` func Publish(name string, v Var) ``` Publish declares a named exported variable. This should be called from a package's init function when it creates its Vars. If the name is already registered then this will log.Panic. type Float ---------- Float is a 64-bit float variable that satisfies the Var interface. ``` type Float struct { // contains filtered or unexported fields } ``` ### func NewFloat ``` func NewFloat(name string) *Float ``` ### func (\*Float) Add ``` func (v *Float) Add(delta float64) ``` Add adds delta to v. ### func (\*Float) Set ``` func (v *Float) Set(value float64) ``` Set sets v to value. ### func (\*Float) String ``` func (v *Float) String() string ``` ### func (\*Float) Value 1.8 ``` func (v *Float) Value() float64 ``` type Func --------- Func implements Var by calling the function and formatting the returned value using JSON. ``` type Func func() any ``` ### func (Func) String ``` func (f Func) String() string ``` ### func (Func) Value 1.8 ``` func (f Func) Value() any ``` type Int -------- Int is a 64-bit integer variable that satisfies the Var interface. 
``` type Int struct { // contains filtered or unexported fields } ``` ### func NewInt ``` func NewInt(name string) *Int ``` ### func (\*Int) Add ``` func (v *Int) Add(delta int64) ``` ### func (\*Int) Set ``` func (v *Int) Set(value int64) ``` ### func (\*Int) String ``` func (v *Int) String() string ``` ### func (\*Int) Value 1.8 ``` func (v *Int) Value() int64 ``` type KeyValue ------------- KeyValue represents a single entry in a Map. ``` type KeyValue struct { Key string Value Var } ``` type Map -------- Map is a string-to-Var map variable that satisfies the Var interface. ``` type Map struct { // contains filtered or unexported fields } ``` ### func NewMap ``` func NewMap(name string) *Map ``` ### func (\*Map) Add ``` func (v *Map) Add(key string, delta int64) ``` Add adds delta to the \*Int value stored under the given map key. ### func (\*Map) AddFloat ``` func (v *Map) AddFloat(key string, delta float64) ``` AddFloat adds delta to the \*Float value stored under the given map key. ### func (\*Map) Delete 1.12 ``` func (v *Map) Delete(key string) ``` Delete deletes the given key from the map. ### func (\*Map) Do ``` func (v *Map) Do(f func(KeyValue)) ``` Do calls f for each entry in the map. The map is locked during the iteration, but existing entries may be concurrently updated. ### func (\*Map) Get ``` func (v *Map) Get(key string) Var ``` ### func (\*Map) Init ``` func (v *Map) Init() *Map ``` Init removes all keys from the map. ### func (\*Map) Set ``` func (v *Map) Set(key string, av Var) ``` ### func (\*Map) String ``` func (v *Map) String() string ``` type String ----------- String is a string variable, and satisfies the Var interface. ``` type String struct { // contains filtered or unexported fields } ``` ### func NewString ``` func NewString(name string) *String ``` ### func (\*String) Set ``` func (v *String) Set(value string) ``` ### func (\*String) String ``` func (v *String) String() string ``` String implements the Var interface. To get the unquoted string use Value. ### func (\*String) Value 1.8 ``` func (v *String) Value() string ``` type Var -------- Var is an abstract type for all exported variables. ``` type Var interface { // String returns a valid JSON value for the variable. // Types with String methods that do not return valid JSON // (such as time.Time) must not be used as a Var. String() string } ``` ### func Get ``` func Get(name string) Var ``` Get retrieves a named exported variable. It returns nil if the name has not been registered. go Package builtin Package builtin ================ * `import "builtin"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package builtin provides documentation for Go's predeclared identifiers. The items documented here are not actually in package builtin but their descriptions here allow godoc to present documentation for the language's special identifiers. 
Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func append(slice []Type, elems ...Type) []Type](#append) * [func cap(v Type) int](#cap) * [func close(c chan<- Type)](#close) * [func complex(r, i FloatType) ComplexType](#complex) * [func copy(dst, src []Type) int](#copy) * [func delete(m map[Type]Type1, key Type)](#delete) * [func imag(c ComplexType) FloatType](#imag) * [func len(v Type) int](#len) * [func make(t Type, size ...IntegerType) Type](#make) * [func new(Type) \*Type](#new) * [func panic(v any)](#panic) * [func print(args ...Type)](#print) * [func println(args ...Type)](#println) * [func real(c ComplexType) FloatType](#real) * [func recover() any](#recover) * [type ComplexType](#ComplexType) * [type FloatType](#FloatType) * [type IntegerType](#IntegerType) * [type Type](#Type) * [type Type1](#Type1) * [type any](#any) * [type bool](#bool) * [type byte](#byte) * [type comparable](#comparable) * [type complex128](#complex128) * [type complex64](#complex64) * [type error](#error) * [type float32](#float32) * [type float64](#float64) * [type int](#int) * [type int16](#int16) * [type int32](#int32) * [type int64](#int64) * [type int8](#int8) * [type rune](#rune) * [type string](#string) * [type uint](#uint) * [type uint16](#uint16) * [type uint32](#uint32) * [type uint64](#uint64) * [type uint8](#uint8) * [type uintptr](#uintptr) ### Package files builtin.go Constants --------- true and false are the two untyped boolean values. ``` const ( true = 0 == 0 // Untyped bool. false = 0 != 0 // Untyped bool. ) ``` iota is a predeclared identifier representing the untyped integer ordinal number of the current const specification in a (usually parenthesized) const declaration. It is zero-indexed. ``` const iota = 0 // Untyped int. ``` Variables --------- nil is a predeclared identifier representing the zero value for a pointer, channel, func, interface, map, or slice type. ``` var nil Type // Type must be a pointer, channel, func, interface, map, or slice type ``` func append ----------- ``` func append(slice []Type, elems ...Type) []Type ``` The append built-in function appends elements to the end of a slice. If it has sufficient capacity, the destination is resliced to accommodate the new elements. If it does not, a new underlying array will be allocated. Append returns the updated slice. It is therefore necessary to store the result of append, often in the variable holding the slice itself: ``` slice = append(slice, elem1, elem2) slice = append(slice, anotherSlice...) ``` As a special case, it is legal to append a string to a byte slice, like this: ``` slice = append([]byte("hello "), "world"...) ``` func cap -------- ``` func cap(v Type) int ``` The cap built-in function returns the capacity of v, according to its type: ``` Array: the number of elements in v (same as len(v)). Pointer to array: the number of elements in *v (same as len(v)). Slice: the maximum length the slice can reach when resliced; if v is nil, cap(v) is zero. Channel: the channel buffer capacity, in units of elements; if v is nil, cap(v) is zero. ``` For some arguments, such as a simple array expression, the result can be a constant. See the Go language specification's "Length and capacity" section for details. func close ---------- ``` func close(c chan<- Type) ``` The close built-in function closes a channel, which must be either bidirectional or send-only. 
It should be executed only by the sender, never the receiver, and has the effect of shutting down the channel after the last sent value is received. After the last value has been received from a closed channel c, any receive from c will succeed without blocking, returning the zero value for the channel element. The form ``` x, ok := <-c ``` will also set ok to false for a closed and empty channel. func complex ------------ ``` func complex(r, i FloatType) ComplexType ``` The complex built-in function constructs a complex value from two floating-point values. The real and imaginary parts must be of the same size, either float32 or float64 (or assignable to them), and the return value will be the corresponding complex type (complex64 for float32, complex128 for float64). func copy --------- ``` func copy(dst, src []Type) int ``` The copy built-in function copies elements from a source slice into a destination slice. (As a special case, it also will copy bytes from a string to a slice of bytes.) The source and destination may overlap. Copy returns the number of elements copied, which will be the minimum of len(src) and len(dst). func delete ----------- ``` func delete(m map[Type]Type1, key Type) ``` The delete built-in function deletes the element with the specified key (m[key]) from the map. If m is nil or there is no such element, delete is a no-op. func imag --------- ``` func imag(c ComplexType) FloatType ``` The imag built-in function returns the imaginary part of the complex number c. The return value will be floating point type corresponding to the type of c. func len -------- ``` func len(v Type) int ``` The len built-in function returns the length of v, according to its type: ``` Array: the number of elements in v. Pointer to array: the number of elements in *v (even if v is nil). Slice, or map: the number of elements in v; if v is nil, len(v) is zero. String: the number of bytes in v. Channel: the number of elements queued (unread) in the channel buffer; if v is nil, len(v) is zero. ``` For some arguments, such as a string literal or a simple array expression, the result can be a constant. See the Go language specification's "Length and capacity" section for details. func make --------- ``` func make(t Type, size ...IntegerType) Type ``` The make built-in function allocates and initializes an object of type slice, map, or chan (only). Like new, the first argument is a type, not a value. Unlike new, make's return type is the same as the type of its argument, not a pointer to it. The specification of the result depends on the type: ``` Slice: The size specifies the length. The capacity of the slice is equal to its length. A second integer argument may be provided to specify a different capacity; it must be no smaller than the length. For example, make([]int, 0, 10) allocates an underlying array of size 10 and returns a slice of length 0 and capacity 10 that is backed by this underlying array. Map: An empty map is allocated with enough space to hold the specified number of elements. The size may be omitted, in which case a small starting size is allocated. Channel: The channel's buffer is initialized with the specified buffer capacity. If zero, or the size is omitted, the channel is unbuffered. ``` func new -------- ``` func new(Type) *Type ``` The new built-in function allocates memory. The first argument is a type, not a value, and the value returned is a pointer to a newly allocated zero value of that type. 
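As a quick contrast of make and new as described above, a small sketch (not part of the original documentation):

```
package main

import "fmt"

func main() {
	s := make([]int, 0, 10)   // slice: length 0, capacity 10
	m := make(map[string]int) // empty map with a small starting size
	c := make(chan int, 4)    // channel with a buffer of 4 elements

	p := new(int) // *int pointing at a newly allocated zero value

	fmt.Println(len(s), cap(s), len(m), cap(c), *p) // 0 10 0 4 0
}
```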
func panic ---------- ``` func panic(v any) ``` The panic built-in function stops normal execution of the current goroutine. When a function F calls panic, normal execution of F stops immediately. Any functions whose execution was deferred by F are run in the usual way, and then F returns to its caller. To the caller G, the invocation of F then behaves like a call to panic, terminating G's execution and running any deferred functions. This continues until all functions in the executing goroutine have stopped, in reverse order. At that point, the program is terminated with a non-zero exit code. This termination sequence is called panicking and can be controlled by the built-in function recover. func print ---------- ``` func print(args ...Type) ``` The print built-in function formats its arguments in an implementation-specific way and writes the result to standard error. Print is useful for bootstrapping and debugging; it is not guaranteed to stay in the language. func println ------------ ``` func println(args ...Type) ``` The println built-in function formats its arguments in an implementation-specific way and writes the result to standard error. Spaces are always added between arguments and a newline is appended. Println is useful for bootstrapping and debugging; it is not guaranteed to stay in the language. func real --------- ``` func real(c ComplexType) FloatType ``` The real built-in function returns the real part of the complex number c. The return value will be floating point type corresponding to the type of c. func recover ------------ ``` func recover() any ``` The recover built-in function allows a program to manage behavior of a panicking goroutine. Executing a call to recover inside a deferred function (but not any function called by it) stops the panicking sequence by restoring normal execution and retrieves the error value passed to the call of panic. If recover is called outside the deferred function it will not stop a panicking sequence. In this case, or when the goroutine is not panicking, or if the argument supplied to panic was nil, recover returns nil. Thus the return value from recover reports whether the goroutine is panicking. type ComplexType ---------------- ComplexType is here for the purposes of documentation only. It is a stand-in for either complex type: complex64 or complex128. ``` type ComplexType complex64 ``` type FloatType -------------- FloatType is here for the purposes of documentation only. It is a stand-in for either float type: float32 or float64. ``` type FloatType float32 ``` type IntegerType ---------------- IntegerType is here for the purposes of documentation only. It is a stand-in for any integer type: int, uint, int8 etc. ``` type IntegerType int ``` type Type --------- Type is here for the purposes of documentation only. It is a stand-in for any Go type, but represents the same type for any given function invocation. ``` type Type int ``` type Type1 ---------- Type1 is here for the purposes of documentation only. It is a stand-in for any Go type, but represents the same type for any given function invocation. ``` type Type1 int ``` type any -------- any is an alias for interface{} and is equivalent to interface{} in all ways. ``` type any = interface{} ``` type bool --------- bool is the set of boolean values, true and false. ``` type bool bool ``` type byte --------- byte is an alias for uint8 and is equivalent to uint8 in all ways. It is used, by convention, to distinguish byte values from 8-bit unsigned integer values. 
``` type byte = uint8 ``` type comparable --------------- comparable is an interface that is implemented by all comparable types (booleans, numbers, strings, pointers, channels, arrays of comparable types, structs whose fields are all comparable types). The comparable interface may only be used as a type parameter constraint, not as the type of a variable. ``` type comparable interface{ comparable } ``` type complex128 --------------- complex128 is the set of all complex numbers with float64 real and imaginary parts. ``` type complex128 complex128 ``` type complex64 -------------- complex64 is the set of all complex numbers with float32 real and imaginary parts. ``` type complex64 complex64 ``` type error ---------- The error built-in interface type is the conventional interface for representing an error condition, with the nil value representing no error. ``` type error interface { Error() string } ``` type float32 ------------ float32 is the set of all IEEE-754 32-bit floating-point numbers. ``` type float32 float32 ``` type float64 ------------ float64 is the set of all IEEE-754 64-bit floating-point numbers. ``` type float64 float64 ``` type int -------- int is a signed integer type that is at least 32 bits in size. It is a distinct type, however, and not an alias for, say, int32. ``` type int int ``` type int16 ---------- int16 is the set of all signed 16-bit integers. Range: -32768 through 32767. ``` type int16 int16 ``` type int32 ---------- int32 is the set of all signed 32-bit integers. Range: -2147483648 through 2147483647. ``` type int32 int32 ``` type int64 ---------- int64 is the set of all signed 64-bit integers. Range: -9223372036854775808 through 9223372036854775807. ``` type int64 int64 ``` type int8 --------- int8 is the set of all signed 8-bit integers. Range: -128 through 127. ``` type int8 int8 ``` type rune --------- rune is an alias for int32 and is equivalent to int32 in all ways. It is used, by convention, to distinguish character values from integer values. ``` type rune = int32 ``` type string ----------- string is the set of all strings of 8-bit bytes, conventionally but not necessarily representing UTF-8-encoded text. A string may be empty, but not nil. Values of string type are immutable. ``` type string string ``` type uint --------- uint is an unsigned integer type that is at least 32 bits in size. It is a distinct type, however, and not an alias for, say, uint32. ``` type uint uint ``` type uint16 ----------- uint16 is the set of all unsigned 16-bit integers. Range: 0 through 65535. ``` type uint16 uint16 ``` type uint32 ----------- uint32 is the set of all unsigned 32-bit integers. Range: 0 through 4294967295. ``` type uint32 uint32 ``` type uint64 ----------- uint64 is the set of all unsigned 64-bit integers. Range: 0 through 18446744073709551615. ``` type uint64 uint64 ``` type uint8 ---------- uint8 is the set of all unsigned 8-bit integers. Range: 0 through 255. ``` type uint8 uint8 ``` type uintptr ------------ uintptr is an integer type that is large enough to hold the bit pattern of any pointer. ``` type uintptr uintptr ```
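Tying together the descriptions of panic, recover, and deferred functions above, a minimal sketch (not part of the original documentation):

```
package main

import "fmt"

func safeDiv(a, b int) (q int, err error) {
	defer func() {
		if r := recover(); r != nil { // recover stops the panicking sequence
			err = fmt.Errorf("recovered: %v", r)
		}
	}()
	return a / b, nil // panics with a run-time error when b == 0
}

func main() {
	fmt.Println(safeDiv(10, 2)) // 5 <nil>
	fmt.Println(safeDiv(1, 0))  // 0 recovered: runtime error: integer divide by zero
}
```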
go Package pe Package pe =========== * `import "debug/pe"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package pe implements access to PE (Microsoft Windows Portable Executable) files. ### Security This package is not designed to be hardened against adversarial inputs, and is outside the scope of <https://go.dev/security/policy>. In particular, only basic validation is done when parsing object files. As such, care should be taken when parsing untrusted inputs, as parsing malformed files may consume significant resources, or cause panics. Index ----- * [Constants](#pkg-constants) * [type COFFSymbol](#COFFSymbol) * [func (sym \*COFFSymbol) FullName(st StringTable) (string, error)](#COFFSymbol.FullName) * [type COFFSymbolAuxFormat5](#COFFSymbolAuxFormat5) * [type DataDirectory](#DataDirectory) * [type File](#File) * [func NewFile(r io.ReaderAt) (\*File, error)](#NewFile) * [func Open(name string) (\*File, error)](#Open) * [func (f \*File) COFFSymbolReadSectionDefAux(idx int) (\*COFFSymbolAuxFormat5, error)](#File.COFFSymbolReadSectionDefAux) * [func (f \*File) Close() error](#File.Close) * [func (f \*File) DWARF() (\*dwarf.Data, error)](#File.DWARF) * [func (f \*File) ImportedLibraries() ([]string, error)](#File.ImportedLibraries) * [func (f \*File) ImportedSymbols() ([]string, error)](#File.ImportedSymbols) * [func (f \*File) Section(name string) \*Section](#File.Section) * [type FileHeader](#FileHeader) * [type FormatError](#FormatError) * [func (e \*FormatError) Error() string](#FormatError.Error) * [type ImportDirectory](#ImportDirectory) * [type OptionalHeader32](#OptionalHeader32) * [type OptionalHeader64](#OptionalHeader64) * [type Reloc](#Reloc) * [type Section](#Section) * [func (s \*Section) Data() ([]byte, error)](#Section.Data) * [func (s \*Section) Open() io.ReadSeeker](#Section.Open) * [type SectionHeader](#SectionHeader) * [type SectionHeader32](#SectionHeader32) * [type StringTable](#StringTable) * [func (st StringTable) String(start uint32) (string, error)](#StringTable.String) * [type Symbol](#Symbol) ### Package files file.go pe.go section.go string.go symbol.go Constants --------- ``` const ( IMAGE_FILE_MACHINE_UNKNOWN = 0x0 IMAGE_FILE_MACHINE_AM33 = 0x1d3 IMAGE_FILE_MACHINE_AMD64 = 0x8664 IMAGE_FILE_MACHINE_ARM = 0x1c0 IMAGE_FILE_MACHINE_ARMNT = 0x1c4 IMAGE_FILE_MACHINE_ARM64 = 0xaa64 IMAGE_FILE_MACHINE_EBC = 0xebc IMAGE_FILE_MACHINE_I386 = 0x14c IMAGE_FILE_MACHINE_IA64 = 0x200 IMAGE_FILE_MACHINE_LOONGARCH32 = 0x6232 IMAGE_FILE_MACHINE_LOONGARCH64 = 0x6264 IMAGE_FILE_MACHINE_M32R = 0x9041 IMAGE_FILE_MACHINE_MIPS16 = 0x266 IMAGE_FILE_MACHINE_MIPSFPU = 0x366 IMAGE_FILE_MACHINE_MIPSFPU16 = 0x466 IMAGE_FILE_MACHINE_POWERPC = 0x1f0 IMAGE_FILE_MACHINE_POWERPCFP = 0x1f1 IMAGE_FILE_MACHINE_R4000 = 0x166 IMAGE_FILE_MACHINE_SH3 = 0x1a2 IMAGE_FILE_MACHINE_SH3DSP = 0x1a3 IMAGE_FILE_MACHINE_SH4 = 0x1a6 IMAGE_FILE_MACHINE_SH5 = 0x1a8 IMAGE_FILE_MACHINE_THUMB = 0x1c2 IMAGE_FILE_MACHINE_WCEMIPSV2 = 0x169 IMAGE_FILE_MACHINE_RISCV32 = 0x5032 IMAGE_FILE_MACHINE_RISCV64 = 0x5064 IMAGE_FILE_MACHINE_RISCV128 = 0x5128 ) ``` IMAGE\_DIRECTORY\_ENTRY constants ``` const ( IMAGE_DIRECTORY_ENTRY_EXPORT = 0 IMAGE_DIRECTORY_ENTRY_IMPORT = 1 IMAGE_DIRECTORY_ENTRY_RESOURCE = 2 IMAGE_DIRECTORY_ENTRY_EXCEPTION = 3 IMAGE_DIRECTORY_ENTRY_SECURITY = 4 IMAGE_DIRECTORY_ENTRY_BASERELOC = 5 IMAGE_DIRECTORY_ENTRY_DEBUG = 6 IMAGE_DIRECTORY_ENTRY_ARCHITECTURE = 7 IMAGE_DIRECTORY_ENTRY_GLOBALPTR = 8 IMAGE_DIRECTORY_ENTRY_TLS = 9 IMAGE_DIRECTORY_ENTRY_LOAD_CONFIG = 10 
IMAGE_DIRECTORY_ENTRY_BOUND_IMPORT = 11 IMAGE_DIRECTORY_ENTRY_IAT = 12 IMAGE_DIRECTORY_ENTRY_DELAY_IMPORT = 13 IMAGE_DIRECTORY_ENTRY_COM_DESCRIPTOR = 14 ) ``` Values of IMAGE\_FILE\_HEADER.Characteristics. These can be combined together. ``` const ( IMAGE_FILE_RELOCS_STRIPPED = 0x0001 IMAGE_FILE_EXECUTABLE_IMAGE = 0x0002 IMAGE_FILE_LINE_NUMS_STRIPPED = 0x0004 IMAGE_FILE_LOCAL_SYMS_STRIPPED = 0x0008 IMAGE_FILE_AGGRESIVE_WS_TRIM = 0x0010 IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020 IMAGE_FILE_BYTES_REVERSED_LO = 0x0080 IMAGE_FILE_32BIT_MACHINE = 0x0100 IMAGE_FILE_DEBUG_STRIPPED = 0x0200 IMAGE_FILE_REMOVABLE_RUN_FROM_SWAP = 0x0400 IMAGE_FILE_NET_RUN_FROM_SWAP = 0x0800 IMAGE_FILE_SYSTEM = 0x1000 IMAGE_FILE_DLL = 0x2000 IMAGE_FILE_UP_SYSTEM_ONLY = 0x4000 IMAGE_FILE_BYTES_REVERSED_HI = 0x8000 ) ``` OptionalHeader64.Subsystem and OptionalHeader32.Subsystem values. ``` const ( IMAGE_SUBSYSTEM_UNKNOWN = 0 IMAGE_SUBSYSTEM_NATIVE = 1 IMAGE_SUBSYSTEM_WINDOWS_GUI = 2 IMAGE_SUBSYSTEM_WINDOWS_CUI = 3 IMAGE_SUBSYSTEM_OS2_CUI = 5 IMAGE_SUBSYSTEM_POSIX_CUI = 7 IMAGE_SUBSYSTEM_NATIVE_WINDOWS = 8 IMAGE_SUBSYSTEM_WINDOWS_CE_GUI = 9 IMAGE_SUBSYSTEM_EFI_APPLICATION = 10 IMAGE_SUBSYSTEM_EFI_BOOT_SERVICE_DRIVER = 11 IMAGE_SUBSYSTEM_EFI_RUNTIME_DRIVER = 12 IMAGE_SUBSYSTEM_EFI_ROM = 13 IMAGE_SUBSYSTEM_XBOX = 14 IMAGE_SUBSYSTEM_WINDOWS_BOOT_APPLICATION = 16 ) ``` OptionalHeader64.DllCharacteristics and OptionalHeader32.DllCharacteristics values. These can be combined together. ``` const ( IMAGE_DLLCHARACTERISTICS_HIGH_ENTROPY_VA = 0x0020 IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE = 0x0040 IMAGE_DLLCHARACTERISTICS_FORCE_INTEGRITY = 0x0080 IMAGE_DLLCHARACTERISTICS_NX_COMPAT = 0x0100 IMAGE_DLLCHARACTERISTICS_NO_ISOLATION = 0x0200 IMAGE_DLLCHARACTERISTICS_NO_SEH = 0x0400 IMAGE_DLLCHARACTERISTICS_NO_BIND = 0x0800 IMAGE_DLLCHARACTERISTICS_APPCONTAINER = 0x1000 IMAGE_DLLCHARACTERISTICS_WDM_DRIVER = 0x2000 IMAGE_DLLCHARACTERISTICS_GUARD_CF = 0x4000 IMAGE_DLLCHARACTERISTICS_TERMINAL_SERVER_AWARE = 0x8000 ) ``` Section characteristics flags. ``` const ( IMAGE_SCN_CNT_CODE = 0x00000020 IMAGE_SCN_CNT_INITIALIZED_DATA = 0x00000040 IMAGE_SCN_CNT_UNINITIALIZED_DATA = 0x00000080 IMAGE_SCN_LNK_COMDAT = 0x00001000 IMAGE_SCN_MEM_DISCARDABLE = 0x02000000 IMAGE_SCN_MEM_EXECUTE = 0x20000000 IMAGE_SCN_MEM_READ = 0x40000000 IMAGE_SCN_MEM_WRITE = 0x80000000 ) ``` These constants make up the possible values for the 'Selection' field in an AuxFormat5. ``` const ( IMAGE_COMDAT_SELECT_NODUPLICATES = 1 IMAGE_COMDAT_SELECT_ANY = 2 IMAGE_COMDAT_SELECT_SAME_SIZE = 3 IMAGE_COMDAT_SELECT_EXACT_MATCH = 4 IMAGE_COMDAT_SELECT_ASSOCIATIVE = 5 IMAGE_COMDAT_SELECT_LARGEST = 6 ) ``` ``` const COFFSymbolSize = 18 ``` type COFFSymbol 1.1 ------------------- COFFSymbol represents a single COFF symbol table record. ``` type COFFSymbol struct { Name [8]uint8 Value uint32 SectionNumber int16 Type uint16 StorageClass uint8 NumberOfAuxSymbols uint8 } ``` ### func (\*COFFSymbol) FullName 1.8 ``` func (sym *COFFSymbol) FullName(st StringTable) (string, error) ``` FullName finds the real name of symbol sym. Normally the name is stored in sym.Name, but if it is longer than 8 characters, it is stored in the COFF string table st instead. type COFFSymbolAuxFormat5 1.19 ------------------------------ COFFSymbolAuxFormat5 describes the expected form of an aux symbol attached to a section definition symbol. The PE format defines a number of different aux symbol formats: format 1 for function definitions, format 2 for .bf and .ef symbols, and so on. 
Format 5 holds extra info associated with a section definition, including number of relocations + line numbers, as well as COMDAT info. See <https://docs.microsoft.com/en-us/windows/win32/debug/pe-format#auxiliary-format-5-section-definitions> for more on what's going on here. ``` type COFFSymbolAuxFormat5 struct { Size uint32 NumRelocs uint16 NumLineNumbers uint16 Checksum uint32 SecNum uint16 Selection uint8 // contains filtered or unexported fields } ``` type DataDirectory 1.3 ---------------------- ``` type DataDirectory struct { VirtualAddress uint32 Size uint32 } ``` type File --------- A File represents an open PE file. ``` type File struct { FileHeader OptionalHeader any // of type *OptionalHeader32 or *OptionalHeader64; added in Go 1.3 Sections []*Section Symbols []*Symbol // COFF symbols with auxiliary symbol records removed; added in Go 1.1 COFFSymbols []COFFSymbol // all COFF symbols (including auxiliary symbol records); added in Go 1.8 StringTable StringTable // Go 1.8 // contains filtered or unexported fields } ``` ### func NewFile ``` func NewFile(r io.ReaderAt) (*File, error) ``` NewFile creates a new File for accessing a PE binary in an underlying reader. ### func Open ``` func Open(name string) (*File, error) ``` Open opens the named file using os.Open and prepares it for use as a PE binary. ### func (\*File) COFFSymbolReadSectionDefAux 1.19 ``` func (f *File) COFFSymbolReadSectionDefAux(idx int) (*COFFSymbolAuxFormat5, error) ``` COFFSymbolReadSectionDefAux returns a blob of auxiliary information (including COMDAT info) for a section definition symbol. Here 'idx' is the index of a section symbol in the main COFFSymbol array for the File. Return value is a pointer to the appropriate aux symbol struct. For more info, see: auxiliary symbols: <https://docs.microsoft.com/en-us/windows/win32/debug/pe-format#auxiliary-symbol-records> COMDAT sections: <https://docs.microsoft.com/en-us/windows/win32/debug/pe-format#comdat-sections-object-only> auxiliary info for section definitions: <https://docs.microsoft.com/en-us/windows/win32/debug/pe-format#auxiliary-format-5-section-definitions> ### func (\*File) Close ``` func (f *File) Close() error ``` Close closes the File. If the File was created using NewFile directly instead of Open, Close has no effect. ### func (\*File) DWARF ``` func (f *File) DWARF() (*dwarf.Data, error) ``` ### func (\*File) ImportedLibraries ``` func (f *File) ImportedLibraries() ([]string, error) ``` ImportedLibraries returns the names of all libraries referred to by the binary f that are expected to be linked with the binary at dynamic link time. ### func (\*File) ImportedSymbols ``` func (f *File) ImportedSymbols() ([]string, error) ``` ImportedSymbols returns the names of all symbols referred to by the binary f that are expected to be satisfied by other libraries at dynamic load time. It does not return weak symbols. ### func (\*File) Section ``` func (f *File) Section(name string) *Section ``` Section returns the first section with the given name, or nil if no such section exists. type FileHeader --------------- ``` type FileHeader struct { Machine uint16 NumberOfSections uint16 TimeDateStamp uint32 PointerToSymbolTable uint32 NumberOfSymbols uint32 SizeOfOptionalHeader uint16 Characteristics uint16 } ``` type FormatError ---------------- FormatError is unused. The type is retained for compatibility. 
``` type FormatError struct { } ``` ### func (\*FormatError) Error ``` func (e *FormatError) Error() string ``` type ImportDirectory -------------------- ``` type ImportDirectory struct { OriginalFirstThunk uint32 TimeDateStamp uint32 ForwarderChain uint32 Name uint32 FirstThunk uint32 // contains filtered or unexported fields } ``` type OptionalHeader32 1.3 ------------------------- ``` type OptionalHeader32 struct { Magic uint16 MajorLinkerVersion uint8 MinorLinkerVersion uint8 SizeOfCode uint32 SizeOfInitializedData uint32 SizeOfUninitializedData uint32 AddressOfEntryPoint uint32 BaseOfCode uint32 BaseOfData uint32 ImageBase uint32 SectionAlignment uint32 FileAlignment uint32 MajorOperatingSystemVersion uint16 MinorOperatingSystemVersion uint16 MajorImageVersion uint16 MinorImageVersion uint16 MajorSubsystemVersion uint16 MinorSubsystemVersion uint16 Win32VersionValue uint32 SizeOfImage uint32 SizeOfHeaders uint32 CheckSum uint32 Subsystem uint16 DllCharacteristics uint16 SizeOfStackReserve uint32 SizeOfStackCommit uint32 SizeOfHeapReserve uint32 SizeOfHeapCommit uint32 LoaderFlags uint32 NumberOfRvaAndSizes uint32 DataDirectory [16]DataDirectory } ``` type OptionalHeader64 1.3 ------------------------- ``` type OptionalHeader64 struct { Magic uint16 MajorLinkerVersion uint8 MinorLinkerVersion uint8 SizeOfCode uint32 SizeOfInitializedData uint32 SizeOfUninitializedData uint32 AddressOfEntryPoint uint32 BaseOfCode uint32 ImageBase uint64 SectionAlignment uint32 FileAlignment uint32 MajorOperatingSystemVersion uint16 MinorOperatingSystemVersion uint16 MajorImageVersion uint16 MinorImageVersion uint16 MajorSubsystemVersion uint16 MinorSubsystemVersion uint16 Win32VersionValue uint32 SizeOfImage uint32 SizeOfHeaders uint32 CheckSum uint32 Subsystem uint16 DllCharacteristics uint16 SizeOfStackReserve uint64 SizeOfStackCommit uint64 SizeOfHeapReserve uint64 SizeOfHeapCommit uint64 LoaderFlags uint32 NumberOfRvaAndSizes uint32 DataDirectory [16]DataDirectory } ``` type Reloc 1.8 -------------- Reloc represents a PE COFF relocation. Each section contains its own relocation list. ``` type Reloc struct { VirtualAddress uint32 SymbolTableIndex uint32 Type uint16 } ``` type Section ------------ Section provides access to PE COFF section. ``` type Section struct { SectionHeader Relocs []Reloc // Go 1.8 // Embed ReaderAt for ReadAt method. // Do not embed SectionReader directly // to avoid having Read and Seek. // If a client wants Read and Seek it must use // Open() to avoid fighting over the seek offset // with other clients. io.ReaderAt // contains filtered or unexported fields } ``` ### func (\*Section) Data ``` func (s *Section) Data() ([]byte, error) ``` Data reads and returns the contents of the PE section s. ### func (\*Section) Open ``` func (s *Section) Open() io.ReadSeeker ``` Open returns a new ReadSeeker reading the PE section s. type SectionHeader ------------------ SectionHeader is similar to SectionHeader32 with Name field replaced by Go string. ``` type SectionHeader struct { Name string VirtualSize uint32 VirtualAddress uint32 Size uint32 Offset uint32 PointerToRelocations uint32 PointerToLineNumbers uint32 NumberOfRelocations uint16 NumberOfLineNumbers uint16 Characteristics uint32 } ``` type SectionHeader32 -------------------- SectionHeader32 represents real PE COFF section header. 
``` type SectionHeader32 struct { Name [8]uint8 VirtualSize uint32 VirtualAddress uint32 SizeOfRawData uint32 PointerToRawData uint32 PointerToRelocations uint32 PointerToLineNumbers uint32 NumberOfRelocations uint16 NumberOfLineNumbers uint16 Characteristics uint32 } ``` type StringTable 1.8 -------------------- StringTable is a COFF string table. ``` type StringTable []byte ``` ### func (StringTable) String 1.8 ``` func (st StringTable) String(start uint32) (string, error) ``` String extracts string from COFF string table st at offset start. type Symbol 1.1 --------------- Symbol is similar to COFFSymbol with Name field replaced by Go string. Symbol also does not have NumberOfAuxSymbols. ``` type Symbol struct { Name string Value uint32 SectionNumber int16 Type uint16 StorageClass uint8 } ``` go Package macho Package macho ============== * `import "debug/macho"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package macho implements access to Mach-O object files. ### Security This package is not designed to be hardened against adversarial inputs, and is outside the scope of <https://go.dev/security/policy>. In particular, only basic validation is done when parsing object files. As such, care should be taken when parsing untrusted inputs, as parsing malformed files may consume significant resources, or cause panics. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [type Cpu](#Cpu) * [func (i Cpu) GoString() string](#Cpu.GoString) * [func (i Cpu) String() string](#Cpu.String) * [type Dylib](#Dylib) * [type DylibCmd](#DylibCmd) * [type Dysymtab](#Dysymtab) * [type DysymtabCmd](#DysymtabCmd) * [type FatArch](#FatArch) * [type FatArchHeader](#FatArchHeader) * [type FatFile](#FatFile) * [func NewFatFile(r io.ReaderAt) (\*FatFile, error)](#NewFatFile) * [func OpenFat(name string) (\*FatFile, error)](#OpenFat) * [func (ff \*FatFile) Close() error](#FatFile.Close) * [type File](#File) * [func NewFile(r io.ReaderAt) (\*File, error)](#NewFile) * [func Open(name string) (\*File, error)](#Open) * [func (f \*File) Close() error](#File.Close) * [func (f \*File) DWARF() (\*dwarf.Data, error)](#File.DWARF) * [func (f \*File) ImportedLibraries() ([]string, error)](#File.ImportedLibraries) * [func (f \*File) ImportedSymbols() ([]string, error)](#File.ImportedSymbols) * [func (f \*File) Section(name string) \*Section](#File.Section) * [func (f \*File) Segment(name string) \*Segment](#File.Segment) * [type FileHeader](#FileHeader) * [type FormatError](#FormatError) * [func (e \*FormatError) Error() string](#FormatError.Error) * [type Load](#Load) * [type LoadBytes](#LoadBytes) * [func (b LoadBytes) Raw() []byte](#LoadBytes.Raw) * [type LoadCmd](#LoadCmd) * [func (i LoadCmd) GoString() string](#LoadCmd.GoString) * [func (i LoadCmd) String() string](#LoadCmd.String) * [type Nlist32](#Nlist32) * [type Nlist64](#Nlist64) * [type Regs386](#Regs386) * [type RegsAMD64](#RegsAMD64) * [type Reloc](#Reloc) * [type RelocTypeARM](#RelocTypeARM) * [func (r RelocTypeARM) GoString() string](#RelocTypeARM.GoString) * [func (i RelocTypeARM) String() string](#RelocTypeARM.String) * [type RelocTypeARM64](#RelocTypeARM64) * [func (r RelocTypeARM64) GoString() string](#RelocTypeARM64.GoString) * [func (i RelocTypeARM64) String() string](#RelocTypeARM64.String) * [type RelocTypeGeneric](#RelocTypeGeneric) * [func (r RelocTypeGeneric) GoString() string](#RelocTypeGeneric.GoString) * [func (i RelocTypeGeneric) String() string](#RelocTypeGeneric.String) * [type 
RelocTypeX86\_64](#RelocTypeX86_64) * [func (r RelocTypeX86\_64) GoString() string](#RelocTypeX86_64.GoString) * [func (i RelocTypeX86\_64) String() string](#RelocTypeX86_64.String) * [type Rpath](#Rpath) * [type RpathCmd](#RpathCmd) * [type Section](#Section) * [func (s \*Section) Data() ([]byte, error)](#Section.Data) * [func (s \*Section) Open() io.ReadSeeker](#Section.Open) * [type Section32](#Section32) * [type Section64](#Section64) * [type SectionHeader](#SectionHeader) * [type Segment](#Segment) * [func (s \*Segment) Data() ([]byte, error)](#Segment.Data) * [func (s \*Segment) Open() io.ReadSeeker](#Segment.Open) * [type Segment32](#Segment32) * [type Segment64](#Segment64) * [type SegmentHeader](#SegmentHeader) * [type Symbol](#Symbol) * [type Symtab](#Symtab) * [type SymtabCmd](#SymtabCmd) * [type Thread](#Thread) * [type Type](#Type) * [func (t Type) GoString() string](#Type.GoString) * [func (t Type) String() string](#Type.String) ### Package files fat.go file.go macho.go reloctype.go reloctype\_string.go Constants --------- ``` const ( Magic32 uint32 = 0xfeedface Magic64 uint32 = 0xfeedfacf MagicFat uint32 = 0xcafebabe ) ``` ``` const ( FlagNoUndefs uint32 = 0x1 FlagIncrLink uint32 = 0x2 FlagDyldLink uint32 = 0x4 FlagBindAtLoad uint32 = 0x8 FlagPrebound uint32 = 0x10 FlagSplitSegs uint32 = 0x20 FlagLazyInit uint32 = 0x40 FlagTwoLevel uint32 = 0x80 FlagForceFlat uint32 = 0x100 FlagNoMultiDefs uint32 = 0x200 FlagNoFixPrebinding uint32 = 0x400 FlagPrebindable uint32 = 0x800 FlagAllModsBound uint32 = 0x1000 FlagSubsectionsViaSymbols uint32 = 0x2000 FlagCanonical uint32 = 0x4000 FlagWeakDefines uint32 = 0x8000 FlagBindsToWeak uint32 = 0x10000 FlagAllowStackExecution uint32 = 0x20000 FlagRootSafe uint32 = 0x40000 FlagSetuidSafe uint32 = 0x80000 FlagNoReexportedDylibs uint32 = 0x100000 FlagPIE uint32 = 0x200000 FlagDeadStrippableDylib uint32 = 0x400000 FlagHasTLVDescriptors uint32 = 0x800000 FlagNoHeapExecution uint32 = 0x1000000 FlagAppExtensionSafe uint32 = 0x2000000 ) ``` Variables --------- ErrNotFat is returned from NewFatFile or OpenFat when the file is not a universal binary but may be a thin binary, based on its magic number. ``` var ErrNotFat = &FormatError{0, "not a fat Mach-O file", nil} ``` type Cpu -------- A Cpu is a Mach-O cpu type. ``` type Cpu uint32 ``` ``` const ( Cpu386 Cpu = 7 CpuAmd64 Cpu = Cpu386 | cpuArch64 CpuArm Cpu = 12 CpuArm64 Cpu = CpuArm | cpuArch64 CpuPpc Cpu = 18 CpuPpc64 Cpu = CpuPpc | cpuArch64 ) ``` ### func (Cpu) GoString ``` func (i Cpu) GoString() string ``` ### func (Cpu) String ``` func (i Cpu) String() string ``` type Dylib ---------- A Dylib represents a Mach-O load dynamic library command. ``` type Dylib struct { LoadBytes Name string Time uint32 CurrentVersion uint32 CompatVersion uint32 } ``` type DylibCmd ------------- A DylibCmd is a Mach-O load dynamic library command. ``` type DylibCmd struct { Cmd LoadCmd Len uint32 Name uint32 Time uint32 CurrentVersion uint32 CompatVersion uint32 } ``` type Dysymtab ------------- A Dysymtab represents a Mach-O dynamic symbol table command. ``` type Dysymtab struct { LoadBytes DysymtabCmd IndirectSyms []uint32 // indices into Symtab.Syms } ``` type DysymtabCmd ---------------- A DysymtabCmd is a Mach-O dynamic symbol table command. 
``` type DysymtabCmd struct { Cmd LoadCmd Len uint32 Ilocalsym uint32 Nlocalsym uint32 Iextdefsym uint32 Nextdefsym uint32 Iundefsym uint32 Nundefsym uint32 Tocoffset uint32 Ntoc uint32 Modtaboff uint32 Nmodtab uint32 Extrefsymoff uint32 Nextrefsyms uint32 Indirectsymoff uint32 Nindirectsyms uint32 Extreloff uint32 Nextrel uint32 Locreloff uint32 Nlocrel uint32 } ``` type FatArch 1.3 ---------------- A FatArch is a Mach-O File inside a FatFile. ``` type FatArch struct { FatArchHeader *File } ``` type FatArchHeader 1.3 ---------------------- A FatArchHeader represents a fat header for a specific image architecture. ``` type FatArchHeader struct { Cpu Cpu SubCpu uint32 Offset uint32 Size uint32 Align uint32 } ``` type FatFile 1.3 ---------------- A FatFile is a Mach-O universal binary that contains at least one architecture. ``` type FatFile struct { Magic uint32 Arches []FatArch // contains filtered or unexported fields } ``` ### func NewFatFile 1.3 ``` func NewFatFile(r io.ReaderAt) (*FatFile, error) ``` NewFatFile creates a new FatFile for accessing all the Mach-O images in a universal binary. The Mach-O binary is expected to start at position 0 in the ReaderAt. ### func OpenFat 1.3 ``` func OpenFat(name string) (*FatFile, error) ``` OpenFat opens the named file using os.Open and prepares it for use as a Mach-O universal binary. ### func (\*FatFile) Close 1.3 ``` func (ff *FatFile) Close() error ``` type File --------- A File represents an open Mach-O file. ``` type File struct { FileHeader ByteOrder binary.ByteOrder Loads []Load Sections []*Section Symtab *Symtab Dysymtab *Dysymtab // contains filtered or unexported fields } ``` ### func NewFile ``` func NewFile(r io.ReaderAt) (*File, error) ``` NewFile creates a new File for accessing a Mach-O binary in an underlying reader. The Mach-O binary is expected to start at position 0 in the ReaderAt. ### func Open ``` func Open(name string) (*File, error) ``` Open opens the named file using os.Open and prepares it for use as a Mach-O binary. ### func (\*File) Close ``` func (f *File) Close() error ``` Close closes the File. If the File was created using NewFile directly instead of Open, Close has no effect. ### func (\*File) DWARF ``` func (f *File) DWARF() (*dwarf.Data, error) ``` DWARF returns the DWARF debug information for the Mach-O file. ### func (\*File) ImportedLibraries ``` func (f *File) ImportedLibraries() ([]string, error) ``` ImportedLibraries returns the paths of all libraries referred to by the binary f that are expected to be linked with the binary at dynamic link time. ### func (\*File) ImportedSymbols ``` func (f *File) ImportedSymbols() ([]string, error) ``` ImportedSymbols returns the names of all symbols referred to by the binary f that are expected to be satisfied by other libraries at dynamic load time. ### func (\*File) Section ``` func (f *File) Section(name string) *Section ``` Section returns the first section with the given name, or nil if no such section exists. ### func (\*File) Segment ``` func (f *File) Segment(name string) *Segment ``` Segment returns the first Segment with the given name, or nil if no such segment exists. type FileHeader --------------- A FileHeader represents a Mach-O file header. ``` type FileHeader struct { Magic uint32 Cpu Cpu SubCpu uint32 Type Type Ncmd uint32 Cmdsz uint32 Flags uint32 } ``` type FormatError ---------------- FormatError is returned by some operations if the data does not have the correct format for an object file. 
``` type FormatError struct { // contains filtered or unexported fields } ``` ### func (\*FormatError) Error ``` func (e *FormatError) Error() string ``` type Load --------- A Load represents any Mach-O load command. ``` type Load interface { Raw() []byte } ``` type LoadBytes -------------- A LoadBytes is the uninterpreted bytes of a Mach-O load command. ``` type LoadBytes []byte ``` ### func (LoadBytes) Raw ``` func (b LoadBytes) Raw() []byte ``` type LoadCmd ------------ A LoadCmd is a Mach-O load command. ``` type LoadCmd uint32 ``` ``` const ( LoadCmdSegment LoadCmd = 0x1 LoadCmdSymtab LoadCmd = 0x2 LoadCmdThread LoadCmd = 0x4 LoadCmdUnixThread LoadCmd = 0x5 // thread+stack LoadCmdDysymtab LoadCmd = 0xb LoadCmdDylib LoadCmd = 0xc // load dylib command LoadCmdDylinker LoadCmd = 0xf // id dylinker command (not load dylinker command) LoadCmdSegment64 LoadCmd = 0x19 LoadCmdRpath LoadCmd = 0x8000001c ) ``` ### func (LoadCmd) GoString ``` func (i LoadCmd) GoString() string ``` ### func (LoadCmd) String ``` func (i LoadCmd) String() string ``` type Nlist32 ------------ An Nlist32 is a Mach-O 32-bit symbol table entry. ``` type Nlist32 struct { Name uint32 Type uint8 Sect uint8 Desc uint16 Value uint32 } ``` type Nlist64 ------------ An Nlist64 is a Mach-O 64-bit symbol table entry. ``` type Nlist64 struct { Name uint32 Type uint8 Sect uint8 Desc uint16 Value uint64 } ``` type Regs386 ------------ Regs386 is the Mach-O 386 register structure. ``` type Regs386 struct { AX uint32 BX uint32 CX uint32 DX uint32 DI uint32 SI uint32 BP uint32 SP uint32 SS uint32 FLAGS uint32 IP uint32 CS uint32 DS uint32 ES uint32 FS uint32 GS uint32 } ``` type RegsAMD64 -------------- RegsAMD64 is the Mach-O AMD64 register structure. ``` type RegsAMD64 struct { AX uint64 BX uint64 CX uint64 DX uint64 DI uint64 SI uint64 BP uint64 SP uint64 R8 uint64 R9 uint64 R10 uint64 R11 uint64 R12 uint64 R13 uint64 R14 uint64 R15 uint64 IP uint64 FLAGS uint64 CS uint64 FS uint64 GS uint64 } ``` type Reloc 1.10 --------------- A Reloc represents a Mach-O relocation. ``` type Reloc struct { Addr uint32 Value uint32 // when Scattered == false && Extern == true, Value is the symbol number. // when Scattered == false && Extern == false, Value is the section number. // when Scattered == true, Value is the value that this reloc refers to. 
Type uint8 Len uint8 // 0=byte, 1=word, 2=long, 3=quad Pcrel bool Extern bool // valid if Scattered == false Scattered bool } ``` type RelocTypeARM 1.10 ---------------------- ``` type RelocTypeARM int ``` ``` const ( ARM_RELOC_VANILLA RelocTypeARM = 0 ARM_RELOC_PAIR RelocTypeARM = 1 ARM_RELOC_SECTDIFF RelocTypeARM = 2 ARM_RELOC_LOCAL_SECTDIFF RelocTypeARM = 3 ARM_RELOC_PB_LA_PTR RelocTypeARM = 4 ARM_RELOC_BR24 RelocTypeARM = 5 ARM_THUMB_RELOC_BR22 RelocTypeARM = 6 ARM_THUMB_32BIT_BRANCH RelocTypeARM = 7 ARM_RELOC_HALF RelocTypeARM = 8 ARM_RELOC_HALF_SECTDIFF RelocTypeARM = 9 ) ``` ### func (RelocTypeARM) GoString 1.10 ``` func (r RelocTypeARM) GoString() string ``` ### func (RelocTypeARM) String 1.10 ``` func (i RelocTypeARM) String() string ``` type RelocTypeARM64 1.10 ------------------------ ``` type RelocTypeARM64 int ``` ``` const ( ARM64_RELOC_UNSIGNED RelocTypeARM64 = 0 ARM64_RELOC_SUBTRACTOR RelocTypeARM64 = 1 ARM64_RELOC_BRANCH26 RelocTypeARM64 = 2 ARM64_RELOC_PAGE21 RelocTypeARM64 = 3 ARM64_RELOC_PAGEOFF12 RelocTypeARM64 = 4 ARM64_RELOC_GOT_LOAD_PAGE21 RelocTypeARM64 = 5 ARM64_RELOC_GOT_LOAD_PAGEOFF12 RelocTypeARM64 = 6 ARM64_RELOC_POINTER_TO_GOT RelocTypeARM64 = 7 ARM64_RELOC_TLVP_LOAD_PAGE21 RelocTypeARM64 = 8 ARM64_RELOC_TLVP_LOAD_PAGEOFF12 RelocTypeARM64 = 9 ARM64_RELOC_ADDEND RelocTypeARM64 = 10 ) ``` ### func (RelocTypeARM64) GoString 1.10 ``` func (r RelocTypeARM64) GoString() string ``` ### func (RelocTypeARM64) String 1.10 ``` func (i RelocTypeARM64) String() string ``` type RelocTypeGeneric 1.10 -------------------------- ``` type RelocTypeGeneric int ``` ``` const ( GENERIC_RELOC_VANILLA RelocTypeGeneric = 0 GENERIC_RELOC_PAIR RelocTypeGeneric = 1 GENERIC_RELOC_SECTDIFF RelocTypeGeneric = 2 GENERIC_RELOC_PB_LA_PTR RelocTypeGeneric = 3 GENERIC_RELOC_LOCAL_SECTDIFF RelocTypeGeneric = 4 GENERIC_RELOC_TLV RelocTypeGeneric = 5 ) ``` ### func (RelocTypeGeneric) GoString 1.10 ``` func (r RelocTypeGeneric) GoString() string ``` ### func (RelocTypeGeneric) String 1.10 ``` func (i RelocTypeGeneric) String() string ``` type RelocTypeX86\_64 1.10 -------------------------- ``` type RelocTypeX86_64 int ``` ``` const ( X86_64_RELOC_UNSIGNED RelocTypeX86_64 = 0 X86_64_RELOC_SIGNED RelocTypeX86_64 = 1 X86_64_RELOC_BRANCH RelocTypeX86_64 = 2 X86_64_RELOC_GOT_LOAD RelocTypeX86_64 = 3 X86_64_RELOC_GOT RelocTypeX86_64 = 4 X86_64_RELOC_SUBTRACTOR RelocTypeX86_64 = 5 X86_64_RELOC_SIGNED_1 RelocTypeX86_64 = 6 X86_64_RELOC_SIGNED_2 RelocTypeX86_64 = 7 X86_64_RELOC_SIGNED_4 RelocTypeX86_64 = 8 X86_64_RELOC_TLV RelocTypeX86_64 = 9 ) ``` ### func (RelocTypeX86\_64) GoString 1.10 ``` func (r RelocTypeX86_64) GoString() string ``` ### func (RelocTypeX86\_64) String 1.10 ``` func (i RelocTypeX86_64) String() string ``` type Rpath 1.10 --------------- A Rpath represents a Mach-O rpath command. ``` type Rpath struct { LoadBytes Path string } ``` type RpathCmd 1.10 ------------------ A RpathCmd is a Mach-O rpath command. ``` type RpathCmd struct { Cmd LoadCmd Len uint32 Path uint32 } ``` type Section ------------ ``` type Section struct { SectionHeader Relocs []Reloc // Go 1.10 // Embed ReaderAt for ReadAt method. // Do not embed SectionReader directly // to avoid having Read and Seek. // If a client wants Read and Seek it must use // Open() to avoid fighting over the seek offset // with other clients. io.ReaderAt // contains filtered or unexported fields } ``` ### func (\*Section) Data ``` func (s *Section) Data() ([]byte, error) ``` Data reads and returns the contents of the Mach-O section. 
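The following is a minimal sketch of the calls above: it opens a Mach-O binary, looks up its \_\_text section with File.Section, and reads the raw bytes with Data. The path is a placeholder; any Mach-O executable will do.

```
package main

import (
	"debug/macho"
	"fmt"
	"log"
)

func main() {
	// Placeholder path; substitute any Mach-O executable.
	f, err := macho.Open("/usr/bin/true")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Section finds the first section with the given name.
	s := f.Section("__text")
	if s == nil {
		log.Fatal("no __text section")
	}

	// Data reads the whole section into memory.
	b, err := s.Data()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s: %d bytes at %#x\n", s.Name, len(b), s.Addr)
}
```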
### func (\*Section) Open ``` func (s *Section) Open() io.ReadSeeker ``` Open returns a new ReadSeeker reading the Mach-O section. type Section32 -------------- A Section32 is a 32-bit Mach-O section header. ``` type Section32 struct { Name [16]byte Seg [16]byte Addr uint32 Size uint32 Offset uint32 Align uint32 Reloff uint32 Nreloc uint32 Flags uint32 Reserve1 uint32 Reserve2 uint32 } ``` type Section64 -------------- A Section64 is a 64-bit Mach-O section header. ``` type Section64 struct { Name [16]byte Seg [16]byte Addr uint64 Size uint64 Offset uint32 Align uint32 Reloff uint32 Nreloc uint32 Flags uint32 Reserve1 uint32 Reserve2 uint32 Reserve3 uint32 } ``` type SectionHeader ------------------ ``` type SectionHeader struct { Name string Seg string Addr uint64 Size uint64 Offset uint32 Align uint32 Reloff uint32 Nreloc uint32 Flags uint32 } ``` type Segment ------------ A Segment represents a Mach-O 32-bit or 64-bit load segment command. ``` type Segment struct { LoadBytes SegmentHeader // Embed ReaderAt for ReadAt method. // Do not embed SectionReader directly // to avoid having Read and Seek. // If a client wants Read and Seek it must use // Open() to avoid fighting over the seek offset // with other clients. io.ReaderAt // contains filtered or unexported fields } ``` ### func (\*Segment) Data ``` func (s *Segment) Data() ([]byte, error) ``` Data reads and returns the contents of the segment. ### func (\*Segment) Open ``` func (s *Segment) Open() io.ReadSeeker ``` Open returns a new ReadSeeker reading the segment. type Segment32 -------------- A Segment32 is a 32-bit Mach-O segment load command. ``` type Segment32 struct { Cmd LoadCmd Len uint32 Name [16]byte Addr uint32 Memsz uint32 Offset uint32 Filesz uint32 Maxprot uint32 Prot uint32 Nsect uint32 Flag uint32 } ``` type Segment64 -------------- A Segment64 is a 64-bit Mach-O segment load command. ``` type Segment64 struct { Cmd LoadCmd Len uint32 Name [16]byte Addr uint64 Memsz uint64 Offset uint64 Filesz uint64 Maxprot uint32 Prot uint32 Nsect uint32 Flag uint32 } ``` type SegmentHeader ------------------ A SegmentHeader is the header for a Mach-O 32-bit or 64-bit load segment command. ``` type SegmentHeader struct { Cmd LoadCmd Len uint32 Name string Addr uint64 Memsz uint64 Offset uint64 Filesz uint64 Maxprot uint32 Prot uint32 Nsect uint32 Flag uint32 } ``` type Symbol ----------- A Symbol is a Mach-O 32-bit or 64-bit symbol table entry. ``` type Symbol struct { Name string Type uint8 Sect uint8 Desc uint16 Value uint64 } ``` type Symtab ----------- A Symtab represents a Mach-O symbol table command. ``` type Symtab struct { LoadBytes SymtabCmd Syms []Symbol } ``` type SymtabCmd -------------- A SymtabCmd is a Mach-O symbol table command. ``` type SymtabCmd struct { Cmd LoadCmd Len uint32 Symoff uint32 Nsyms uint32 Stroff uint32 Strsize uint32 } ``` type Thread ----------- A Thread is a Mach-O thread state command. ``` type Thread struct { Cmd LoadCmd Len uint32 Type uint32 Data []uint32 } ``` type Type --------- A Type is the Mach-O file type, e.g. an object file, executable, or dynamic library. ``` type Type uint32 ``` ``` const ( TypeObj Type = 1 TypeExec Type = 2 TypeDylib Type = 6 TypeBundle Type = 8 ) ``` ### func (Type) GoString 1.10 ``` func (t Type) GoString() string ``` ### func (Type) String 1.10 ``` func (t Type) String() string ```
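Tying the pieces above together, this sketch first tries the universal-binary reader and falls back to the thin reader when ErrNotFat is reported. The direct comparison with ErrNotFat assumes OpenFat returns that value unwrapped, and the path is again a placeholder.

```
package main

import (
	"debug/macho"
	"fmt"
	"log"
)

func main() {
	// Placeholder path; substitute any Mach-O executable.
	name := "/usr/bin/true"

	// Try the fat (universal) reader first.
	ff, err := macho.OpenFat(name)
	if err == nil {
		defer ff.Close()
		for _, arch := range ff.Arches {
			fmt.Printf("%s: %v at offset %d (%d bytes)\n", name, arch.Cpu, arch.Offset, arch.Size)
		}
		return
	}
	if err != macho.ErrNotFat {
		log.Fatal(err)
	}

	// Not a universal binary: read it as a thin Mach-O file.
	f, err := macho.Open(name)
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	fmt.Printf("%s: %v, %v, %d load commands, %d sections\n",
		name, f.Cpu, f.Type, len(f.Loads), len(f.Sections))
}
```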
go Package dwarf Package dwarf ============== * `import "debug/dwarf"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package dwarf provides access to DWARF debugging information loaded from executable files, as defined in the DWARF 2.0 Standard at <http://dwarfstd.org/doc/dwarf-2.0.0.pdf>. ### Security This package is not designed to be hardened against adversarial inputs, and is outside the scope of <https://go.dev/security/policy>. In particular, only basic validation is done when parsing object files. As such, care should be taken when parsing untrusted inputs, as parsing malformed files may consume significant resources, or cause panics. Index ----- * [Variables](#pkg-variables) * [type AddrType](#AddrType) * [type ArrayType](#ArrayType) * [func (t \*ArrayType) Size() int64](#ArrayType.Size) * [func (t \*ArrayType) String() string](#ArrayType.String) * [type Attr](#Attr) * [func (a Attr) GoString() string](#Attr.GoString) * [func (i Attr) String() string](#Attr.String) * [type BasicType](#BasicType) * [func (b \*BasicType) Basic() \*BasicType](#BasicType.Basic) * [func (t \*BasicType) String() string](#BasicType.String) * [type BoolType](#BoolType) * [type CharType](#CharType) * [type Class](#Class) * [func (i Class) GoString() string](#Class.GoString) * [func (i Class) String() string](#Class.String) * [type CommonType](#CommonType) * [func (c \*CommonType) Common() \*CommonType](#CommonType.Common) * [func (c \*CommonType) Size() int64](#CommonType.Size) * [type ComplexType](#ComplexType) * [type Data](#Data) * [func New(abbrev, aranges, frame, info, line, pubnames, ranges, str []byte) (\*Data, error)](#New) * [func (d \*Data) AddSection(name string, contents []byte) error](#Data.AddSection) * [func (d \*Data) AddTypes(name string, types []byte) error](#Data.AddTypes) * [func (d \*Data) LineReader(cu \*Entry) (\*LineReader, error)](#Data.LineReader) * [func (d \*Data) Ranges(e \*Entry) ([][2]uint64, error)](#Data.Ranges) * [func (d \*Data) Reader() \*Reader](#Data.Reader) * [func (d \*Data) Type(off Offset) (Type, error)](#Data.Type) * [type DecodeError](#DecodeError) * [func (e DecodeError) Error() string](#DecodeError.Error) * [type DotDotDotType](#DotDotDotType) * [func (t \*DotDotDotType) String() string](#DotDotDotType.String) * [type Entry](#Entry) * [func (e \*Entry) AttrField(a Attr) \*Field](#Entry.AttrField) * [func (e \*Entry) Val(a Attr) any](#Entry.Val) * [type EnumType](#EnumType) * [func (t \*EnumType) String() string](#EnumType.String) * [type EnumValue](#EnumValue) * [type Field](#Field) * [type FloatType](#FloatType) * [type FuncType](#FuncType) * [func (t \*FuncType) String() string](#FuncType.String) * [type IntType](#IntType) * [type LineEntry](#LineEntry) * [type LineFile](#LineFile) * [type LineReader](#LineReader) * [func (r \*LineReader) Files() []\*LineFile](#LineReader.Files) * [func (r \*LineReader) Next(entry \*LineEntry) error](#LineReader.Next) * [func (r \*LineReader) Reset()](#LineReader.Reset) * [func (r \*LineReader) Seek(pos LineReaderPos)](#LineReader.Seek) * [func (r \*LineReader) SeekPC(pc uint64, entry \*LineEntry) error](#LineReader.SeekPC) * [func (r \*LineReader) Tell() LineReaderPos](#LineReader.Tell) * [type LineReaderPos](#LineReaderPos) * [type Offset](#Offset) * [type PtrType](#PtrType) * [func (t \*PtrType) String() string](#PtrType.String) * [type QualType](#QualType) * [func (t \*QualType) Size() int64](#QualType.Size) * [func (t \*QualType) String() string](#QualType.String) * [type Reader](#Reader) * [func (r 
\*Reader) AddressSize() int](#Reader.AddressSize) * [func (r \*Reader) ByteOrder() binary.ByteOrder](#Reader.ByteOrder) * [func (r \*Reader) Next() (\*Entry, error)](#Reader.Next) * [func (r \*Reader) Seek(off Offset)](#Reader.Seek) * [func (r \*Reader) SeekPC(pc uint64) (\*Entry, error)](#Reader.SeekPC) * [func (r \*Reader) SkipChildren()](#Reader.SkipChildren) * [type StructField](#StructField) * [type StructType](#StructType) * [func (t \*StructType) Defn() string](#StructType.Defn) * [func (t \*StructType) String() string](#StructType.String) * [type Tag](#Tag) * [func (t Tag) GoString() string](#Tag.GoString) * [func (i Tag) String() string](#Tag.String) * [type Type](#Type) * [type TypedefType](#TypedefType) * [func (t \*TypedefType) Size() int64](#TypedefType.Size) * [func (t \*TypedefType) String() string](#TypedefType.String) * [type UcharType](#UcharType) * [type UintType](#UintType) * [type UnspecifiedType](#UnspecifiedType) * [type UnsupportedType](#UnsupportedType) * [func (t \*UnsupportedType) String() string](#UnsupportedType.String) * [type VoidType](#VoidType) * [func (t \*VoidType) String() string](#VoidType.String) ### Package files attr\_string.go buf.go class\_string.go const.go entry.go line.go open.go tag\_string.go type.go typeunit.go unit.go Variables --------- ErrUnknownPC is the error returned by LineReader.SeekPC when the seek PC is not covered by any entry in the line table. ``` var ErrUnknownPC = errors.New("ErrUnknownPC") ``` type AddrType ------------- An AddrType represents a machine address type. ``` type AddrType struct { BasicType } ``` type ArrayType -------------- An ArrayType represents a fixed size array type. ``` type ArrayType struct { CommonType Type Type StrideBitSize int64 // if > 0, number of bits to hold each element Count int64 // if == -1, an incomplete array, like char x[]. } ``` type Attr --------- An Attr identifies the attribute type in a DWARF Entry's Field. 
``` type Attr uint32 ``` ``` const ( AttrSibling Attr = 0x01 AttrLocation Attr = 0x02 AttrName Attr = 0x03 AttrOrdering Attr = 0x09 AttrByteSize Attr = 0x0B AttrBitOffset Attr = 0x0C AttrBitSize Attr = 0x0D AttrStmtList Attr = 0x10 AttrLowpc Attr = 0x11 AttrHighpc Attr = 0x12 AttrLanguage Attr = 0x13 AttrDiscr Attr = 0x15 AttrDiscrValue Attr = 0x16 AttrVisibility Attr = 0x17 AttrImport Attr = 0x18 AttrStringLength Attr = 0x19 AttrCommonRef Attr = 0x1A AttrCompDir Attr = 0x1B AttrConstValue Attr = 0x1C AttrContainingType Attr = 0x1D AttrDefaultValue Attr = 0x1E AttrInline Attr = 0x20 AttrIsOptional Attr = 0x21 AttrLowerBound Attr = 0x22 AttrProducer Attr = 0x25 AttrPrototyped Attr = 0x27 AttrReturnAddr Attr = 0x2A AttrStartScope Attr = 0x2C AttrStrideSize Attr = 0x2E AttrUpperBound Attr = 0x2F AttrAbstractOrigin Attr = 0x31 AttrAccessibility Attr = 0x32 AttrAddrClass Attr = 0x33 AttrArtificial Attr = 0x34 AttrBaseTypes Attr = 0x35 AttrCalling Attr = 0x36 AttrCount Attr = 0x37 AttrDataMemberLoc Attr = 0x38 AttrDeclColumn Attr = 0x39 AttrDeclFile Attr = 0x3A AttrDeclLine Attr = 0x3B AttrDeclaration Attr = 0x3C AttrDiscrList Attr = 0x3D AttrEncoding Attr = 0x3E AttrExternal Attr = 0x3F AttrFrameBase Attr = 0x40 AttrFriend Attr = 0x41 AttrIdentifierCase Attr = 0x42 AttrMacroInfo Attr = 0x43 AttrNamelistItem Attr = 0x44 AttrPriority Attr = 0x45 AttrSegment Attr = 0x46 AttrSpecification Attr = 0x47 AttrStaticLink Attr = 0x48 AttrType Attr = 0x49 AttrUseLocation Attr = 0x4A AttrVarParam Attr = 0x4B AttrVirtuality Attr = 0x4C AttrVtableElemLoc Attr = 0x4D // The following are new in DWARF 3. AttrAllocated Attr = 0x4E AttrAssociated Attr = 0x4F AttrDataLocation Attr = 0x50 AttrStride Attr = 0x51 AttrEntrypc Attr = 0x52 AttrUseUTF8 Attr = 0x53 AttrExtension Attr = 0x54 AttrRanges Attr = 0x55 AttrTrampoline Attr = 0x56 AttrCallColumn Attr = 0x57 AttrCallFile Attr = 0x58 AttrCallLine Attr = 0x59 AttrDescription Attr = 0x5A AttrBinaryScale Attr = 0x5B AttrDecimalScale Attr = 0x5C AttrSmall Attr = 0x5D AttrDecimalSign Attr = 0x5E AttrDigitCount Attr = 0x5F AttrPictureString Attr = 0x60 AttrMutable Attr = 0x61 AttrThreadsScaled Attr = 0x62 AttrExplicit Attr = 0x63 AttrObjectPointer Attr = 0x64 AttrEndianity Attr = 0x65 AttrElemental Attr = 0x66 AttrPure Attr = 0x67 AttrRecursive Attr = 0x68 // The following are new in DWARF 4. AttrSignature Attr = 0x69 AttrMainSubprogram Attr = 0x6A AttrDataBitOffset Attr = 0x6B AttrConstExpr Attr = 0x6C AttrEnumClass Attr = 0x6D AttrLinkageName Attr = 0x6E // The following are new in DWARF 5. 
AttrStringLengthBitSize Attr = 0x6F AttrStringLengthByteSize Attr = 0x70 AttrRank Attr = 0x71 AttrStrOffsetsBase Attr = 0x72 AttrAddrBase Attr = 0x73 AttrRnglistsBase Attr = 0x74 AttrDwoName Attr = 0x76 AttrReference Attr = 0x77 AttrRvalueReference Attr = 0x78 AttrMacros Attr = 0x79 AttrCallAllCalls Attr = 0x7A AttrCallAllSourceCalls Attr = 0x7B AttrCallAllTailCalls Attr = 0x7C AttrCallReturnPC Attr = 0x7D AttrCallValue Attr = 0x7E AttrCallOrigin Attr = 0x7F AttrCallParameter Attr = 0x80 AttrCallPC Attr = 0x81 AttrCallTailCall Attr = 0x82 AttrCallTarget Attr = 0x83 AttrCallTargetClobbered Attr = 0x84 AttrCallDataLocation Attr = 0x85 AttrCallDataValue Attr = 0x86 AttrNoreturn Attr = 0x87 AttrAlignment Attr = 0x88 AttrExportSymbols Attr = 0x89 AttrDeleted Attr = 0x8A AttrDefaulted Attr = 0x8B AttrLoclistsBase Attr = 0x8C ) ``` ### func (Attr) GoString ``` func (a Attr) GoString() string ``` ### func (Attr) String ``` func (i Attr) String() string ``` type BasicType -------------- A BasicType holds fields common to all basic types. See the documentation for StructField for more info on the interpretation of the BitSize/BitOffset/DataBitOffset fields. ``` type BasicType struct { CommonType BitSize int64 BitOffset int64 DataBitOffset int64 // Go 1.18 } ``` ### func (\*BasicType) Basic ``` func (b *BasicType) Basic() *BasicType ``` ### func (\*BasicType) String ``` func (t *BasicType) String() string ``` type BoolType ------------- A BoolType represents a boolean type. ``` type BoolType struct { BasicType } ``` type CharType ------------- A CharType represents a signed character type. ``` type CharType struct { BasicType } ``` type Class 1.5 -------------- A Class is the DWARF 4 class of an attribute value. In general, a given attribute's value may take on one of several possible classes defined by DWARF, each of which leads to a slightly different interpretation of the attribute. DWARF version 4 distinguishes attribute value classes more finely than previous versions of DWARF. The reader will disambiguate coarser classes from earlier versions of DWARF into the appropriate DWARF 4 class. For example, DWARF 2 uses "constant" for constants as well as all types of section offsets, but the reader will canonicalize attributes in DWARF 2 files that refer to section offsets to one of the Class\*Ptr classes, even though these classes were only defined in DWARF 3. ``` type Class int ``` ``` const ( // ClassUnknown represents values of unknown DWARF class. ClassUnknown Class = iota // ClassAddress represents values of type uint64 that are // addresses on the target machine. ClassAddress // ClassBlock represents values of type []byte whose // interpretation depends on the attribute. ClassBlock // ClassConstant represents values of type int64 that are // constants. The interpretation of this constant depends on // the attribute. ClassConstant // ClassExprLoc represents values of type []byte that contain // an encoded DWARF expression or location description. ClassExprLoc // ClassFlag represents values of type bool. ClassFlag // ClassLinePtr represents values that are an int64 offset // into the "line" section. ClassLinePtr // ClassLocListPtr represents values that are an int64 offset // into the "loclist" section. ClassLocListPtr // ClassMacPtr represents values that are an int64 offset into // the "mac" section. ClassMacPtr // ClassRangeListPtr represents values that are an int64 offset into // the "rangelist" section. 
ClassRangeListPtr // ClassReference represents values that are an Offset offset // of an Entry in the info section (for use with Reader.Seek). // The DWARF specification combines ClassReference and // ClassReferenceSig into class "reference". ClassReference // ClassReferenceSig represents values that are a uint64 type // signature referencing a type Entry. ClassReferenceSig // ClassString represents values that are strings. If the // compilation unit specifies the AttrUseUTF8 flag (strongly // recommended), the string value will be encoded in UTF-8. // Otherwise, the encoding is unspecified. ClassString // ClassReferenceAlt represents values of type int64 that are // an offset into the DWARF "info" section of an alternate // object file. ClassReferenceAlt // ClassStringAlt represents values of type int64 that are an // offset into the DWARF string section of an alternate object // file. ClassStringAlt // ClassAddrPtr represents values that are an int64 offset // into the "addr" section. ClassAddrPtr // ClassLocList represents values that are an int64 offset // into the "loclists" section. ClassLocList // ClassRngList represents values that are a uint64 offset // from the base of the "rnglists" section. ClassRngList // ClassRngListsPtr represents values that are an int64 offset // into the "rnglists" section. These are used as the base for // ClassRngList values. ClassRngListsPtr // ClassStrOffsetsPtr represents values that are an int64 // offset into the "str_offsets" section. ClassStrOffsetsPtr ) ``` ### func (Class) GoString 1.5 ``` func (i Class) GoString() string ``` ### func (Class) String 1.5 ``` func (i Class) String() string ``` type CommonType --------------- A CommonType holds fields common to multiple types. If a field is not known or not applicable for a given type, the zero value is used. ``` type CommonType struct { ByteSize int64 // size of value of this type, in bytes Name string // name that can be used to refer to type } ``` ### func (\*CommonType) Common ``` func (c *CommonType) Common() *CommonType ``` ### func (\*CommonType) Size ``` func (c *CommonType) Size() int64 ``` type ComplexType ---------------- A ComplexType represents a complex floating point type. ``` type ComplexType struct { BasicType } ``` type Data --------- Data represents the DWARF debugging information loaded from an executable file (for example, an ELF or Mach-O executable). ``` type Data struct { // contains filtered or unexported fields } ``` ### func New ``` func New(abbrev, aranges, frame, info, line, pubnames, ranges, str []byte) (*Data, error) ``` New returns a new Data object initialized from the given parameters. Rather than calling this function directly, clients should typically use the DWARF method of the File type of the appropriate package debug/elf, debug/macho, or debug/pe. The []byte arguments are the data from the corresponding debug section in the object file; for example, for an ELF object, abbrev is the contents of the ".debug\_abbrev" section. ### func (\*Data) AddSection 1.14 ``` func (d *Data) AddSection(name string, contents []byte) error ``` AddSection adds another DWARF section by name. The name should be a DWARF section name such as ".debug\_addr", ".debug\_str\_offsets", and so forth. This approach is used for new DWARF sections added in DWARF 5 and later. ### func (\*Data) AddTypes 1.3 ``` func (d *Data) AddTypes(name string, types []byte) error ``` AddTypes will add one .debug\_types section to the DWARF data. 
A typical object with DWARF version 4 debug info will have multiple .debug\_types sections. The name is used for error reporting only, and serves to distinguish one .debug\_types section from another. ### func (\*Data) LineReader 1.5 ``` func (d *Data) LineReader(cu *Entry) (*LineReader, error) ``` LineReader returns a new reader for the line table of compilation unit cu, which must be an Entry with tag TagCompileUnit. If this compilation unit has no line table, it returns nil, nil. ### func (\*Data) Ranges 1.7 ``` func (d *Data) Ranges(e *Entry) ([][2]uint64, error) ``` Ranges returns the PC ranges covered by e, a slice of [low,high) pairs. Only some entry types, such as TagCompileUnit or TagSubprogram, have PC ranges; for others, this will return nil with no error. ### func (\*Data) Reader ``` func (d *Data) Reader() *Reader ``` Reader returns a new Reader for Data. The reader is positioned at byte offset 0 in the DWARF “info” section. ### func (\*Data) Type ``` func (d *Data) Type(off Offset) (Type, error) ``` Type reads the type at off in the DWARF “info” section. type DecodeError ---------------- ``` type DecodeError struct { Name string Offset Offset Err string } ``` ### func (DecodeError) Error ``` func (e DecodeError) Error() string ``` type DotDotDotType ------------------ A DotDotDotType represents the variadic ... function parameter. ``` type DotDotDotType struct { CommonType } ``` ### func (\*DotDotDotType) String ``` func (t *DotDotDotType) String() string ``` type Entry ---------- An entry is a sequence of attribute/value pairs. ``` type Entry struct { Offset Offset // offset of Entry in DWARF info Tag Tag // tag (kind of Entry) Children bool // whether Entry is followed by children Field []Field } ``` ### func (\*Entry) AttrField 1.5 ``` func (e *Entry) AttrField(a Attr) *Field ``` AttrField returns the Field associated with attribute Attr in Entry, or nil if there is no such attribute. ### func (\*Entry) Val ``` func (e *Entry) Val(a Attr) any ``` Val returns the value associated with attribute Attr in Entry, or nil if there is no such attribute. A common idiom is to merge the check for nil return with the check that the value has the expected dynamic type, as in: ``` v, ok := e.Val(AttrSibling).(int64) ``` type EnumType ------------- An EnumType represents an enumerated type. The only indication of its native integer type is its ByteSize (inside CommonType). ``` type EnumType struct { CommonType EnumName string Val []*EnumValue } ``` ### func (\*EnumType) String ``` func (t *EnumType) String() string ``` type EnumValue -------------- An EnumValue represents a single enumeration value. ``` type EnumValue struct { Name string Val int64 } ``` type Field ---------- A Field is a single attribute/value pair in an Entry. A value can be one of several "attribute classes" defined by DWARF. The Go types corresponding to each class are: ``` DWARF class Go type Class ----------- ------- ----- address uint64 ClassAddress block []byte ClassBlock constant int64 ClassConstant flag bool ClassFlag reference to info dwarf.Offset ClassReference to type unit uint64 ClassReferenceSig string string ClassString exprloc []byte ClassExprLoc lineptr int64 ClassLinePtr loclistptr int64 ClassLocListPtr macptr int64 ClassMacPtr rangelistptr int64 ClassRangeListPtr ``` For unrecognized or vendor-defined attributes, Class may be ClassUnknown. ``` type Field struct { Attr Attr Val any Class Class // Go 1.5 } ``` type FloatType -------------- A FloatType represents a floating point type. 
``` type FloatType struct { BasicType } ``` type FuncType ------------- A FuncType represents a function type. ``` type FuncType struct { CommonType ReturnType Type ParamType []Type } ``` ### func (\*FuncType) String ``` func (t *FuncType) String() string ``` type IntType ------------ An IntType represents a signed integer type. ``` type IntType struct { BasicType } ``` type LineEntry 1.5 ------------------ A LineEntry is a row in a DWARF line table. ``` type LineEntry struct { // Address is the program-counter value of a machine // instruction generated by the compiler. This LineEntry // applies to each instruction from Address to just before the // Address of the next LineEntry. Address uint64 // OpIndex is the index of an operation within a VLIW // instruction. The index of the first operation is 0. For // non-VLIW architectures, it will always be 0. Address and // OpIndex together form an operation pointer that can // reference any individual operation within the instruction // stream. OpIndex int // File is the source file corresponding to these // instructions. File *LineFile // Line is the source code line number corresponding to these // instructions. Lines are numbered beginning at 1. It may be // 0 if these instructions cannot be attributed to any source // line. Line int // Column is the column number within the source line of these // instructions. Columns are numbered beginning at 1. It may // be 0 to indicate the "left edge" of the line. Column int // IsStmt indicates that Address is a recommended breakpoint // location, such as the beginning of a line, statement, or a // distinct subpart of a statement. IsStmt bool // BasicBlock indicates that Address is the beginning of a // basic block. BasicBlock bool // PrologueEnd indicates that Address is one (of possibly // many) PCs where execution should be suspended for a // breakpoint on entry to the containing function. // // Added in DWARF 3. PrologueEnd bool // EpilogueBegin indicates that Address is one (of possibly // many) PCs where execution should be suspended for a // breakpoint on exit from this function. // // Added in DWARF 3. EpilogueBegin bool // ISA is the instruction set architecture for these // instructions. Possible ISA values should be defined by the // applicable ABI specification. // // Added in DWARF 3. ISA int // Discriminator is an arbitrary integer indicating the block // to which these instructions belong. It serves to // distinguish among multiple blocks that may all have with // the same source file, line, and column. Where only one // block exists for a given source position, it should be 0. // // Added in DWARF 3. Discriminator int // EndSequence indicates that Address is the first byte after // the end of a sequence of target machine instructions. If it // is set, only this and the Address field are meaningful. A // line number table may contain information for multiple // potentially disjoint instruction sequences. The last entry // in a line table should always have EndSequence set. EndSequence bool } ``` type LineFile 1.5 ----------------- A LineFile is a source file referenced by a DWARF line table entry. ``` type LineFile struct { Name string Mtime uint64 // Implementation defined modification time, or 0 if unknown Length int // File length, or 0 if unknown } ``` type LineReader 1.5 ------------------- A LineReader reads a sequence of LineEntry structures from a DWARF "line" section for a single compilation unit. 
LineEntries occur in order of increasing PC and each LineEntry gives metadata for the instructions from that LineEntry's PC to just before the next LineEntry's PC. The last entry will have its EndSequence field set. ``` type LineReader struct { // contains filtered or unexported fields } ``` ### func (\*LineReader) Files 1.14 ``` func (r *LineReader) Files() []*LineFile ``` Files returns the file name table of this compilation unit as of the current position in the line table. The file name table may be referenced from attributes in this compilation unit such as AttrDeclFile. Entry 0 is always nil, since file index 0 represents "no file". The file name table of a compilation unit is not fixed. Files returns the file table as of the current position in the line table. This may contain more entries than the file table at an earlier position in the line table, though existing entries never change. ### func (\*LineReader) Next 1.5 ``` func (r *LineReader) Next(entry *LineEntry) error ``` Next sets \*entry to the next row in this line table and moves to the next row. If there are no more entries and the line table is properly terminated, it returns io.EOF. Rows are always in order of increasing entry.Address, but entry.Line may go forward or backward. ### func (\*LineReader) Reset 1.5 ``` func (r *LineReader) Reset() ``` Reset repositions the line table reader at the beginning of the line table. ### func (\*LineReader) Seek 1.5 ``` func (r *LineReader) Seek(pos LineReaderPos) ``` Seek restores the line table reader to a position returned by Tell. The argument pos must have been returned by a call to Tell on this line table. ### func (\*LineReader) SeekPC 1.5 ``` func (r *LineReader) SeekPC(pc uint64, entry *LineEntry) error ``` SeekPC sets \*entry to the LineEntry that includes pc and positions the reader on the next entry in the line table. If necessary, this will seek backwards to find pc. If pc is not covered by any entry in this line table, SeekPC returns ErrUnknownPC. In this case, \*entry and the final seek position are unspecified. Note that DWARF line tables only permit sequential, forward scans. Hence, in the worst case, this takes time linear in the size of the line table. If the caller wishes to do repeated fast PC lookups, it should build an appropriate index of the line table. ### func (\*LineReader) Tell 1.5 ``` func (r *LineReader) Tell() LineReaderPos ``` Tell returns the current position in the line table. type LineReaderPos 1.5 ---------------------- A LineReaderPos represents a position in a line table. ``` type LineReaderPos struct { // contains filtered or unexported fields } ``` type Offset ----------- An Offset represents the location of an Entry within the DWARF info. (See Reader.Seek.) ``` type Offset uint32 ``` type PtrType ------------ A PtrType represents a pointer type. ``` type PtrType struct { CommonType Type Type } ``` ### func (\*PtrType) String ``` func (t *PtrType) String() string ``` type QualType ------------- A QualType represents a type that has the C/C++ "const", "restrict", or "volatile" qualifier. ``` type QualType struct { CommonType Qual string Type Type } ``` ### func (\*QualType) Size ``` func (t *QualType) Size() int64 ``` ### func (\*QualType) String ``` func (t *QualType) String() string ``` type Reader ----------- A Reader allows reading Entry structures from a DWARF “info” section. The Entry structures are arranged in a tree. The Reader's Next function returns successive entries from a pre-order traversal of the tree. 
If an entry has children, its Children field will be true, and the children follow, terminated by an Entry with Tag 0. ``` type Reader struct { // contains filtered or unexported fields } ``` ### func (\*Reader) AddressSize 1.5 ``` func (r *Reader) AddressSize() int ``` AddressSize returns the size in bytes of addresses in the current compilation unit. ### func (\*Reader) ByteOrder 1.14 ``` func (r *Reader) ByteOrder() binary.ByteOrder ``` ByteOrder returns the byte order in the current compilation unit. ### func (\*Reader) Next ``` func (r *Reader) Next() (*Entry, error) ``` Next reads the next entry from the encoded entry stream. It returns nil, nil when it reaches the end of the section. It returns an error if the current offset is invalid or the data at the offset cannot be decoded as a valid Entry. ### func (\*Reader) Seek ``` func (r *Reader) Seek(off Offset) ``` Seek positions the Reader at offset off in the encoded entry stream. Offset 0 can be used to denote the first entry. ### func (\*Reader) SeekPC 1.7 ``` func (r *Reader) SeekPC(pc uint64) (*Entry, error) ``` SeekPC returns the Entry for the compilation unit that includes pc, and positions the reader to read the children of that unit. If pc is not covered by any unit, SeekPC returns ErrUnknownPC and the position of the reader is undefined. Because compilation units can describe multiple regions of the executable, in the worst case SeekPC must search through all the ranges in all the compilation units. Each call to SeekPC starts the search at the compilation unit of the last call, so in general looking up a series of PCs will be faster if they are sorted. If the caller wishes to do repeated fast PC lookups, it should build an appropriate index using the Ranges method. ### func (\*Reader) SkipChildren ``` func (r *Reader) SkipChildren() ``` SkipChildren skips over the child entries associated with the last Entry returned by Next. If that Entry did not have children or Next has not been called, SkipChildren is a no-op. type StructField ---------------- A StructField represents a field in a struct, union, or C++ class type. ### Bit Fields The BitSize, BitOffset, and DataBitOffset fields describe the bit size and offset of data members declared as bit fields in C/C++ struct/union/class types. BitSize is the number of bits in the bit field. DataBitOffset, if non-zero, is the number of bits from the start of the enclosing entity (e.g. containing struct/class/union) to the start of the bit field. This corresponds to the DW\_AT\_data\_bit\_offset DWARF attribute that was introduced in DWARF 4. BitOffset, if non-zero, is the number of bits between the most significant bit of the storage unit holding the bit field to the most significant bit of the bit field. Here "storage unit" is the type name before the bit field (for a field "unsigned x:17", the storage unit is "unsigned"). BitOffset values can vary depending on the endianness of the system. BitOffset corresponds to the DW\_AT\_bit\_offset DWARF attribute that was deprecated in DWARF 4 and removed in DWARF 5. At most one of DataBitOffset and BitOffset will be non-zero; DataBitOffset/BitOffset will only be non-zero if BitSize is non-zero. Whether a C compiler uses one or the other will depend on compiler vintage and command line options. Here is an example of C/C++ bit field use, along with what to expect in terms of DWARF bit offset info. 
Consider this code: ``` struct S { int q; int j:5; int k:6; int m:5; int n:8; } s; ``` For the code above, one would expect to see the following for DW\_AT\_bit\_offset values (using GCC 8): ``` Little | Big Endian | Endian | "j": 27 | 0 "k": 21 | 5 "m": 16 | 11 "n": 8 | 16 ``` Note that in the above the offsets are purely with respect to the containing storage unit for j/k/m/n -- these values won't vary based on the size of prior data members in the containing struct. If the compiler emits DW\_AT\_data\_bit\_offset, the expected values would be: ``` "j": 32 "k": 37 "m": 43 "n": 48 ``` Here the value 32 for "j" reflects the fact that the bit field is preceded by other data members (recall that DW\_AT\_data\_bit\_offset values are relative to the start of the containing struct). Hence DW\_AT\_data\_bit\_offset values can be quite large for structs with many fields. DWARF also allow for the possibility of base types that have non-zero bit size and bit offset, so this information is also captured for base types, but it is worth noting that it is not possible to trigger this behavior using mainstream languages. ``` type StructField struct { Name string Type Type ByteOffset int64 ByteSize int64 // usually zero; use Type.Size() for normal fields BitOffset int64 DataBitOffset int64 // Go 1.18 BitSize int64 // zero if not a bit field } ``` type StructType --------------- A StructType represents a struct, union, or C++ class type. ``` type StructType struct { CommonType StructName string Kind string // "struct", "union", or "class". Field []*StructField Incomplete bool // if true, struct, union, class is declared but not defined } ``` ### func (\*StructType) Defn ``` func (t *StructType) Defn() string ``` ### func (\*StructType) String ``` func (t *StructType) String() string ``` type Tag -------- A Tag is the classification (the type) of an Entry. ``` type Tag uint32 ``` ``` const ( TagArrayType Tag = 0x01 TagClassType Tag = 0x02 TagEntryPoint Tag = 0x03 TagEnumerationType Tag = 0x04 TagFormalParameter Tag = 0x05 TagImportedDeclaration Tag = 0x08 TagLabel Tag = 0x0A TagLexDwarfBlock Tag = 0x0B TagMember Tag = 0x0D TagPointerType Tag = 0x0F TagReferenceType Tag = 0x10 TagCompileUnit Tag = 0x11 TagStringType Tag = 0x12 TagStructType Tag = 0x13 TagSubroutineType Tag = 0x15 TagTypedef Tag = 0x16 TagUnionType Tag = 0x17 TagUnspecifiedParameters Tag = 0x18 TagVariant Tag = 0x19 TagCommonDwarfBlock Tag = 0x1A TagCommonInclusion Tag = 0x1B TagInheritance Tag = 0x1C TagInlinedSubroutine Tag = 0x1D TagModule Tag = 0x1E TagPtrToMemberType Tag = 0x1F TagSetType Tag = 0x20 TagSubrangeType Tag = 0x21 TagWithStmt Tag = 0x22 TagAccessDeclaration Tag = 0x23 TagBaseType Tag = 0x24 TagCatchDwarfBlock Tag = 0x25 TagConstType Tag = 0x26 TagConstant Tag = 0x27 TagEnumerator Tag = 0x28 TagFileType Tag = 0x29 TagFriend Tag = 0x2A TagNamelist Tag = 0x2B TagNamelistItem Tag = 0x2C TagPackedType Tag = 0x2D TagSubprogram Tag = 0x2E TagTemplateTypeParameter Tag = 0x2F TagTemplateValueParameter Tag = 0x30 TagThrownType Tag = 0x31 TagTryDwarfBlock Tag = 0x32 TagVariantPart Tag = 0x33 TagVariable Tag = 0x34 TagVolatileType Tag = 0x35 // The following are new in DWARF 3. TagDwarfProcedure Tag = 0x36 TagRestrictType Tag = 0x37 TagInterfaceType Tag = 0x38 TagNamespace Tag = 0x39 TagImportedModule Tag = 0x3A TagUnspecifiedType Tag = 0x3B TagPartialUnit Tag = 0x3C TagImportedUnit Tag = 0x3D TagMutableType Tag = 0x3E // Later removed from DWARF. TagCondition Tag = 0x3F TagSharedType Tag = 0x40 // The following are new in DWARF 4. 
TagTypeUnit Tag = 0x41 TagRvalueReferenceType Tag = 0x42 TagTemplateAlias Tag = 0x43 // The following are new in DWARF 5. TagCoarrayType Tag = 0x44 TagGenericSubrange Tag = 0x45 TagDynamicType Tag = 0x46 TagAtomicType Tag = 0x47 TagCallSite Tag = 0x48 TagCallSiteParameter Tag = 0x49 TagSkeletonUnit Tag = 0x4A TagImmutableType Tag = 0x4B ) ``` ### func (Tag) GoString ``` func (t Tag) GoString() string ``` ### func (Tag) String ``` func (i Tag) String() string ``` type Type --------- A Type conventionally represents a pointer to any of the specific Type structures (CharType, StructType, etc.). ``` type Type interface { Common() *CommonType String() string Size() int64 } ``` type TypedefType ---------------- A TypedefType represents a named type. ``` type TypedefType struct { CommonType Type Type } ``` ### func (\*TypedefType) Size ``` func (t *TypedefType) Size() int64 ``` ### func (\*TypedefType) String ``` func (t *TypedefType) String() string ``` type UcharType -------------- A UcharType represents an unsigned character type. ``` type UcharType struct { BasicType } ``` type UintType ------------- A UintType represents an unsigned integer type. ``` type UintType struct { BasicType } ``` type UnspecifiedType 1.4 ------------------------ An UnspecifiedType represents an implicit, unknown, ambiguous or nonexistent type. ``` type UnspecifiedType struct { BasicType } ``` type UnsupportedType 1.13 ------------------------- An UnsupportedType is a placeholder returned in situations where we encounter a type that isn't supported. ``` type UnsupportedType struct { CommonType Tag Tag } ``` ### func (\*UnsupportedType) String 1.13 ``` func (t *UnsupportedType) String() string ``` type VoidType ------------- A VoidType represents the C void type. ``` type VoidType struct { CommonType } ``` ### func (\*VoidType) String ``` func (t *VoidType) String() string ```
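As a minimal sketch of how these types fit together, the program below obtains a Data via debug/elf (debug/macho and debug/pe work the same way), walks the “info” section with a Reader, and prints the name of each compilation unit while skipping its children. The path is a placeholder and the binary is assumed to carry DWARF debug information.

```
package main

import (
	"debug/dwarf"
	"debug/elf"
	"fmt"
	"log"
)

func main() {
	// Placeholder path; substitute a binary built with debug info.
	f, err := elf.Open("./a.out")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	d, err := f.DWARF()
	if err != nil {
		log.Fatal(err)
	}

	r := d.Reader()
	for {
		e, err := r.Next()
		if err != nil {
			log.Fatal(err)
		}
		if e == nil {
			break // end of the "info" section
		}
		if e.Tag == dwarf.TagCompileUnit {
			name, _ := e.Val(dwarf.AttrName).(string)
			fmt.Println("compile unit:", name)
			r.SkipChildren()
		}
	}
}
```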
go Package gosym Package gosym ============== * `import "debug/gosym"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package gosym implements access to the Go symbol and line number tables embedded in Go binaries generated by the gc compilers. Index ----- * [type DecodingError](#DecodingError) * [func (e \*DecodingError) Error() string](#DecodingError.Error) * [type Func](#Func) * [type LineTable](#LineTable) * [func NewLineTable(data []byte, text uint64) \*LineTable](#NewLineTable) * [func (t \*LineTable) LineToPC(line int, maxpc uint64) uint64](#LineTable.LineToPC) * [func (t \*LineTable) PCToLine(pc uint64) int](#LineTable.PCToLine) * [type Obj](#Obj) * [type Sym](#Sym) * [func (s \*Sym) BaseName() string](#Sym.BaseName) * [func (s \*Sym) PackageName() string](#Sym.PackageName) * [func (s \*Sym) ReceiverName() string](#Sym.ReceiverName) * [func (s \*Sym) Static() bool](#Sym.Static) * [type Table](#Table) * [func NewTable(symtab []byte, pcln \*LineTable) (\*Table, error)](#NewTable) * [func (t \*Table) LineToPC(file string, line int) (pc uint64, fn \*Func, err error)](#Table.LineToPC) * [func (t \*Table) LookupFunc(name string) \*Func](#Table.LookupFunc) * [func (t \*Table) LookupSym(name string) \*Sym](#Table.LookupSym) * [func (t \*Table) PCToFunc(pc uint64) \*Func](#Table.PCToFunc) * [func (t \*Table) PCToLine(pc uint64) (file string, line int, fn \*Func)](#Table.PCToLine) * [func (t \*Table) SymByAddr(addr uint64) \*Sym](#Table.SymByAddr) * [type UnknownFileError](#UnknownFileError) * [func (e UnknownFileError) Error() string](#UnknownFileError.Error) * [type UnknownLineError](#UnknownLineError) * [func (e \*UnknownLineError) Error() string](#UnknownLineError.Error) ### Package files pclntab.go symtab.go type DecodingError ------------------ DecodingError represents an error during the decoding of the symbol table. ``` type DecodingError struct { // contains filtered or unexported fields } ``` ### func (\*DecodingError) Error ``` func (e *DecodingError) Error() string ``` type Func --------- A Func collects information about a single function. ``` type Func struct { Entry uint64 *Sym End uint64 Params []*Sym // nil for Go 1.3 and later binaries Locals []*Sym // nil for Go 1.3 and later binaries FrameSize int LineTable *LineTable Obj *Obj } ``` type LineTable -------------- A LineTable is a data structure mapping program counters to line numbers. In Go 1.1 and earlier, each function (represented by a Func) had its own LineTable, and the line number corresponded to a numbering of all source lines in the program, across all files. That absolute line number would then have to be converted separately to a file name and line number within the file. In Go 1.2, the format of the data changed so that there is a single LineTable for the entire program, shared by all Funcs, and there are no absolute line numbers, just line numbers within specific files. For the most part, LineTable's methods should be treated as an internal detail of the package; callers should use the methods on Table instead. ``` type LineTable struct { Data []byte PC uint64 Line int // contains filtered or unexported fields } ``` ### func NewLineTable ``` func NewLineTable(data []byte, text uint64) *LineTable ``` NewLineTable returns a new PC/line table corresponding to the encoded data. Text must be the start address of the corresponding text segment. 
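A LineTable is typically built from data pulled out of the binary with one of the debug object-file packages. The sketch below uses debug/elf and combines the result with NewTable (described below); the placeholder path and the section names .gopclntab and .text reflect common practice for gc-built ELF binaries rather than anything guaranteed by this package.

```
package main

import (
	"debug/elf"
	"debug/gosym"
	"fmt"
	"log"
)

func main() {
	// Placeholder path; substitute a Go binary built by the gc toolchain.
	f, err := elf.Open("./prog")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	pclnSec := f.Section(".gopclntab")
	textSec := f.Section(".text")
	if pclnSec == nil || textSec == nil {
		log.Fatal("missing .gopclntab or .text section")
	}
	pclnData, err := pclnSec.Data()
	if err != nil {
		log.Fatal(err)
	}

	// .gosymtab carries no symbol data for Go 1.3+ binaries,
	// so it may be empty or absent.
	var symData []byte
	if s := f.Section(".gosymtab"); s != nil {
		if symData, err = s.Data(); err != nil {
			log.Fatal(err)
		}
	}

	tab, err := gosym.NewTable(symData, gosym.NewLineTable(pclnData, textSec.Addr))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("decoded %d functions\n", len(tab.Funcs))
}
```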
### func (\*LineTable) LineToPC ``` func (t *LineTable) LineToPC(line int, maxpc uint64) uint64 ``` LineToPC returns the program counter for the given line number, considering only program counters before maxpc. Deprecated: Use Table's LineToPC method instead. ### func (\*LineTable) PCToLine ``` func (t *LineTable) PCToLine(pc uint64) int ``` PCToLine returns the line number for the given program counter. Deprecated: Use Table's PCToLine method instead. type Obj -------- An Obj represents a collection of functions in a symbol table. The exact method of division of a binary into separate Objs is an internal detail of the symbol table format. In early versions of Go each source file became a different Obj. In Go 1 and Go 1.1, each package produced one Obj for all Go sources and one Obj per C source file. In Go 1.2, there is a single Obj for the entire program. ``` type Obj struct { // Funcs is a list of functions in the Obj. Funcs []Func // In Go 1.1 and earlier, Paths is a list of symbols corresponding // to the source file names that produced the Obj. // In Go 1.2, Paths is nil. // Use the keys of Table.Files to obtain a list of source files. Paths []Sym // meta } ``` type Sym -------- A Sym represents a single symbol table entry. ``` type Sym struct { Value uint64 Type byte Name string GoType uint64 // If this symbol is a function symbol, the corresponding Func Func *Func // contains filtered or unexported fields } ``` ### func (\*Sym) BaseName ``` func (s *Sym) BaseName() string ``` BaseName returns the symbol name without the package or receiver name. ### func (\*Sym) PackageName ``` func (s *Sym) PackageName() string ``` PackageName returns the package part of the symbol name, or the empty string if there is none. ### func (\*Sym) ReceiverName ``` func (s *Sym) ReceiverName() string ``` ReceiverName returns the receiver type name of this symbol, or the empty string if there is none. A receiver name is only detected in the case that s.Name is fully-specified with a package name. ### func (\*Sym) Static ``` func (s *Sym) Static() bool ``` Static reports whether this symbol is static (not visible outside its file). type Table ---------- Table represents a Go symbol table. It stores all of the symbols decoded from the program and provides methods to translate between symbols, names, and addresses. ``` type Table struct { Syms []Sym // nil for Go 1.3 and later binaries Funcs []Func Files map[string]*Obj // for Go 1.2 and later all files map to one Obj Objs []Obj // for Go 1.2 and later only one Obj in slice // contains filtered or unexported fields } ``` ### func NewTable ``` func NewTable(symtab []byte, pcln *LineTable) (*Table, error) ``` NewTable decodes the Go symbol table (the ".gosymtab" section in ELF), returning an in-memory representation. Starting with Go 1.3, the Go symbol table no longer includes symbol data. ### func (\*Table) LineToPC ``` func (t *Table) LineToPC(file string, line int) (pc uint64, fn *Func, err error) ``` LineToPC looks up the first program counter on the given line in the named file. It returns UnknownFileError or UnknownLineError if there is an error looking up this line. ### func (\*Table) LookupFunc ``` func (t *Table) LookupFunc(name string) *Func ``` LookupFunc returns the named function, or nil if no such function is found. ### func (\*Table) LookupSym ``` func (t *Table) LookupSym(name string) *Sym ``` LookupSym returns the text, data, or bss symbol with the given name, or nil if no such symbol is found. 
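The lookup methods compose naturally with PCToLine (below). The helper sketched here assumes a Table decoded as in the earlier example; the package name symquery and the query "main.main" are purely illustrative.

```
package symquery

import (
	"debug/gosym"
	"fmt"
)

// DescribeFunc resolves a function name such as "main.main" to its
// entry point and source position using an already-decoded table.
func DescribeFunc(tab *gosym.Table, name string) error {
	fn := tab.LookupFunc(name)
	if fn == nil {
		return fmt.Errorf("%s: no such function", name)
	}
	file, line, _ := tab.PCToLine(fn.Entry)
	fmt.Printf("%s starts at %#x (%s:%d)\n", name, fn.Entry, file, line)
	return nil
}
```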
### func (\*Table) PCToFunc ``` func (t *Table) PCToFunc(pc uint64) *Func ``` PCToFunc returns the function containing the program counter pc, or nil if there is no such function. ### func (\*Table) PCToLine ``` func (t *Table) PCToLine(pc uint64) (file string, line int, fn *Func) ``` PCToLine looks up line number information for a program counter. If there is no information, it returns fn == nil. ### func (\*Table) SymByAddr ``` func (t *Table) SymByAddr(addr uint64) *Sym ``` SymByAddr returns the text, data, or bss symbol starting at the given address. type UnknownFileError --------------------- UnknownFileError represents a failure to find the specific file in the symbol table. ``` type UnknownFileError string ``` ### func (UnknownFileError) Error ``` func (e UnknownFileError) Error() string ``` type UnknownLineError --------------------- UnknownLineError represents a failure to map a line to a program counter, either because the line is beyond the bounds of the file or because there is no code on the given line. ``` type UnknownLineError struct { File string Line int } ``` ### func (\*UnknownLineError) Error ``` func (e *UnknownLineError) Error() string ``` go Package plan9obj Package plan9obj ================= * `import "debug/plan9obj"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package plan9obj implements access to Plan 9 a.out object files. ### Security This package is not designed to be hardened against adversarial inputs, and is outside the scope of <https://go.dev/security/policy>. In particular, only basic validation is done when parsing object files. As such, care should be taken when parsing untrusted inputs, as parsing malformed files may consume significant resources, or cause panics. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [type File](#File) * [func NewFile(r io.ReaderAt) (\*File, error)](#NewFile) * [func Open(name string) (\*File, error)](#Open) * [func (f \*File) Close() error](#File.Close) * [func (f \*File) Section(name string) \*Section](#File.Section) * [func (f \*File) Symbols() ([]Sym, error)](#File.Symbols) * [type FileHeader](#FileHeader) * [type Section](#Section) * [func (s \*Section) Data() ([]byte, error)](#Section.Data) * [func (s \*Section) Open() io.ReadSeeker](#Section.Open) * [type SectionHeader](#SectionHeader) * [type Sym](#Sym) ### Package files file.go plan9obj.go Constants --------- ``` const ( Magic64 = 0x8000 // 64-bit expanded header Magic386 = (4*11+0)*11 + 7 MagicAMD64 = (4*26+0)*26 + 7 + Magic64 MagicARM = (4*20+0)*20 + 7 ) ``` Variables --------- ErrNoSymbols is returned by File.Symbols if there is no such section in the File. ``` var ErrNoSymbols = errors.New("no symbol section") ``` type File 1.3 ------------- A File represents an open Plan 9 a.out file. ``` type File struct { FileHeader Sections []*Section // contains filtered or unexported fields } ``` ### func NewFile 1.3 ``` func NewFile(r io.ReaderAt) (*File, error) ``` NewFile creates a new File for accessing a Plan 9 binary in an underlying reader. The Plan 9 binary is expected to start at position 0 in the ReaderAt. ### func Open 1.3 ``` func Open(name string) (*File, error) ``` Open opens the named file using os.Open and prepares it for use as a Plan 9 a.out binary. ### func (\*File) Close 1.3 ``` func (f *File) Close() error ``` Close closes the File. If the File was created using NewFile directly instead of Open, Close has no effect. 
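As a minimal sketch of the API above, the program below opens a Plan 9 a.out binary and lists its sections. The path is a placeholder; a binary built with GOOS=plan9, for example, could be used.

```
package main

import (
	"debug/plan9obj"
	"fmt"
	"log"
)

func main() {
	// Placeholder path; substitute a Plan 9 a.out binary.
	f, err := plan9obj.Open("./8.out")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	fmt.Printf("magic %#x, entry %#x\n", f.Magic, f.Entry)
	for _, s := range f.Sections {
		fmt.Printf("%-8s %8d bytes at offset %d\n", s.Name, s.Size, s.Offset)
	}
}
```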
### func (\*File) Section 1.3 ``` func (f *File) Section(name string) *Section ``` Section returns a section with the given name, or nil if no such section exists. ### func (\*File) Symbols 1.3 ``` func (f *File) Symbols() ([]Sym, error) ``` Symbols returns the symbol table for f. type FileHeader 1.3 ------------------- A FileHeader represents a Plan 9 a.out file header. ``` type FileHeader struct { Magic uint32 Bss uint32 Entry uint64 PtrSize int LoadAddress uint64 // Go 1.4 HdrSize uint64 // Go 1.4 } ``` type Section 1.3 ---------------- A Section represents a single section in a Plan 9 a.out file. ``` type Section struct { SectionHeader // Embed ReaderAt for ReadAt method. // Do not embed SectionReader directly // to avoid having Read and Seek. // If a client wants Read and Seek it must use // Open() to avoid fighting over the seek offset // with other clients. io.ReaderAt // contains filtered or unexported fields } ``` ### func (\*Section) Data 1.3 ``` func (s *Section) Data() ([]byte, error) ``` Data reads and returns the contents of the Plan 9 a.out section. ### func (\*Section) Open 1.3 ``` func (s *Section) Open() io.ReadSeeker ``` Open returns a new ReadSeeker reading the Plan 9 a.out section. type SectionHeader 1.3 ---------------------- A SectionHeader represents a single Plan 9 a.out section header. This structure doesn't exist on-disk, but eases navigation through the object file. ``` type SectionHeader struct { Name string Size uint32 Offset uint32 } ``` type Sym 1.3 ------------ A Symbol represents an entry in a Plan 9 a.out symbol table section. ``` type Sym struct { Value uint64 Type rune Name string } ``` go Package buildinfo Package buildinfo ================== * `import "debug/buildinfo"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package buildinfo provides access to information embedded in a Go binary about how it was built. This includes the Go toolchain version, and the set of modules used (for binaries built in module mode). Build information is available for the currently running binary in runtime/debug.ReadBuildInfo. Index ----- * [type BuildInfo](#BuildInfo) * [func Read(r io.ReaderAt) (\*BuildInfo, error)](#Read) * [func ReadFile(name string) (info \*BuildInfo, err error)](#ReadFile) ### Package files buildinfo.go type BuildInfo 1.18 ------------------- Type alias for build info. We cannot move the types here, since runtime/debug would need to import this package, which would make it a much larger dependency. ``` type BuildInfo = debug.BuildInfo ``` ### func Read 1.18 ``` func Read(r io.ReaderAt) (*BuildInfo, error) ``` Read returns build information embedded in a Go binary file accessed through the given ReaderAt. Most information is only available for binaries built with module support. ### func ReadFile 1.18 ``` func ReadFile(name string) (info *BuildInfo, err error) ``` ReadFile returns build information embedded in a Go binary file at the given path. Most information is only available for binaries built with module support. go Package elf Package elf ============ * `import "debug/elf"` * [Overview](#pkg-overview) * [Index](#pkg-index) Overview -------- Package elf implements access to ELF object files. ### Security This package is not designed to be hardened against adversarial inputs, and is outside the scope of <https://go.dev/security/policy>. In particular, only basic validation is done when parsing object files. 
As such, care should be taken when parsing untrusted inputs, as parsing malformed files may consume significant resources, or cause panics. Index ----- * [Constants](#pkg-constants) * [Variables](#pkg-variables) * [func R\_INFO(sym, typ uint32) uint64](#R_INFO) * [func R\_INFO32(sym, typ uint32) uint32](#R_INFO32) * [func R\_SYM32(info uint32) uint32](#R_SYM32) * [func R\_SYM64(info uint64) uint32](#R_SYM64) * [func R\_TYPE32(info uint32) uint32](#R_TYPE32) * [func R\_TYPE64(info uint64) uint32](#R_TYPE64) * [func ST\_INFO(bind SymBind, typ SymType) uint8](#ST_INFO) * [type Chdr32](#Chdr32) * [type Chdr64](#Chdr64) * [type Class](#Class) * [func (i Class) GoString() string](#Class.GoString) * [func (i Class) String() string](#Class.String) * [type CompressionType](#CompressionType) * [func (i CompressionType) GoString() string](#CompressionType.GoString) * [func (i CompressionType) String() string](#CompressionType.String) * [type Data](#Data) * [func (i Data) GoString() string](#Data.GoString) * [func (i Data) String() string](#Data.String) * [type Dyn32](#Dyn32) * [type Dyn64](#Dyn64) * [type DynFlag](#DynFlag) * [func (i DynFlag) GoString() string](#DynFlag.GoString) * [func (i DynFlag) String() string](#DynFlag.String) * [type DynTag](#DynTag) * [func (i DynTag) GoString() string](#DynTag.GoString) * [func (i DynTag) String() string](#DynTag.String) * [type File](#File) * [func NewFile(r io.ReaderAt) (\*File, error)](#NewFile) * [func Open(name string) (\*File, error)](#Open) * [func (f \*File) Close() error](#File.Close) * [func (f \*File) DWARF() (\*dwarf.Data, error)](#File.DWARF) * [func (f \*File) DynString(tag DynTag) ([]string, error)](#File.DynString) * [func (f \*File) DynamicSymbols() ([]Symbol, error)](#File.DynamicSymbols) * [func (f \*File) ImportedLibraries() ([]string, error)](#File.ImportedLibraries) * [func (f \*File) ImportedSymbols() ([]ImportedSymbol, error)](#File.ImportedSymbols) * [func (f \*File) Section(name string) \*Section](#File.Section) * [func (f \*File) SectionByType(typ SectionType) \*Section](#File.SectionByType) * [func (f \*File) Symbols() ([]Symbol, error)](#File.Symbols) * [type FileHeader](#FileHeader) * [type FormatError](#FormatError) * [func (e \*FormatError) Error() string](#FormatError.Error) * [type Header32](#Header32) * [type Header64](#Header64) * [type ImportedSymbol](#ImportedSymbol) * [type Machine](#Machine) * [func (i Machine) GoString() string](#Machine.GoString) * [func (i Machine) String() string](#Machine.String) * [type NType](#NType) * [func (i NType) GoString() string](#NType.GoString) * [func (i NType) String() string](#NType.String) * [type OSABI](#OSABI) * [func (i OSABI) GoString() string](#OSABI.GoString) * [func (i OSABI) String() string](#OSABI.String) * [type Prog](#Prog) * [func (p \*Prog) Open() io.ReadSeeker](#Prog.Open) * [type Prog32](#Prog32) * [type Prog64](#Prog64) * [type ProgFlag](#ProgFlag) * [func (i ProgFlag) GoString() string](#ProgFlag.GoString) * [func (i ProgFlag) String() string](#ProgFlag.String) * [type ProgHeader](#ProgHeader) * [type ProgType](#ProgType) * [func (i ProgType) GoString() string](#ProgType.GoString) * [func (i ProgType) String() string](#ProgType.String) * [type R\_386](#R_386) * [func (i R\_386) GoString() string](#R_386.GoString) * [func (i R\_386) String() string](#R_386.String) * [type R\_390](#R_390) * [func (i R\_390) GoString() string](#R_390.GoString) * [func (i R\_390) String() string](#R_390.String) * [type R\_AARCH64](#R_AARCH64) * [func (i R\_AARCH64) GoString() 
string](#R_AARCH64.GoString) * [func (i R\_AARCH64) String() string](#R_AARCH64.String) * [type R\_ALPHA](#R_ALPHA) * [func (i R\_ALPHA) GoString() string](#R_ALPHA.GoString) * [func (i R\_ALPHA) String() string](#R_ALPHA.String) * [type R\_ARM](#R_ARM) * [func (i R\_ARM) GoString() string](#R_ARM.GoString) * [func (i R\_ARM) String() string](#R_ARM.String) * [type R\_LARCH](#R_LARCH) * [func (i R\_LARCH) GoString() string](#R_LARCH.GoString) * [func (i R\_LARCH) String() string](#R_LARCH.String) * [type R\_MIPS](#R_MIPS) * [func (i R\_MIPS) GoString() string](#R_MIPS.GoString) * [func (i R\_MIPS) String() string](#R_MIPS.String) * [type R\_PPC](#R_PPC) * [func (i R\_PPC) GoString() string](#R_PPC.GoString) * [func (i R\_PPC) String() string](#R_PPC.String) * [type R\_PPC64](#R_PPC64) * [func (i R\_PPC64) GoString() string](#R_PPC64.GoString) * [func (i R\_PPC64) String() string](#R_PPC64.String) * [type R\_RISCV](#R_RISCV) * [func (i R\_RISCV) GoString() string](#R_RISCV.GoString) * [func (i R\_RISCV) String() string](#R_RISCV.String) * [type R\_SPARC](#R_SPARC) * [func (i R\_SPARC) GoString() string](#R_SPARC.GoString) * [func (i R\_SPARC) String() string](#R_SPARC.String) * [type R\_X86\_64](#R_X86_64) * [func (i R\_X86\_64) GoString() string](#R_X86_64.GoString) * [func (i R\_X86\_64) String() string](#R_X86_64.String) * [type Rel32](#Rel32) * [type Rel64](#Rel64) * [type Rela32](#Rela32) * [type Rela64](#Rela64) * [type Section](#Section) * [func (s \*Section) Data() ([]byte, error)](#Section.Data) * [func (s \*Section) Open() io.ReadSeeker](#Section.Open) * [type Section32](#Section32) * [type Section64](#Section64) * [type SectionFlag](#SectionFlag) * [func (i SectionFlag) GoString() string](#SectionFlag.GoString) * [func (i SectionFlag) String() string](#SectionFlag.String) * [type SectionHeader](#SectionHeader) * [type SectionIndex](#SectionIndex) * [func (i SectionIndex) GoString() string](#SectionIndex.GoString) * [func (i SectionIndex) String() string](#SectionIndex.String) * [type SectionType](#SectionType) * [func (i SectionType) GoString() string](#SectionType.GoString) * [func (i SectionType) String() string](#SectionType.String) * [type Sym32](#Sym32) * [type Sym64](#Sym64) * [type SymBind](#SymBind) * [func ST\_BIND(info uint8) SymBind](#ST_BIND) * [func (i SymBind) GoString() string](#SymBind.GoString) * [func (i SymBind) String() string](#SymBind.String) * [type SymType](#SymType) * [func ST\_TYPE(info uint8) SymType](#ST_TYPE) * [func (i SymType) GoString() string](#SymType.GoString) * [func (i SymType) String() string](#SymType.String) * [type SymVis](#SymVis) * [func ST\_VISIBILITY(other uint8) SymVis](#ST_VISIBILITY) * [func (i SymVis) GoString() string](#SymVis.GoString) * [func (i SymVis) String() string](#SymVis.String) * [type Symbol](#Symbol) * [type Type](#Type) * [func (i Type) GoString() string](#Type.GoString) * [func (i Type) String() string](#Type.String) * [type Version](#Version) * [func (i Version) GoString() string](#Version.GoString) * [func (i Version) String() string](#Version.String) ### Package files elf.go file.go reader.go Constants --------- Indexes into the Header.Ident array. ``` const ( EI_CLASS = 4 /* Class of machine. */ EI_DATA = 5 /* Data format. */ EI_VERSION = 6 /* ELF format version. */ EI_OSABI = 7 /* Operating system / ABI identification */ EI_ABIVERSION = 8 /* ABI version */ EI_PAD = 9 /* Start of padding (per SVR4 ABI). */ EI_NIDENT = 16 /* Size of e_ident array. 
*/ ) ``` Magic number for the elf trampoline, chosen wisely to be an immediate value. ``` const ARM_MAGIC_TRAMP_NUMBER = 0x5c000003 ``` Initial magic number for ELF files. ``` const ELFMAG = "\177ELF" ``` ``` const Sym32Size = 16 ``` ``` const Sym64Size = 24 ``` Variables --------- ErrNoSymbols is returned by File.Symbols and File.DynamicSymbols if there is no such section in the File. ``` var ErrNoSymbols = errors.New("no symbol section") ``` func R\_INFO ------------ ``` func R_INFO(sym, typ uint32) uint64 ``` func R\_INFO32 -------------- ``` func R_INFO32(sym, typ uint32) uint32 ``` func R\_SYM32 ------------- ``` func R_SYM32(info uint32) uint32 ``` func R\_SYM64 ------------- ``` func R_SYM64(info uint64) uint32 ``` func R\_TYPE32 -------------- ``` func R_TYPE32(info uint32) uint32 ``` func R\_TYPE64 -------------- ``` func R_TYPE64(info uint64) uint32 ``` func ST\_INFO ------------- ``` func ST_INFO(bind SymBind, typ SymType) uint8 ``` type Chdr32 1.6 --------------- ELF32 Compression header. ``` type Chdr32 struct { Type uint32 Size uint32 Addralign uint32 } ``` type Chdr64 1.6 --------------- ELF64 Compression header. ``` type Chdr64 struct { Type uint32 Size uint64 Addralign uint64 // contains filtered or unexported fields } ``` type Class ---------- Class is found in Header.Ident[EI\_CLASS] and Header.Class. ``` type Class byte ``` ``` const ( ELFCLASSNONE Class = 0 /* Unknown class. */ ELFCLASS32 Class = 1 /* 32-bit architecture. */ ELFCLASS64 Class = 2 /* 64-bit architecture. */ ) ``` ### func (Class) GoString ``` func (i Class) GoString() string ``` ### func (Class) String ``` func (i Class) String() string ``` type CompressionType 1.6 ------------------------ Section compression type. ``` type CompressionType int ``` ``` const ( COMPRESS_ZLIB CompressionType = 1 /* ZLIB compression. */ COMPRESS_LOOS CompressionType = 0x60000000 /* First OS-specific. */ COMPRESS_HIOS CompressionType = 0x6fffffff /* Last OS-specific. */ COMPRESS_LOPROC CompressionType = 0x70000000 /* First processor-specific type. */ COMPRESS_HIPROC CompressionType = 0x7fffffff /* Last processor-specific type. */ ) ``` ### func (CompressionType) GoString 1.6 ``` func (i CompressionType) GoString() string ``` ### func (CompressionType) String 1.6 ``` func (i CompressionType) String() string ``` type Data --------- Data is found in Header.Ident[EI\_DATA] and Header.Data. ``` type Data byte ``` ``` const ( ELFDATANONE Data = 0 /* Unknown data format. */ ELFDATA2LSB Data = 1 /* 2's complement little-endian. */ ELFDATA2MSB Data = 2 /* 2's complement big-endian. */ ) ``` ### func (Data) GoString ``` func (i Data) GoString() string ``` ### func (Data) String ``` func (i Data) String() string ``` type Dyn32 ---------- ELF32 Dynamic structure. The ".dynamic" section contains an array of them. ``` type Dyn32 struct { Tag int32 /* Entry type. */ Val uint32 /* Integer/Address value. */ } ``` type Dyn64 ---------- ELF64 Dynamic structure. The ".dynamic" section contains an array of them. ``` type Dyn64 struct { Tag int64 /* Entry type. */ Val uint64 /* Integer/address value */ } ``` type DynFlag ------------ DT\_FLAGS values. ``` type DynFlag int ``` ``` const ( DF_ORIGIN DynFlag = 0x0001 /* Indicates that the object being loaded may make reference to the $ORIGIN substitution string */ DF_SYMBOLIC DynFlag = 0x0002 /* Indicates "symbolic" linking. */ DF_TEXTREL DynFlag = 0x0004 /* Indicates there may be relocations in non-writable segments. 
*/ DF_BIND_NOW DynFlag = 0x0008 /* Indicates that the dynamic linker should process all relocations for the object containing this entry before transferring control to the program. */ DF_STATIC_TLS DynFlag = 0x0010 /* Indicates that the shared object or executable contains code using a static thread-local storage scheme. */ ) ``` ### func (DynFlag) GoString ``` func (i DynFlag) GoString() string ``` ### func (DynFlag) String ``` func (i DynFlag) String() string ``` type DynTag ----------- Dyn.Tag ``` type DynTag int ``` ``` const ( DT_NULL DynTag = 0 /* Terminating entry. */ DT_NEEDED DynTag = 1 /* String table offset of a needed shared library. */ DT_PLTRELSZ DynTag = 2 /* Total size in bytes of PLT relocations. */ DT_PLTGOT DynTag = 3 /* Processor-dependent address. */ DT_HASH DynTag = 4 /* Address of symbol hash table. */ DT_STRTAB DynTag = 5 /* Address of string table. */ DT_SYMTAB DynTag = 6 /* Address of symbol table. */ DT_RELA DynTag = 7 /* Address of ElfNN_Rela relocations. */ DT_RELASZ DynTag = 8 /* Total size of ElfNN_Rela relocations. */ DT_RELAENT DynTag = 9 /* Size of each ElfNN_Rela relocation entry. */ DT_STRSZ DynTag = 10 /* Size of string table. */ DT_SYMENT DynTag = 11 /* Size of each symbol table entry. */ DT_INIT DynTag = 12 /* Address of initialization function. */ DT_FINI DynTag = 13 /* Address of finalization function. */ DT_SONAME DynTag = 14 /* String table offset of shared object name. */ DT_RPATH DynTag = 15 /* String table offset of library path. [sup] */ DT_SYMBOLIC DynTag = 16 /* Indicates "symbolic" linking. [sup] */ DT_REL DynTag = 17 /* Address of ElfNN_Rel relocations. */ DT_RELSZ DynTag = 18 /* Total size of ElfNN_Rel relocations. */ DT_RELENT DynTag = 19 /* Size of each ElfNN_Rel relocation. */ DT_PLTREL DynTag = 20 /* Type of relocation used for PLT. */ DT_DEBUG DynTag = 21 /* Reserved (not used). */ DT_TEXTREL DynTag = 22 /* Indicates there may be relocations in non-writable segments. [sup] */ DT_JMPREL DynTag = 23 /* Address of PLT relocations. */ DT_BIND_NOW DynTag = 24 /* [sup] */ DT_INIT_ARRAY DynTag = 25 /* Address of the array of pointers to initialization functions */ DT_FINI_ARRAY DynTag = 26 /* Address of the array of pointers to termination functions */ DT_INIT_ARRAYSZ DynTag = 27 /* Size in bytes of the array of initialization functions. */ DT_FINI_ARRAYSZ DynTag = 28 /* Size in bytes of the array of termination functions. */ DT_RUNPATH DynTag = 29 /* String table offset of a null-terminated library search path string. */ DT_FLAGS DynTag = 30 /* Object specific flag values. */ DT_ENCODING DynTag = 32 /* Values greater than or equal to DT_ENCODING and less than DT_LOOS follow the rules for the interpretation of the d_un union as follows: even == 'd_ptr', even == 'd_val' or none */ DT_PREINIT_ARRAY DynTag = 32 /* Address of the array of pointers to pre-initialization functions. */ DT_PREINIT_ARRAYSZ DynTag = 33 /* Size in bytes of the array of pre-initialization functions. */ DT_SYMTAB_SHNDX DynTag = 34 /* Address of SHT_SYMTAB_SHNDX section. 
*/ DT_LOOS DynTag = 0x6000000d /* First OS-specific */ DT_HIOS DynTag = 0x6ffff000 /* Last OS-specific */ DT_VALRNGLO DynTag = 0x6ffffd00 DT_GNU_PRELINKED DynTag = 0x6ffffdf5 DT_GNU_CONFLICTSZ DynTag = 0x6ffffdf6 DT_GNU_LIBLISTSZ DynTag = 0x6ffffdf7 DT_CHECKSUM DynTag = 0x6ffffdf8 DT_PLTPADSZ DynTag = 0x6ffffdf9 DT_MOVEENT DynTag = 0x6ffffdfa DT_MOVESZ DynTag = 0x6ffffdfb DT_FEATURE DynTag = 0x6ffffdfc DT_POSFLAG_1 DynTag = 0x6ffffdfd DT_SYMINSZ DynTag = 0x6ffffdfe DT_SYMINENT DynTag = 0x6ffffdff DT_VALRNGHI DynTag = 0x6ffffdff DT_ADDRRNGLO DynTag = 0x6ffffe00 DT_GNU_HASH DynTag = 0x6ffffef5 DT_TLSDESC_PLT DynTag = 0x6ffffef6 DT_TLSDESC_GOT DynTag = 0x6ffffef7 DT_GNU_CONFLICT DynTag = 0x6ffffef8 DT_GNU_LIBLIST DynTag = 0x6ffffef9 DT_CONFIG DynTag = 0x6ffffefa DT_DEPAUDIT DynTag = 0x6ffffefb DT_AUDIT DynTag = 0x6ffffefc DT_PLTPAD DynTag = 0x6ffffefd DT_MOVETAB DynTag = 0x6ffffefe DT_SYMINFO DynTag = 0x6ffffeff DT_ADDRRNGHI DynTag = 0x6ffffeff DT_VERSYM DynTag = 0x6ffffff0 DT_RELACOUNT DynTag = 0x6ffffff9 DT_RELCOUNT DynTag = 0x6ffffffa DT_FLAGS_1 DynTag = 0x6ffffffb DT_VERDEF DynTag = 0x6ffffffc DT_VERDEFNUM DynTag = 0x6ffffffd DT_VERNEED DynTag = 0x6ffffffe DT_VERNEEDNUM DynTag = 0x6fffffff DT_LOPROC DynTag = 0x70000000 /* First processor-specific type. */ DT_MIPS_RLD_VERSION DynTag = 0x70000001 DT_MIPS_TIME_STAMP DynTag = 0x70000002 DT_MIPS_ICHECKSUM DynTag = 0x70000003 DT_MIPS_IVERSION DynTag = 0x70000004 DT_MIPS_FLAGS DynTag = 0x70000005 DT_MIPS_BASE_ADDRESS DynTag = 0x70000006 DT_MIPS_MSYM DynTag = 0x70000007 DT_MIPS_CONFLICT DynTag = 0x70000008 DT_MIPS_LIBLIST DynTag = 0x70000009 DT_MIPS_LOCAL_GOTNO DynTag = 0x7000000a DT_MIPS_CONFLICTNO DynTag = 0x7000000b DT_MIPS_LIBLISTNO DynTag = 0x70000010 DT_MIPS_SYMTABNO DynTag = 0x70000011 DT_MIPS_UNREFEXTNO DynTag = 0x70000012 DT_MIPS_GOTSYM DynTag = 0x70000013 DT_MIPS_HIPAGENO DynTag = 0x70000014 DT_MIPS_RLD_MAP DynTag = 0x70000016 DT_MIPS_DELTA_CLASS DynTag = 0x70000017 DT_MIPS_DELTA_CLASS_NO DynTag = 0x70000018 DT_MIPS_DELTA_INSTANCE DynTag = 0x70000019 DT_MIPS_DELTA_INSTANCE_NO DynTag = 0x7000001a DT_MIPS_DELTA_RELOC DynTag = 0x7000001b DT_MIPS_DELTA_RELOC_NO DynTag = 0x7000001c DT_MIPS_DELTA_SYM DynTag = 0x7000001d DT_MIPS_DELTA_SYM_NO DynTag = 0x7000001e DT_MIPS_DELTA_CLASSSYM DynTag = 0x70000020 DT_MIPS_DELTA_CLASSSYM_NO DynTag = 0x70000021 DT_MIPS_CXX_FLAGS DynTag = 0x70000022 DT_MIPS_PIXIE_INIT DynTag = 0x70000023 DT_MIPS_SYMBOL_LIB DynTag = 0x70000024 DT_MIPS_LOCALPAGE_GOTIDX DynTag = 0x70000025 DT_MIPS_LOCAL_GOTIDX DynTag = 0x70000026 DT_MIPS_HIDDEN_GOTIDX DynTag = 0x70000027 DT_MIPS_PROTECTED_GOTIDX DynTag = 0x70000028 DT_MIPS_OPTIONS DynTag = 0x70000029 DT_MIPS_INTERFACE DynTag = 0x7000002a DT_MIPS_DYNSTR_ALIGN DynTag = 0x7000002b DT_MIPS_INTERFACE_SIZE DynTag = 0x7000002c DT_MIPS_RLD_TEXT_RESOLVE_ADDR DynTag = 0x7000002d DT_MIPS_PERF_SUFFIX DynTag = 0x7000002e DT_MIPS_COMPACT_SIZE DynTag = 0x7000002f DT_MIPS_GP_VALUE DynTag = 0x70000030 DT_MIPS_AUX_DYNAMIC DynTag = 0x70000031 DT_MIPS_PLTGOT DynTag = 0x70000032 DT_MIPS_RWPLT DynTag = 0x70000034 DT_MIPS_RLD_MAP_REL DynTag = 0x70000035 DT_PPC_GOT DynTag = 0x70000000 DT_PPC_OPT DynTag = 0x70000001 DT_PPC64_GLINK DynTag = 0x70000000 DT_PPC64_OPD DynTag = 0x70000001 DT_PPC64_OPDSZ DynTag = 0x70000002 DT_PPC64_OPT DynTag = 0x70000003 DT_SPARC_REGISTER DynTag = 0x70000001 DT_AUXILIARY DynTag = 0x7ffffffd DT_USED DynTag = 0x7ffffffe DT_FILTER DynTag = 0x7fffffff DT_HIPROC DynTag = 0x7fffffff /* Last processor-specific type. 
*/ ) ``` ### func (DynTag) GoString ``` func (i DynTag) GoString() string ``` ### func (DynTag) String ``` func (i DynTag) String() string ``` type File --------- A File represents an open ELF file. ``` type File struct { FileHeader Sections []*Section Progs []*Prog // contains filtered or unexported fields } ``` ### func NewFile ``` func NewFile(r io.ReaderAt) (*File, error) ``` NewFile creates a new File for accessing an ELF binary in an underlying reader. The ELF binary is expected to start at position 0 in the ReaderAt. ### func Open ``` func Open(name string) (*File, error) ``` Open opens the named file using os.Open and prepares it for use as an ELF binary. ### func (\*File) Close ``` func (f *File) Close() error ``` Close closes the File. If the File was created using NewFile directly instead of Open, Close has no effect. ### func (\*File) DWARF ``` func (f *File) DWARF() (*dwarf.Data, error) ``` ### func (\*File) DynString 1.1 ``` func (f *File) DynString(tag DynTag) ([]string, error) ``` DynString returns the strings listed for the given tag in the file's dynamic section. The tag must be one that takes string values: DT\_NEEDED, DT\_SONAME, DT\_RPATH, or DT\_RUNPATH. ### func (\*File) DynamicSymbols 1.4 ``` func (f *File) DynamicSymbols() ([]Symbol, error) ``` DynamicSymbols returns the dynamic symbol table for f. The symbols will be listed in the order they appear in f. If f has a symbol version table, the returned Symbols will have initialized Version and Library fields. For compatibility with Symbols, DynamicSymbols omits the null symbol at index 0. After retrieving the symbols as symtab, an externally supplied index x corresponds to symtab[x-1], not symtab[x]. ### func (\*File) ImportedLibraries ``` func (f *File) ImportedLibraries() ([]string, error) ``` ImportedLibraries returns the names of all libraries referred to by the binary f that are expected to be linked with the binary at dynamic link time. ### func (\*File) ImportedSymbols ``` func (f *File) ImportedSymbols() ([]ImportedSymbol, error) ``` ImportedSymbols returns the names of all symbols referred to by the binary f that are expected to be satisfied by other libraries at dynamic load time. It does not return weak symbols. ### func (\*File) Section ``` func (f *File) Section(name string) *Section ``` Section returns a section with the given name, or nil if no such section exists. ### func (\*File) SectionByType ``` func (f *File) SectionByType(typ SectionType) *Section ``` SectionByType returns the first section in f with the given type, or nil if there is no such section. ### func (\*File) Symbols ``` func (f *File) Symbols() ([]Symbol, error) ``` Symbols returns the symbol table for f. The symbols will be listed in the order they appear in f. For compatibility with Go 1.0, Symbols omits the null symbol at index 0. After retrieving the symbols as symtab, an externally supplied index x corresponds to symtab[x-1], not symtab[x]. type FileHeader --------------- A FileHeader represents an ELF file header. ``` type FileHeader struct { Class Class Data Data Version Version OSABI OSABI ABIVersion uint8 ByteOrder binary.ByteOrder Type Type Machine Machine Entry uint64 // Go 1.1 } ``` type FormatError ---------------- ``` type FormatError struct { // contains filtered or unexported fields } ``` ### func (\*FormatError) Error ``` func (e *FormatError) Error() string ``` type Header32 ------------- ELF32 File header. ``` type Header32 struct { Ident [EI_NIDENT]byte /* File identification. */ Type uint16 /* File type. 
*/ Machine uint16 /* Machine architecture. */ Version uint32 /* ELF format version. */ Entry uint32 /* Entry point. */ Phoff uint32 /* Program header file offset. */ Shoff uint32 /* Section header file offset. */ Flags uint32 /* Architecture-specific flags. */ Ehsize uint16 /* Size of ELF header in bytes. */ Phentsize uint16 /* Size of program header entry. */ Phnum uint16 /* Number of program header entries. */ Shentsize uint16 /* Size of section header entry. */ Shnum uint16 /* Number of section header entries. */ Shstrndx uint16 /* Section name strings section. */ } ``` type Header64 ------------- ELF64 file header. ``` type Header64 struct { Ident [EI_NIDENT]byte /* File identification. */ Type uint16 /* File type. */ Machine uint16 /* Machine architecture. */ Version uint32 /* ELF format version. */ Entry uint64 /* Entry point. */ Phoff uint64 /* Program header file offset. */ Shoff uint64 /* Section header file offset. */ Flags uint32 /* Architecture-specific flags. */ Ehsize uint16 /* Size of ELF header in bytes. */ Phentsize uint16 /* Size of program header entry. */ Phnum uint16 /* Number of program header entries. */ Shentsize uint16 /* Size of section header entry. */ Shnum uint16 /* Number of section header entries. */ Shstrndx uint16 /* Section name strings section. */ } ``` type ImportedSymbol ------------------- ``` type ImportedSymbol struct { Name string Version string Library string } ``` type Machine ------------ Machine is found in Header.Machine. ``` type Machine uint16 ``` ``` const ( EM_NONE Machine = 0 /* Unknown machine. */ EM_M32 Machine = 1 /* AT&T WE32100. */ EM_SPARC Machine = 2 /* Sun SPARC. */ EM_386 Machine = 3 /* Intel i386. */ EM_68K Machine = 4 /* Motorola 68000. */ EM_88K Machine = 5 /* Motorola 88000. */ EM_860 Machine = 7 /* Intel i860. */ EM_MIPS Machine = 8 /* MIPS R3000 Big-Endian only. */ EM_S370 Machine = 9 /* IBM System/370. */ EM_MIPS_RS3_LE Machine = 10 /* MIPS R3000 Little-Endian. */ EM_PARISC Machine = 15 /* HP PA-RISC. */ EM_VPP500 Machine = 17 /* Fujitsu VPP500. */ EM_SPARC32PLUS Machine = 18 /* SPARC v8plus. */ EM_960 Machine = 19 /* Intel 80960. */ EM_PPC Machine = 20 /* PowerPC 32-bit. */ EM_PPC64 Machine = 21 /* PowerPC 64-bit. */ EM_S390 Machine = 22 /* IBM System/390. */ EM_V800 Machine = 36 /* NEC V800. */ EM_FR20 Machine = 37 /* Fujitsu FR20. */ EM_RH32 Machine = 38 /* TRW RH-32. */ EM_RCE Machine = 39 /* Motorola RCE. */ EM_ARM Machine = 40 /* ARM. */ EM_SH Machine = 42 /* Hitachi SH. */ EM_SPARCV9 Machine = 43 /* SPARC v9 64-bit. */ EM_TRICORE Machine = 44 /* Siemens TriCore embedded processor. */ EM_ARC Machine = 45 /* Argonaut RISC Core. */ EM_H8_300 Machine = 46 /* Hitachi H8/300. */ EM_H8_300H Machine = 47 /* Hitachi H8/300H. */ EM_H8S Machine = 48 /* Hitachi H8S. */ EM_H8_500 Machine = 49 /* Hitachi H8/500. */ EM_IA_64 Machine = 50 /* Intel IA-64 Processor. */ EM_MIPS_X Machine = 51 /* Stanford MIPS-X. */ EM_COLDFIRE Machine = 52 /* Motorola ColdFire. */ EM_68HC12 Machine = 53 /* Motorola M68HC12. */ EM_MMA Machine = 54 /* Fujitsu MMA. */ EM_PCP Machine = 55 /* Siemens PCP. */ EM_NCPU Machine = 56 /* Sony nCPU. */ EM_NDR1 Machine = 57 /* Denso NDR1 microprocessor. */ EM_STARCORE Machine = 58 /* Motorola Star*Core processor. */ EM_ME16 Machine = 59 /* Toyota ME16 processor. */ EM_ST100 Machine = 60 /* STMicroelectronics ST100 processor. */ EM_TINYJ Machine = 61 /* Advanced Logic Corp. TinyJ processor. 
*/ EM_X86_64 Machine = 62 /* Advanced Micro Devices x86-64 */ EM_PDSP Machine = 63 /* Sony DSP Processor */ EM_PDP10 Machine = 64 /* Digital Equipment Corp. PDP-10 */ EM_PDP11 Machine = 65 /* Digital Equipment Corp. PDP-11 */ EM_FX66 Machine = 66 /* Siemens FX66 microcontroller */ EM_ST9PLUS Machine = 67 /* STMicroelectronics ST9+ 8/16 bit microcontroller */ EM_ST7 Machine = 68 /* STMicroelectronics ST7 8-bit microcontroller */ EM_68HC16 Machine = 69 /* Motorola MC68HC16 Microcontroller */ EM_68HC11 Machine = 70 /* Motorola MC68HC11 Microcontroller */ EM_68HC08 Machine = 71 /* Motorola MC68HC08 Microcontroller */ EM_68HC05 Machine = 72 /* Motorola MC68HC05 Microcontroller */ EM_SVX Machine = 73 /* Silicon Graphics SVx */ EM_ST19 Machine = 74 /* STMicroelectronics ST19 8-bit microcontroller */ EM_VAX Machine = 75 /* Digital VAX */ EM_CRIS Machine = 76 /* Axis Communications 32-bit embedded processor */ EM_JAVELIN Machine = 77 /* Infineon Technologies 32-bit embedded processor */ EM_FIREPATH Machine = 78 /* Element 14 64-bit DSP Processor */ EM_ZSP Machine = 79 /* LSI Logic 16-bit DSP Processor */ EM_MMIX Machine = 80 /* Donald Knuth's educational 64-bit processor */ EM_HUANY Machine = 81 /* Harvard University machine-independent object files */ EM_PRISM Machine = 82 /* SiTera Prism */ EM_AVR Machine = 83 /* Atmel AVR 8-bit microcontroller */ EM_FR30 Machine = 84 /* Fujitsu FR30 */ EM_D10V Machine = 85 /* Mitsubishi D10V */ EM_D30V Machine = 86 /* Mitsubishi D30V */ EM_V850 Machine = 87 /* NEC v850 */ EM_M32R Machine = 88 /* Mitsubishi M32R */ EM_MN10300 Machine = 89 /* Matsushita MN10300 */ EM_MN10200 Machine = 90 /* Matsushita MN10200 */ EM_PJ Machine = 91 /* picoJava */ EM_OPENRISC Machine = 92 /* OpenRISC 32-bit embedded processor */ EM_ARC_COMPACT Machine = 93 /* ARC International ARCompact processor (old spelling/synonym: EM_ARC_A5) */ EM_XTENSA Machine = 94 /* Tensilica Xtensa Architecture */ EM_VIDEOCORE Machine = 95 /* Alphamosaic VideoCore processor */ EM_TMM_GPP Machine = 96 /* Thompson Multimedia General Purpose Processor */ EM_NS32K Machine = 97 /* National Semiconductor 32000 series */ EM_TPC Machine = 98 /* Tenor Network TPC processor */ EM_SNP1K Machine = 99 /* Trebia SNP 1000 processor */ EM_ST200 Machine = 100 /* STMicroelectronics (www.st.com) ST200 microcontroller */ EM_IP2K Machine = 101 /* Ubicom IP2xxx microcontroller family */ EM_MAX Machine = 102 /* MAX Processor */ EM_CR Machine = 103 /* National Semiconductor CompactRISC microprocessor */ EM_F2MC16 Machine = 104 /* Fujitsu F2MC16 */ EM_MSP430 Machine = 105 /* Texas Instruments embedded microcontroller msp430 */ EM_BLACKFIN Machine = 106 /* Analog Devices Blackfin (DSP) processor */ EM_SE_C33 Machine = 107 /* S1C33 Family of Seiko Epson processors */ EM_SEP Machine = 108 /* Sharp embedded microprocessor */ EM_ARCA Machine = 109 /* Arca RISC Microprocessor */ EM_UNICORE Machine = 110 /* Microprocessor series from PKU-Unity Ltd. and MPRC of Peking University */ EM_EXCESS Machine = 111 /* eXcess: 16/32/64-bit configurable embedded CPU */ EM_DXP Machine = 112 /* Icera Semiconductor Inc. 
Deep Execution Processor */ EM_ALTERA_NIOS2 Machine = 113 /* Altera Nios II soft-core processor */ EM_CRX Machine = 114 /* National Semiconductor CompactRISC CRX microprocessor */ EM_XGATE Machine = 115 /* Motorola XGATE embedded processor */ EM_C166 Machine = 116 /* Infineon C16x/XC16x processor */ EM_M16C Machine = 117 /* Renesas M16C series microprocessors */ EM_DSPIC30F Machine = 118 /* Microchip Technology dsPIC30F Digital Signal Controller */ EM_CE Machine = 119 /* Freescale Communication Engine RISC core */ EM_M32C Machine = 120 /* Renesas M32C series microprocessors */ EM_TSK3000 Machine = 131 /* Altium TSK3000 core */ EM_RS08 Machine = 132 /* Freescale RS08 embedded processor */ EM_SHARC Machine = 133 /* Analog Devices SHARC family of 32-bit DSP processors */ EM_ECOG2 Machine = 134 /* Cyan Technology eCOG2 microprocessor */ EM_SCORE7 Machine = 135 /* Sunplus S+core7 RISC processor */ EM_DSP24 Machine = 136 /* New Japan Radio (NJR) 24-bit DSP Processor */ EM_VIDEOCORE3 Machine = 137 /* Broadcom VideoCore III processor */ EM_LATTICEMICO32 Machine = 138 /* RISC processor for Lattice FPGA architecture */ EM_SE_C17 Machine = 139 /* Seiko Epson C17 family */ EM_TI_C6000 Machine = 140 /* The Texas Instruments TMS320C6000 DSP family */ EM_TI_C2000 Machine = 141 /* The Texas Instruments TMS320C2000 DSP family */ EM_TI_C5500 Machine = 142 /* The Texas Instruments TMS320C55x DSP family */ EM_TI_ARP32 Machine = 143 /* Texas Instruments Application Specific RISC Processor, 32bit fetch */ EM_TI_PRU Machine = 144 /* Texas Instruments Programmable Realtime Unit */ EM_MMDSP_PLUS Machine = 160 /* STMicroelectronics 64bit VLIW Data Signal Processor */ EM_CYPRESS_M8C Machine = 161 /* Cypress M8C microprocessor */ EM_R32C Machine = 162 /* Renesas R32C series microprocessors */ EM_TRIMEDIA Machine = 163 /* NXP Semiconductors TriMedia architecture family */ EM_QDSP6 Machine = 164 /* QUALCOMM DSP6 Processor */ EM_8051 Machine = 165 /* Intel 8051 and variants */ EM_STXP7X Machine = 166 /* STMicroelectronics STxP7x family of configurable and extensible RISC processors */ EM_NDS32 Machine = 167 /* Andes Technology compact code size embedded RISC processor family */ EM_ECOG1 Machine = 168 /* Cyan Technology eCOG1X family */ EM_ECOG1X Machine = 168 /* Cyan Technology eCOG1X family */ EM_MAXQ30 Machine = 169 /* Dallas Semiconductor MAXQ30 Core Micro-controllers */ EM_XIMO16 Machine = 170 /* New Japan Radio (NJR) 16-bit DSP Processor */ EM_MANIK Machine = 171 /* M2000 Reconfigurable RISC Microprocessor */ EM_CRAYNV2 Machine = 172 /* Cray Inc. 
NV2 vector architecture */ EM_RX Machine = 173 /* Renesas RX family */ EM_METAG Machine = 174 /* Imagination Technologies META processor architecture */ EM_MCST_ELBRUS Machine = 175 /* MCST Elbrus general purpose hardware architecture */ EM_ECOG16 Machine = 176 /* Cyan Technology eCOG16 family */ EM_CR16 Machine = 177 /* National Semiconductor CompactRISC CR16 16-bit microprocessor */ EM_ETPU Machine = 178 /* Freescale Extended Time Processing Unit */ EM_SLE9X Machine = 179 /* Infineon Technologies SLE9X core */ EM_L10M Machine = 180 /* Intel L10M */ EM_K10M Machine = 181 /* Intel K10M */ EM_AARCH64 Machine = 183 /* ARM 64-bit Architecture (AArch64) */ EM_AVR32 Machine = 185 /* Atmel Corporation 32-bit microprocessor family */ EM_STM8 Machine = 186 /* STMicroeletronics STM8 8-bit microcontroller */ EM_TILE64 Machine = 187 /* Tilera TILE64 multicore architecture family */ EM_TILEPRO Machine = 188 /* Tilera TILEPro multicore architecture family */ EM_MICROBLAZE Machine = 189 /* Xilinx MicroBlaze 32-bit RISC soft processor core */ EM_CUDA Machine = 190 /* NVIDIA CUDA architecture */ EM_TILEGX Machine = 191 /* Tilera TILE-Gx multicore architecture family */ EM_CLOUDSHIELD Machine = 192 /* CloudShield architecture family */ EM_COREA_1ST Machine = 193 /* KIPO-KAIST Core-A 1st generation processor family */ EM_COREA_2ND Machine = 194 /* KIPO-KAIST Core-A 2nd generation processor family */ EM_ARC_COMPACT2 Machine = 195 /* Synopsys ARCompact V2 */ EM_OPEN8 Machine = 196 /* Open8 8-bit RISC soft processor core */ EM_RL78 Machine = 197 /* Renesas RL78 family */ EM_VIDEOCORE5 Machine = 198 /* Broadcom VideoCore V processor */ EM_78KOR Machine = 199 /* Renesas 78KOR family */ EM_56800EX Machine = 200 /* Freescale 56800EX Digital Signal Controller (DSC) */ EM_BA1 Machine = 201 /* Beyond BA1 CPU architecture */ EM_BA2 Machine = 202 /* Beyond BA2 CPU architecture */ EM_XCORE Machine = 203 /* XMOS xCORE processor family */ EM_MCHP_PIC Machine = 204 /* Microchip 8-bit PIC(r) family */ EM_INTEL205 Machine = 205 /* Reserved by Intel */ EM_INTEL206 Machine = 206 /* Reserved by Intel */ EM_INTEL207 Machine = 207 /* Reserved by Intel */ EM_INTEL208 Machine = 208 /* Reserved by Intel */ EM_INTEL209 Machine = 209 /* Reserved by Intel */ EM_KM32 Machine = 210 /* KM211 KM32 32-bit processor */ EM_KMX32 Machine = 211 /* KM211 KMX32 32-bit processor */ EM_KMX16 Machine = 212 /* KM211 KMX16 16-bit processor */ EM_KMX8 Machine = 213 /* KM211 KMX8 8-bit processor */ EM_KVARC Machine = 214 /* KM211 KVARC processor */ EM_CDP Machine = 215 /* Paneve CDP architecture family */ EM_COGE Machine = 216 /* Cognitive Smart Memory Processor */ EM_COOL Machine = 217 /* Bluechip Systems CoolEngine */ EM_NORC Machine = 218 /* Nanoradio Optimized RISC */ EM_CSR_KALIMBA Machine = 219 /* CSR Kalimba architecture family */ EM_Z80 Machine = 220 /* Zilog Z80 */ EM_VISIUM Machine = 221 /* Controls and Data Services VISIUMcore processor */ EM_FT32 Machine = 222 /* FTDI Chip FT32 high performance 32-bit RISC architecture */ EM_MOXIE Machine = 223 /* Moxie processor family */ EM_AMDGPU Machine = 224 /* AMD GPU architecture */ EM_RISCV Machine = 243 /* RISC-V */ EM_LANAI Machine = 244 /* Lanai 32-bit processor */ EM_BPF Machine = 247 /* Linux BPF – in-kernel virtual machine */ EM_LOONGARCH Machine = 258 /* LoongArch */ /* Non-standard or deprecated. */ EM_486 Machine = 6 /* Intel i486. */ EM_MIPS_RS4_BE Machine = 10 /* MIPS R4000 Big-Endian */ EM_ALPHA_STD Machine = 41 /* Digital Alpha (standard value). 
*/ EM_ALPHA Machine = 0x9026 /* Alpha (written in the absence of an ABI) */ ) ``` ### func (Machine) GoString ``` func (i Machine) GoString() string ``` ### func (Machine) String ``` func (i Machine) String() string ``` type NType ---------- NType values; used in core files. ``` type NType int ``` ``` const ( NT_PRSTATUS NType = 1 /* Process status. */ NT_FPREGSET NType = 2 /* Floating point registers. */ NT_PRPSINFO NType = 3 /* Process state info. */ ) ``` ### func (NType) GoString ``` func (i NType) GoString() string ``` ### func (NType) String ``` func (i NType) String() string ``` type OSABI ---------- OSABI is found in Header.Ident[EI\_OSABI] and Header.OSABI. ``` type OSABI byte ``` ``` const ( ELFOSABI_NONE OSABI = 0 /* UNIX System V ABI */ ELFOSABI_HPUX OSABI = 1 /* HP-UX operating system */ ELFOSABI_NETBSD OSABI = 2 /* NetBSD */ ELFOSABI_LINUX OSABI = 3 /* Linux */ ELFOSABI_HURD OSABI = 4 /* Hurd */ ELFOSABI_86OPEN OSABI = 5 /* 86Open common IA32 ABI */ ELFOSABI_SOLARIS OSABI = 6 /* Solaris */ ELFOSABI_AIX OSABI = 7 /* AIX */ ELFOSABI_IRIX OSABI = 8 /* IRIX */ ELFOSABI_FREEBSD OSABI = 9 /* FreeBSD */ ELFOSABI_TRU64 OSABI = 10 /* TRU64 UNIX */ ELFOSABI_MODESTO OSABI = 11 /* Novell Modesto */ ELFOSABI_OPENBSD OSABI = 12 /* OpenBSD */ ELFOSABI_OPENVMS OSABI = 13 /* Open VMS */ ELFOSABI_NSK OSABI = 14 /* HP Non-Stop Kernel */ ELFOSABI_AROS OSABI = 15 /* Amiga Research OS */ ELFOSABI_FENIXOS OSABI = 16 /* The FenixOS highly scalable multi-core OS */ ELFOSABI_CLOUDABI OSABI = 17 /* Nuxi CloudABI */ ELFOSABI_ARM OSABI = 97 /* ARM */ ELFOSABI_STANDALONE OSABI = 255 /* Standalone (embedded) application */ ) ``` ### func (OSABI) GoString ``` func (i OSABI) GoString() string ``` ### func (OSABI) String ``` func (i OSABI) String() string ``` type Prog --------- A Prog represents a single ELF program header in an ELF binary. ``` type Prog struct { ProgHeader // Embed ReaderAt for ReadAt method. // Do not embed SectionReader directly // to avoid having Read and Seek. // If a client wants Read and Seek it must use // Open() to avoid fighting over the seek offset // with other clients. io.ReaderAt // contains filtered or unexported fields } ``` ### func (\*Prog) Open ``` func (p *Prog) Open() io.ReadSeeker ``` Open returns a new ReadSeeker reading the ELF program body. type Prog32 ----------- ELF32 Program header. ``` type Prog32 struct { Type uint32 /* Entry type. */ Off uint32 /* File offset of contents. */ Vaddr uint32 /* Virtual address in memory image. */ Paddr uint32 /* Physical address (not used). */ Filesz uint32 /* Size of contents in file. */ Memsz uint32 /* Size of contents in memory. */ Flags uint32 /* Access permission flags. */ Align uint32 /* Alignment in memory and file. */ } ``` type Prog64 ----------- ELF64 Program header. ``` type Prog64 struct { Type uint32 /* Entry type. */ Flags uint32 /* Access permission flags. */ Off uint64 /* File offset of contents. */ Vaddr uint64 /* Virtual address in memory image. */ Paddr uint64 /* Physical address (not used). */ Filesz uint64 /* Size of contents in file. */ Memsz uint64 /* Size of contents in memory. */ Align uint64 /* Alignment in memory and file. */ } ``` type ProgFlag ------------- Prog.Flag ``` type ProgFlag uint32 ``` ``` const ( PF_X ProgFlag = 0x1 /* Executable. */ PF_W ProgFlag = 0x2 /* Writable. */ PF_R ProgFlag = 0x4 /* Readable. */ PF_MASKOS ProgFlag = 0x0ff00000 /* Operating system-specific. */ PF_MASKPROC ProgFlag = 0xf0000000 /* Processor-specific. 
*/ ) ``` ### func (ProgFlag) GoString ``` func (i ProgFlag) GoString() string ``` ### func (ProgFlag) String ``` func (i ProgFlag) String() string ``` type ProgHeader --------------- A ProgHeader represents a single ELF program header. ``` type ProgHeader struct { Type ProgType Flags ProgFlag Off uint64 Vaddr uint64 Paddr uint64 Filesz uint64 Memsz uint64 Align uint64 } ``` type ProgType ------------- Prog.Type ``` type ProgType int ``` ``` const ( PT_NULL ProgType = 0 /* Unused entry. */ PT_LOAD ProgType = 1 /* Loadable segment. */ PT_DYNAMIC ProgType = 2 /* Dynamic linking information segment. */ PT_INTERP ProgType = 3 /* Pathname of interpreter. */ PT_NOTE ProgType = 4 /* Auxiliary information. */ PT_SHLIB ProgType = 5 /* Reserved (not used). */ PT_PHDR ProgType = 6 /* Location of program header itself. */ PT_TLS ProgType = 7 /* Thread local storage segment */ PT_LOOS ProgType = 0x60000000 /* First OS-specific. */ PT_GNU_EH_FRAME ProgType = 0x6474e550 /* Frame unwind information */ PT_GNU_STACK ProgType = 0x6474e551 /* Stack flags */ PT_GNU_RELRO ProgType = 0x6474e552 /* Read only after relocs */ PT_GNU_PROPERTY ProgType = 0x6474e553 /* GNU property */ PT_GNU_MBIND_LO ProgType = 0x6474e555 /* Mbind segments start */ PT_GNU_MBIND_HI ProgType = 0x6474f554 /* Mbind segments finish */ PT_PAX_FLAGS ProgType = 0x65041580 /* PAX flags */ PT_OPENBSD_RANDOMIZE ProgType = 0x65a3dbe6 /* Random data */ PT_OPENBSD_WXNEEDED ProgType = 0x65a3dbe7 /* W^X violations */ PT_OPENBSD_BOOTDATA ProgType = 0x65a41be6 /* Boot arguments */ PT_SUNW_EH_FRAME ProgType = 0x6474e550 /* Frame unwind information */ PT_SUNWSTACK ProgType = 0x6ffffffb /* Stack segment */ PT_HIOS ProgType = 0x6fffffff /* Last OS-specific. */ PT_LOPROC ProgType = 0x70000000 /* First processor-specific type. */ PT_ARM_ARCHEXT ProgType = 0x70000000 /* Architecture compatibility */ PT_ARM_EXIDX ProgType = 0x70000001 /* Exception unwind tables */ PT_AARCH64_ARCHEXT ProgType = 0x70000000 /* Architecture compatibility */ PT_AARCH64_UNWIND ProgType = 0x70000001 /* Exception unwind tables */ PT_MIPS_REGINFO ProgType = 0x70000000 /* Register usage */ PT_MIPS_RTPROC ProgType = 0x70000001 /* Runtime procedures */ PT_MIPS_OPTIONS ProgType = 0x70000002 /* Options */ PT_MIPS_ABIFLAGS ProgType = 0x70000003 /* ABI flags */ PT_S390_PGSTE ProgType = 0x70000000 /* 4k page table size */ PT_HIPROC ProgType = 0x7fffffff /* Last processor-specific type. */ ) ``` ### func (ProgType) GoString ``` func (i ProgType) GoString() string ``` ### func (ProgType) String ``` func (i ProgType) String() string ``` type R\_386 ----------- Relocation types for 386. ``` type R_386 int ``` ``` const ( R_386_NONE R_386 = 0 /* No relocation. */ R_386_32 R_386 = 1 /* Add symbol value. */ R_386_PC32 R_386 = 2 /* Add PC-relative symbol value. */ R_386_GOT32 R_386 = 3 /* Add PC-relative GOT offset. */ R_386_PLT32 R_386 = 4 /* Add PC-relative PLT offset. */ R_386_COPY R_386 = 5 /* Copy data from shared object. */ R_386_GLOB_DAT R_386 = 6 /* Set GOT entry to data address. */ R_386_JMP_SLOT R_386 = 7 /* Set GOT entry to code address. */ R_386_RELATIVE R_386 = 8 /* Add load address of shared object. */ R_386_GOTOFF R_386 = 9 /* Add GOT-relative symbol address. */ R_386_GOTPC R_386 = 10 /* Add PC-relative GOT table address. 
*/ R_386_32PLT R_386 = 11 R_386_TLS_TPOFF R_386 = 14 /* Negative offset in static TLS block */ R_386_TLS_IE R_386 = 15 /* Absolute address of GOT for -ve static TLS */ R_386_TLS_GOTIE R_386 = 16 /* GOT entry for negative static TLS block */ R_386_TLS_LE R_386 = 17 /* Negative offset relative to static TLS */ R_386_TLS_GD R_386 = 18 /* 32 bit offset to GOT (index,off) pair */ R_386_TLS_LDM R_386 = 19 /* 32 bit offset to GOT (index,zero) pair */ R_386_16 R_386 = 20 R_386_PC16 R_386 = 21 R_386_8 R_386 = 22 R_386_PC8 R_386 = 23 R_386_TLS_GD_32 R_386 = 24 /* 32 bit offset to GOT (index,off) pair */ R_386_TLS_GD_PUSH R_386 = 25 /* pushl instruction for Sun ABI GD sequence */ R_386_TLS_GD_CALL R_386 = 26 /* call instruction for Sun ABI GD sequence */ R_386_TLS_GD_POP R_386 = 27 /* popl instruction for Sun ABI GD sequence */ R_386_TLS_LDM_32 R_386 = 28 /* 32 bit offset to GOT (index,zero) pair */ R_386_TLS_LDM_PUSH R_386 = 29 /* pushl instruction for Sun ABI LD sequence */ R_386_TLS_LDM_CALL R_386 = 30 /* call instruction for Sun ABI LD sequence */ R_386_TLS_LDM_POP R_386 = 31 /* popl instruction for Sun ABI LD sequence */ R_386_TLS_LDO_32 R_386 = 32 /* 32 bit offset from start of TLS block */ R_386_TLS_IE_32 R_386 = 33 /* 32 bit offset to GOT static TLS offset entry */ R_386_TLS_LE_32 R_386 = 34 /* 32 bit offset within static TLS block */ R_386_TLS_DTPMOD32 R_386 = 35 /* GOT entry containing TLS index */ R_386_TLS_DTPOFF32 R_386 = 36 /* GOT entry containing TLS offset */ R_386_TLS_TPOFF32 R_386 = 37 /* GOT entry of -ve static TLS offset */ R_386_SIZE32 R_386 = 38 R_386_TLS_GOTDESC R_386 = 39 R_386_TLS_DESC_CALL R_386 = 40 R_386_TLS_DESC R_386 = 41 R_386_IRELATIVE R_386 = 42 R_386_GOT32X R_386 = 43 ) ``` ### func (R\_386) GoString ``` func (i R_386) GoString() string ``` ### func (R\_386) String ``` func (i R_386) String() string ``` type R\_390 1.7 --------------- Relocation types for s390x processors. 
``` type R_390 int ``` ``` const ( R_390_NONE R_390 = 0 R_390_8 R_390 = 1 R_390_12 R_390 = 2 R_390_16 R_390 = 3 R_390_32 R_390 = 4 R_390_PC32 R_390 = 5 R_390_GOT12 R_390 = 6 R_390_GOT32 R_390 = 7 R_390_PLT32 R_390 = 8 R_390_COPY R_390 = 9 R_390_GLOB_DAT R_390 = 10 R_390_JMP_SLOT R_390 = 11 R_390_RELATIVE R_390 = 12 R_390_GOTOFF R_390 = 13 R_390_GOTPC R_390 = 14 R_390_GOT16 R_390 = 15 R_390_PC16 R_390 = 16 R_390_PC16DBL R_390 = 17 R_390_PLT16DBL R_390 = 18 R_390_PC32DBL R_390 = 19 R_390_PLT32DBL R_390 = 20 R_390_GOTPCDBL R_390 = 21 R_390_64 R_390 = 22 R_390_PC64 R_390 = 23 R_390_GOT64 R_390 = 24 R_390_PLT64 R_390 = 25 R_390_GOTENT R_390 = 26 R_390_GOTOFF16 R_390 = 27 R_390_GOTOFF64 R_390 = 28 R_390_GOTPLT12 R_390 = 29 R_390_GOTPLT16 R_390 = 30 R_390_GOTPLT32 R_390 = 31 R_390_GOTPLT64 R_390 = 32 R_390_GOTPLTENT R_390 = 33 R_390_GOTPLTOFF16 R_390 = 34 R_390_GOTPLTOFF32 R_390 = 35 R_390_GOTPLTOFF64 R_390 = 36 R_390_TLS_LOAD R_390 = 37 R_390_TLS_GDCALL R_390 = 38 R_390_TLS_LDCALL R_390 = 39 R_390_TLS_GD32 R_390 = 40 R_390_TLS_GD64 R_390 = 41 R_390_TLS_GOTIE12 R_390 = 42 R_390_TLS_GOTIE32 R_390 = 43 R_390_TLS_GOTIE64 R_390 = 44 R_390_TLS_LDM32 R_390 = 45 R_390_TLS_LDM64 R_390 = 46 R_390_TLS_IE32 R_390 = 47 R_390_TLS_IE64 R_390 = 48 R_390_TLS_IEENT R_390 = 49 R_390_TLS_LE32 R_390 = 50 R_390_TLS_LE64 R_390 = 51 R_390_TLS_LDO32 R_390 = 52 R_390_TLS_LDO64 R_390 = 53 R_390_TLS_DTPMOD R_390 = 54 R_390_TLS_DTPOFF R_390 = 55 R_390_TLS_TPOFF R_390 = 56 R_390_20 R_390 = 57 R_390_GOT20 R_390 = 58 R_390_GOTPLT20 R_390 = 59 R_390_TLS_GOTIE20 R_390 = 60 ) ``` ### func (R\_390) GoString 1.7 ``` func (i R_390) GoString() string ``` ### func (R\_390) String 1.7 ``` func (i R_390) String() string ``` type R\_AARCH64 1.4 ------------------- Relocation types for AArch64 (aka arm64) ``` type R_AARCH64 int ``` ``` const ( R_AARCH64_NONE R_AARCH64 = 0 R_AARCH64_P32_ABS32 R_AARCH64 = 1 R_AARCH64_P32_ABS16 R_AARCH64 = 2 R_AARCH64_P32_PREL32 R_AARCH64 = 3 R_AARCH64_P32_PREL16 R_AARCH64 = 4 R_AARCH64_P32_MOVW_UABS_G0 R_AARCH64 = 5 R_AARCH64_P32_MOVW_UABS_G0_NC R_AARCH64 = 6 R_AARCH64_P32_MOVW_UABS_G1 R_AARCH64 = 7 R_AARCH64_P32_MOVW_SABS_G0 R_AARCH64 = 8 R_AARCH64_P32_LD_PREL_LO19 R_AARCH64 = 9 R_AARCH64_P32_ADR_PREL_LO21 R_AARCH64 = 10 R_AARCH64_P32_ADR_PREL_PG_HI21 R_AARCH64 = 11 R_AARCH64_P32_ADD_ABS_LO12_NC R_AARCH64 = 12 R_AARCH64_P32_LDST8_ABS_LO12_NC R_AARCH64 = 13 R_AARCH64_P32_LDST16_ABS_LO12_NC R_AARCH64 = 14 R_AARCH64_P32_LDST32_ABS_LO12_NC R_AARCH64 = 15 R_AARCH64_P32_LDST64_ABS_LO12_NC R_AARCH64 = 16 R_AARCH64_P32_LDST128_ABS_LO12_NC R_AARCH64 = 17 R_AARCH64_P32_TSTBR14 R_AARCH64 = 18 R_AARCH64_P32_CONDBR19 R_AARCH64 = 19 R_AARCH64_P32_JUMP26 R_AARCH64 = 20 R_AARCH64_P32_CALL26 R_AARCH64 = 21 R_AARCH64_P32_GOT_LD_PREL19 R_AARCH64 = 25 R_AARCH64_P32_ADR_GOT_PAGE R_AARCH64 = 26 R_AARCH64_P32_LD32_GOT_LO12_NC R_AARCH64 = 27 R_AARCH64_P32_TLSGD_ADR_PAGE21 R_AARCH64 = 81 R_AARCH64_P32_TLSGD_ADD_LO12_NC R_AARCH64 = 82 R_AARCH64_P32_TLSIE_ADR_GOTTPREL_PAGE21 R_AARCH64 = 103 R_AARCH64_P32_TLSIE_LD32_GOTTPREL_LO12_NC R_AARCH64 = 104 R_AARCH64_P32_TLSIE_LD_GOTTPREL_PREL19 R_AARCH64 = 105 R_AARCH64_P32_TLSLE_MOVW_TPREL_G1 R_AARCH64 = 106 R_AARCH64_P32_TLSLE_MOVW_TPREL_G0 R_AARCH64 = 107 R_AARCH64_P32_TLSLE_MOVW_TPREL_G0_NC R_AARCH64 = 108 R_AARCH64_P32_TLSLE_ADD_TPREL_HI12 R_AARCH64 = 109 R_AARCH64_P32_TLSLE_ADD_TPREL_LO12 R_AARCH64 = 110 R_AARCH64_P32_TLSLE_ADD_TPREL_LO12_NC R_AARCH64 = 111 R_AARCH64_P32_TLSDESC_LD_PREL19 R_AARCH64 = 122 R_AARCH64_P32_TLSDESC_ADR_PREL21 R_AARCH64 = 123 R_AARCH64_P32_TLSDESC_ADR_PAGE21 
R_AARCH64 = 124 R_AARCH64_P32_TLSDESC_LD32_LO12_NC R_AARCH64 = 125 R_AARCH64_P32_TLSDESC_ADD_LO12_NC R_AARCH64 = 126 R_AARCH64_P32_TLSDESC_CALL R_AARCH64 = 127 R_AARCH64_P32_COPY R_AARCH64 = 180 R_AARCH64_P32_GLOB_DAT R_AARCH64 = 181 R_AARCH64_P32_JUMP_SLOT R_AARCH64 = 182 R_AARCH64_P32_RELATIVE R_AARCH64 = 183 R_AARCH64_P32_TLS_DTPMOD R_AARCH64 = 184 R_AARCH64_P32_TLS_DTPREL R_AARCH64 = 185 R_AARCH64_P32_TLS_TPREL R_AARCH64 = 186 R_AARCH64_P32_TLSDESC R_AARCH64 = 187 R_AARCH64_P32_IRELATIVE R_AARCH64 = 188 R_AARCH64_NULL R_AARCH64 = 256 R_AARCH64_ABS64 R_AARCH64 = 257 R_AARCH64_ABS32 R_AARCH64 = 258 R_AARCH64_ABS16 R_AARCH64 = 259 R_AARCH64_PREL64 R_AARCH64 = 260 R_AARCH64_PREL32 R_AARCH64 = 261 R_AARCH64_PREL16 R_AARCH64 = 262 R_AARCH64_MOVW_UABS_G0 R_AARCH64 = 263 R_AARCH64_MOVW_UABS_G0_NC R_AARCH64 = 264 R_AARCH64_MOVW_UABS_G1 R_AARCH64 = 265 R_AARCH64_MOVW_UABS_G1_NC R_AARCH64 = 266 R_AARCH64_MOVW_UABS_G2 R_AARCH64 = 267 R_AARCH64_MOVW_UABS_G2_NC R_AARCH64 = 268 R_AARCH64_MOVW_UABS_G3 R_AARCH64 = 269 R_AARCH64_MOVW_SABS_G0 R_AARCH64 = 270 R_AARCH64_MOVW_SABS_G1 R_AARCH64 = 271 R_AARCH64_MOVW_SABS_G2 R_AARCH64 = 272 R_AARCH64_LD_PREL_LO19 R_AARCH64 = 273 R_AARCH64_ADR_PREL_LO21 R_AARCH64 = 274 R_AARCH64_ADR_PREL_PG_HI21 R_AARCH64 = 275 R_AARCH64_ADR_PREL_PG_HI21_NC R_AARCH64 = 276 R_AARCH64_ADD_ABS_LO12_NC R_AARCH64 = 277 R_AARCH64_LDST8_ABS_LO12_NC R_AARCH64 = 278 R_AARCH64_TSTBR14 R_AARCH64 = 279 R_AARCH64_CONDBR19 R_AARCH64 = 280 R_AARCH64_JUMP26 R_AARCH64 = 282 R_AARCH64_CALL26 R_AARCH64 = 283 R_AARCH64_LDST16_ABS_LO12_NC R_AARCH64 = 284 R_AARCH64_LDST32_ABS_LO12_NC R_AARCH64 = 285 R_AARCH64_LDST64_ABS_LO12_NC R_AARCH64 = 286 R_AARCH64_LDST128_ABS_LO12_NC R_AARCH64 = 299 R_AARCH64_GOT_LD_PREL19 R_AARCH64 = 309 R_AARCH64_LD64_GOTOFF_LO15 R_AARCH64 = 310 R_AARCH64_ADR_GOT_PAGE R_AARCH64 = 311 R_AARCH64_LD64_GOT_LO12_NC R_AARCH64 = 312 R_AARCH64_LD64_GOTPAGE_LO15 R_AARCH64 = 313 R_AARCH64_TLSGD_ADR_PREL21 R_AARCH64 = 512 R_AARCH64_TLSGD_ADR_PAGE21 R_AARCH64 = 513 R_AARCH64_TLSGD_ADD_LO12_NC R_AARCH64 = 514 R_AARCH64_TLSGD_MOVW_G1 R_AARCH64 = 515 R_AARCH64_TLSGD_MOVW_G0_NC R_AARCH64 = 516 R_AARCH64_TLSLD_ADR_PREL21 R_AARCH64 = 517 R_AARCH64_TLSLD_ADR_PAGE21 R_AARCH64 = 518 R_AARCH64_TLSIE_MOVW_GOTTPREL_G1 R_AARCH64 = 539 R_AARCH64_TLSIE_MOVW_GOTTPREL_G0_NC R_AARCH64 = 540 R_AARCH64_TLSIE_ADR_GOTTPREL_PAGE21 R_AARCH64 = 541 R_AARCH64_TLSIE_LD64_GOTTPREL_LO12_NC R_AARCH64 = 542 R_AARCH64_TLSIE_LD_GOTTPREL_PREL19 R_AARCH64 = 543 R_AARCH64_TLSLE_MOVW_TPREL_G2 R_AARCH64 = 544 R_AARCH64_TLSLE_MOVW_TPREL_G1 R_AARCH64 = 545 R_AARCH64_TLSLE_MOVW_TPREL_G1_NC R_AARCH64 = 546 R_AARCH64_TLSLE_MOVW_TPREL_G0 R_AARCH64 = 547 R_AARCH64_TLSLE_MOVW_TPREL_G0_NC R_AARCH64 = 548 R_AARCH64_TLSLE_ADD_TPREL_HI12 R_AARCH64 = 549 R_AARCH64_TLSLE_ADD_TPREL_LO12 R_AARCH64 = 550 R_AARCH64_TLSLE_ADD_TPREL_LO12_NC R_AARCH64 = 551 R_AARCH64_TLSDESC_LD_PREL19 R_AARCH64 = 560 R_AARCH64_TLSDESC_ADR_PREL21 R_AARCH64 = 561 R_AARCH64_TLSDESC_ADR_PAGE21 R_AARCH64 = 562 R_AARCH64_TLSDESC_LD64_LO12_NC R_AARCH64 = 563 R_AARCH64_TLSDESC_ADD_LO12_NC R_AARCH64 = 564 R_AARCH64_TLSDESC_OFF_G1 R_AARCH64 = 565 R_AARCH64_TLSDESC_OFF_G0_NC R_AARCH64 = 566 R_AARCH64_TLSDESC_LDR R_AARCH64 = 567 R_AARCH64_TLSDESC_ADD R_AARCH64 = 568 R_AARCH64_TLSDESC_CALL R_AARCH64 = 569 R_AARCH64_TLSLE_LDST128_TPREL_LO12 R_AARCH64 = 570 R_AARCH64_TLSLE_LDST128_TPREL_LO12_NC R_AARCH64 = 571 R_AARCH64_TLSLD_LDST128_DTPREL_LO12 R_AARCH64 = 572 R_AARCH64_TLSLD_LDST128_DTPREL_LO12_NC R_AARCH64 = 573 R_AARCH64_COPY R_AARCH64 = 1024 R_AARCH64_GLOB_DAT R_AARCH64 
= 1025 R_AARCH64_JUMP_SLOT R_AARCH64 = 1026 R_AARCH64_RELATIVE R_AARCH64 = 1027 R_AARCH64_TLS_DTPMOD64 R_AARCH64 = 1028 R_AARCH64_TLS_DTPREL64 R_AARCH64 = 1029 R_AARCH64_TLS_TPREL64 R_AARCH64 = 1030 R_AARCH64_TLSDESC R_AARCH64 = 1031 R_AARCH64_IRELATIVE R_AARCH64 = 1032 ) ``` ### func (R\_AARCH64) GoString 1.4 ``` func (i R_AARCH64) GoString() string ``` ### func (R\_AARCH64) String 1.4 ``` func (i R_AARCH64) String() string ``` type R\_ALPHA ------------- Relocation types for Alpha. ``` type R_ALPHA int ``` ``` const ( R_ALPHA_NONE R_ALPHA = 0 /* No reloc */ R_ALPHA_REFLONG R_ALPHA = 1 /* Direct 32 bit */ R_ALPHA_REFQUAD R_ALPHA = 2 /* Direct 64 bit */ R_ALPHA_GPREL32 R_ALPHA = 3 /* GP relative 32 bit */ R_ALPHA_LITERAL R_ALPHA = 4 /* GP relative 16 bit w/optimization */ R_ALPHA_LITUSE R_ALPHA = 5 /* Optimization hint for LITERAL */ R_ALPHA_GPDISP R_ALPHA = 6 /* Add displacement to GP */ R_ALPHA_BRADDR R_ALPHA = 7 /* PC+4 relative 23 bit shifted */ R_ALPHA_HINT R_ALPHA = 8 /* PC+4 relative 16 bit shifted */ R_ALPHA_SREL16 R_ALPHA = 9 /* PC relative 16 bit */ R_ALPHA_SREL32 R_ALPHA = 10 /* PC relative 32 bit */ R_ALPHA_SREL64 R_ALPHA = 11 /* PC relative 64 bit */ R_ALPHA_OP_PUSH R_ALPHA = 12 /* OP stack push */ R_ALPHA_OP_STORE R_ALPHA = 13 /* OP stack pop and store */ R_ALPHA_OP_PSUB R_ALPHA = 14 /* OP stack subtract */ R_ALPHA_OP_PRSHIFT R_ALPHA = 15 /* OP stack right shift */ R_ALPHA_GPVALUE R_ALPHA = 16 R_ALPHA_GPRELHIGH R_ALPHA = 17 R_ALPHA_GPRELLOW R_ALPHA = 18 R_ALPHA_IMMED_GP_16 R_ALPHA = 19 R_ALPHA_IMMED_GP_HI32 R_ALPHA = 20 R_ALPHA_IMMED_SCN_HI32 R_ALPHA = 21 R_ALPHA_IMMED_BR_HI32 R_ALPHA = 22 R_ALPHA_IMMED_LO32 R_ALPHA = 23 R_ALPHA_COPY R_ALPHA = 24 /* Copy symbol at runtime */ R_ALPHA_GLOB_DAT R_ALPHA = 25 /* Create GOT entry */ R_ALPHA_JMP_SLOT R_ALPHA = 26 /* Create PLT entry */ R_ALPHA_RELATIVE R_ALPHA = 27 /* Adjust by program base */ ) ``` ### func (R\_ALPHA) GoString ``` func (i R_ALPHA) GoString() string ``` ### func (R\_ALPHA) String ``` func (i R_ALPHA) String() string ``` type R\_ARM ----------- Relocation types for ARM. ``` type R_ARM int ``` ``` const ( R_ARM_NONE R_ARM = 0 /* No relocation. */ R_ARM_PC24 R_ARM = 1 R_ARM_ABS32 R_ARM = 2 R_ARM_REL32 R_ARM = 3 R_ARM_PC13 R_ARM = 4 R_ARM_ABS16 R_ARM = 5 R_ARM_ABS12 R_ARM = 6 R_ARM_THM_ABS5 R_ARM = 7 R_ARM_ABS8 R_ARM = 8 R_ARM_SBREL32 R_ARM = 9 R_ARM_THM_PC22 R_ARM = 10 R_ARM_THM_PC8 R_ARM = 11 R_ARM_AMP_VCALL9 R_ARM = 12 R_ARM_SWI24 R_ARM = 13 R_ARM_THM_SWI8 R_ARM = 14 R_ARM_XPC25 R_ARM = 15 R_ARM_THM_XPC22 R_ARM = 16 R_ARM_TLS_DTPMOD32 R_ARM = 17 R_ARM_TLS_DTPOFF32 R_ARM = 18 R_ARM_TLS_TPOFF32 R_ARM = 19 R_ARM_COPY R_ARM = 20 /* Copy data from shared object. */ R_ARM_GLOB_DAT R_ARM = 21 /* Set GOT entry to data address. */ R_ARM_JUMP_SLOT R_ARM = 22 /* Set GOT entry to code address. */ R_ARM_RELATIVE R_ARM = 23 /* Add load address of shared object. */ R_ARM_GOTOFF R_ARM = 24 /* Add GOT-relative symbol address. */ R_ARM_GOTPC R_ARM = 25 /* Add PC-relative GOT table address. */ R_ARM_GOT32 R_ARM = 26 /* Add PC-relative GOT offset. */ R_ARM_PLT32 R_ARM = 27 /* Add PC-relative PLT offset. 
*/ R_ARM_CALL R_ARM = 28 R_ARM_JUMP24 R_ARM = 29 R_ARM_THM_JUMP24 R_ARM = 30 R_ARM_BASE_ABS R_ARM = 31 R_ARM_ALU_PCREL_7_0 R_ARM = 32 R_ARM_ALU_PCREL_15_8 R_ARM = 33 R_ARM_ALU_PCREL_23_15 R_ARM = 34 R_ARM_LDR_SBREL_11_10_NC R_ARM = 35 R_ARM_ALU_SBREL_19_12_NC R_ARM = 36 R_ARM_ALU_SBREL_27_20_CK R_ARM = 37 R_ARM_TARGET1 R_ARM = 38 R_ARM_SBREL31 R_ARM = 39 R_ARM_V4BX R_ARM = 40 R_ARM_TARGET2 R_ARM = 41 R_ARM_PREL31 R_ARM = 42 R_ARM_MOVW_ABS_NC R_ARM = 43 R_ARM_MOVT_ABS R_ARM = 44 R_ARM_MOVW_PREL_NC R_ARM = 45 R_ARM_MOVT_PREL R_ARM = 46 R_ARM_THM_MOVW_ABS_NC R_ARM = 47 R_ARM_THM_MOVT_ABS R_ARM = 48 R_ARM_THM_MOVW_PREL_NC R_ARM = 49 R_ARM_THM_MOVT_PREL R_ARM = 50 R_ARM_THM_JUMP19 R_ARM = 51 R_ARM_THM_JUMP6 R_ARM = 52 R_ARM_THM_ALU_PREL_11_0 R_ARM = 53 R_ARM_THM_PC12 R_ARM = 54 R_ARM_ABS32_NOI R_ARM = 55 R_ARM_REL32_NOI R_ARM = 56 R_ARM_ALU_PC_G0_NC R_ARM = 57 R_ARM_ALU_PC_G0 R_ARM = 58 R_ARM_ALU_PC_G1_NC R_ARM = 59 R_ARM_ALU_PC_G1 R_ARM = 60 R_ARM_ALU_PC_G2 R_ARM = 61 R_ARM_LDR_PC_G1 R_ARM = 62 R_ARM_LDR_PC_G2 R_ARM = 63 R_ARM_LDRS_PC_G0 R_ARM = 64 R_ARM_LDRS_PC_G1 R_ARM = 65 R_ARM_LDRS_PC_G2 R_ARM = 66 R_ARM_LDC_PC_G0 R_ARM = 67 R_ARM_LDC_PC_G1 R_ARM = 68 R_ARM_LDC_PC_G2 R_ARM = 69 R_ARM_ALU_SB_G0_NC R_ARM = 70 R_ARM_ALU_SB_G0 R_ARM = 71 R_ARM_ALU_SB_G1_NC R_ARM = 72 R_ARM_ALU_SB_G1 R_ARM = 73 R_ARM_ALU_SB_G2 R_ARM = 74 R_ARM_LDR_SB_G0 R_ARM = 75 R_ARM_LDR_SB_G1 R_ARM = 76 R_ARM_LDR_SB_G2 R_ARM = 77 R_ARM_LDRS_SB_G0 R_ARM = 78 R_ARM_LDRS_SB_G1 R_ARM = 79 R_ARM_LDRS_SB_G2 R_ARM = 80 R_ARM_LDC_SB_G0 R_ARM = 81 R_ARM_LDC_SB_G1 R_ARM = 82 R_ARM_LDC_SB_G2 R_ARM = 83 R_ARM_MOVW_BREL_NC R_ARM = 84 R_ARM_MOVT_BREL R_ARM = 85 R_ARM_MOVW_BREL R_ARM = 86 R_ARM_THM_MOVW_BREL_NC R_ARM = 87 R_ARM_THM_MOVT_BREL R_ARM = 88 R_ARM_THM_MOVW_BREL R_ARM = 89 R_ARM_TLS_GOTDESC R_ARM = 90 R_ARM_TLS_CALL R_ARM = 91 R_ARM_TLS_DESCSEQ R_ARM = 92 R_ARM_THM_TLS_CALL R_ARM = 93 R_ARM_PLT32_ABS R_ARM = 94 R_ARM_GOT_ABS R_ARM = 95 R_ARM_GOT_PREL R_ARM = 96 R_ARM_GOT_BREL12 R_ARM = 97 R_ARM_GOTOFF12 R_ARM = 98 R_ARM_GOTRELAX R_ARM = 99 R_ARM_GNU_VTENTRY R_ARM = 100 R_ARM_GNU_VTINHERIT R_ARM = 101 R_ARM_THM_JUMP11 R_ARM = 102 R_ARM_THM_JUMP8 R_ARM = 103 R_ARM_TLS_GD32 R_ARM = 104 R_ARM_TLS_LDM32 R_ARM = 105 R_ARM_TLS_LDO32 R_ARM = 106 R_ARM_TLS_IE32 R_ARM = 107 R_ARM_TLS_LE32 R_ARM = 108 R_ARM_TLS_LDO12 R_ARM = 109 R_ARM_TLS_LE12 R_ARM = 110 R_ARM_TLS_IE12GP R_ARM = 111 R_ARM_PRIVATE_0 R_ARM = 112 R_ARM_PRIVATE_1 R_ARM = 113 R_ARM_PRIVATE_2 R_ARM = 114 R_ARM_PRIVATE_3 R_ARM = 115 R_ARM_PRIVATE_4 R_ARM = 116 R_ARM_PRIVATE_5 R_ARM = 117 R_ARM_PRIVATE_6 R_ARM = 118 R_ARM_PRIVATE_7 R_ARM = 119 R_ARM_PRIVATE_8 R_ARM = 120 R_ARM_PRIVATE_9 R_ARM = 121 R_ARM_PRIVATE_10 R_ARM = 122 R_ARM_PRIVATE_11 R_ARM = 123 R_ARM_PRIVATE_12 R_ARM = 124 R_ARM_PRIVATE_13 R_ARM = 125 R_ARM_PRIVATE_14 R_ARM = 126 R_ARM_PRIVATE_15 R_ARM = 127 R_ARM_ME_TOO R_ARM = 128 R_ARM_THM_TLS_DESCSEQ16 R_ARM = 129 R_ARM_THM_TLS_DESCSEQ32 R_ARM = 130 R_ARM_THM_GOT_BREL12 R_ARM = 131 R_ARM_THM_ALU_ABS_G0_NC R_ARM = 132 R_ARM_THM_ALU_ABS_G1_NC R_ARM = 133 R_ARM_THM_ALU_ABS_G2_NC R_ARM = 134 R_ARM_THM_ALU_ABS_G3 R_ARM = 135 R_ARM_IRELATIVE R_ARM = 160 R_ARM_RXPC25 R_ARM = 249 R_ARM_RSBREL32 R_ARM = 250 R_ARM_THM_RPC22 R_ARM = 251 R_ARM_RREL32 R_ARM = 252 R_ARM_RABS32 R_ARM = 253 R_ARM_RPC24 R_ARM = 254 R_ARM_RBASE R_ARM = 255 ) ``` ### func (R\_ARM) GoString ``` func (i R_ARM) GoString() string ``` ### func (R\_ARM) String ``` func (i R_ARM) String() string ``` type R\_LARCH 1.19 ------------------ Relocation types for LoongArch. 
``` type R_LARCH int ``` ``` const ( R_LARCH_NONE R_LARCH = 0 R_LARCH_32 R_LARCH = 1 R_LARCH_64 R_LARCH = 2 R_LARCH_RELATIVE R_LARCH = 3 R_LARCH_COPY R_LARCH = 4 R_LARCH_JUMP_SLOT R_LARCH = 5 R_LARCH_TLS_DTPMOD32 R_LARCH = 6 R_LARCH_TLS_DTPMOD64 R_LARCH = 7 R_LARCH_TLS_DTPREL32 R_LARCH = 8 R_LARCH_TLS_DTPREL64 R_LARCH = 9 R_LARCH_TLS_TPREL32 R_LARCH = 10 R_LARCH_TLS_TPREL64 R_LARCH = 11 R_LARCH_IRELATIVE R_LARCH = 12 R_LARCH_MARK_LA R_LARCH = 20 R_LARCH_MARK_PCREL R_LARCH = 21 R_LARCH_SOP_PUSH_PCREL R_LARCH = 22 R_LARCH_SOP_PUSH_ABSOLUTE R_LARCH = 23 R_LARCH_SOP_PUSH_DUP R_LARCH = 24 R_LARCH_SOP_PUSH_GPREL R_LARCH = 25 R_LARCH_SOP_PUSH_TLS_TPREL R_LARCH = 26 R_LARCH_SOP_PUSH_TLS_GOT R_LARCH = 27 R_LARCH_SOP_PUSH_TLS_GD R_LARCH = 28 R_LARCH_SOP_PUSH_PLT_PCREL R_LARCH = 29 R_LARCH_SOP_ASSERT R_LARCH = 30 R_LARCH_SOP_NOT R_LARCH = 31 R_LARCH_SOP_SUB R_LARCH = 32 R_LARCH_SOP_SL R_LARCH = 33 R_LARCH_SOP_SR R_LARCH = 34 R_LARCH_SOP_ADD R_LARCH = 35 R_LARCH_SOP_AND R_LARCH = 36 R_LARCH_SOP_IF_ELSE R_LARCH = 37 R_LARCH_SOP_POP_32_S_10_5 R_LARCH = 38 R_LARCH_SOP_POP_32_U_10_12 R_LARCH = 39 R_LARCH_SOP_POP_32_S_10_12 R_LARCH = 40 R_LARCH_SOP_POP_32_S_10_16 R_LARCH = 41 R_LARCH_SOP_POP_32_S_10_16_S2 R_LARCH = 42 R_LARCH_SOP_POP_32_S_5_20 R_LARCH = 43 R_LARCH_SOP_POP_32_S_0_5_10_16_S2 R_LARCH = 44 R_LARCH_SOP_POP_32_S_0_10_10_16_S2 R_LARCH = 45 R_LARCH_SOP_POP_32_U R_LARCH = 46 R_LARCH_ADD8 R_LARCH = 47 R_LARCH_ADD16 R_LARCH = 48 R_LARCH_ADD24 R_LARCH = 49 R_LARCH_ADD32 R_LARCH = 50 R_LARCH_ADD64 R_LARCH = 51 R_LARCH_SUB8 R_LARCH = 52 R_LARCH_SUB16 R_LARCH = 53 R_LARCH_SUB24 R_LARCH = 54 R_LARCH_SUB32 R_LARCH = 55 R_LARCH_SUB64 R_LARCH = 56 R_LARCH_GNU_VTINHERIT R_LARCH = 57 R_LARCH_GNU_VTENTRY R_LARCH = 58 R_LARCH_B16 R_LARCH = 64 R_LARCH_B21 R_LARCH = 65 R_LARCH_B26 R_LARCH = 66 R_LARCH_ABS_HI20 R_LARCH = 67 R_LARCH_ABS_LO12 R_LARCH = 68 R_LARCH_ABS64_LO20 R_LARCH = 69 R_LARCH_ABS64_HI12 R_LARCH = 70 R_LARCH_PCALA_HI20 R_LARCH = 71 R_LARCH_PCALA_LO12 R_LARCH = 72 R_LARCH_PCALA64_LO20 R_LARCH = 73 R_LARCH_PCALA64_HI12 R_LARCH = 74 R_LARCH_GOT_PC_HI20 R_LARCH = 75 R_LARCH_GOT_PC_LO12 R_LARCH = 76 R_LARCH_GOT64_PC_LO20 R_LARCH = 77 R_LARCH_GOT64_PC_HI12 R_LARCH = 78 R_LARCH_GOT_HI20 R_LARCH = 79 R_LARCH_GOT_LO12 R_LARCH = 80 R_LARCH_GOT64_LO20 R_LARCH = 81 R_LARCH_GOT64_HI12 R_LARCH = 82 R_LARCH_TLS_LE_HI20 R_LARCH = 83 R_LARCH_TLS_LE_LO12 R_LARCH = 84 R_LARCH_TLS_LE64_LO20 R_LARCH = 85 R_LARCH_TLS_LE64_HI12 R_LARCH = 86 R_LARCH_TLS_IE_PC_HI20 R_LARCH = 87 R_LARCH_TLS_IE_PC_LO12 R_LARCH = 88 R_LARCH_TLS_IE64_PC_LO20 R_LARCH = 89 R_LARCH_TLS_IE64_PC_HI12 R_LARCH = 90 R_LARCH_TLS_IE_HI20 R_LARCH = 91 R_LARCH_TLS_IE_LO12 R_LARCH = 92 R_LARCH_TLS_IE64_LO20 R_LARCH = 93 R_LARCH_TLS_IE64_HI12 R_LARCH = 94 R_LARCH_TLS_LD_PC_HI20 R_LARCH = 95 R_LARCH_TLS_LD_HI20 R_LARCH = 96 R_LARCH_TLS_GD_PC_HI20 R_LARCH = 97 R_LARCH_TLS_GD_HI20 R_LARCH = 98 R_LARCH_32_PCREL R_LARCH = 99 R_LARCH_RELAX R_LARCH = 100 ) ``` ### func (R\_LARCH) GoString 1.19 ``` func (i R_LARCH) GoString() string ``` ### func (R\_LARCH) String 1.19 ``` func (i R_LARCH) String() string ``` type R\_MIPS 1.6 ---------------- Relocation types for MIPS. 
``` type R_MIPS int ``` ``` const ( R_MIPS_NONE R_MIPS = 0 R_MIPS_16 R_MIPS = 1 R_MIPS_32 R_MIPS = 2 R_MIPS_REL32 R_MIPS = 3 R_MIPS_26 R_MIPS = 4 R_MIPS_HI16 R_MIPS = 5 /* high 16 bits of symbol value */ R_MIPS_LO16 R_MIPS = 6 /* low 16 bits of symbol value */ R_MIPS_GPREL16 R_MIPS = 7 /* GP-relative reference */ R_MIPS_LITERAL R_MIPS = 8 /* Reference to literal section */ R_MIPS_GOT16 R_MIPS = 9 /* Reference to global offset table */ R_MIPS_PC16 R_MIPS = 10 /* 16 bit PC relative reference */ R_MIPS_CALL16 R_MIPS = 11 /* 16 bit call through glbl offset tbl */ R_MIPS_GPREL32 R_MIPS = 12 R_MIPS_SHIFT5 R_MIPS = 16 R_MIPS_SHIFT6 R_MIPS = 17 R_MIPS_64 R_MIPS = 18 R_MIPS_GOT_DISP R_MIPS = 19 R_MIPS_GOT_PAGE R_MIPS = 20 R_MIPS_GOT_OFST R_MIPS = 21 R_MIPS_GOT_HI16 R_MIPS = 22 R_MIPS_GOT_LO16 R_MIPS = 23 R_MIPS_SUB R_MIPS = 24 R_MIPS_INSERT_A R_MIPS = 25 R_MIPS_INSERT_B R_MIPS = 26 R_MIPS_DELETE R_MIPS = 27 R_MIPS_HIGHER R_MIPS = 28 R_MIPS_HIGHEST R_MIPS = 29 R_MIPS_CALL_HI16 R_MIPS = 30 R_MIPS_CALL_LO16 R_MIPS = 31 R_MIPS_SCN_DISP R_MIPS = 32 R_MIPS_REL16 R_MIPS = 33 R_MIPS_ADD_IMMEDIATE R_MIPS = 34 R_MIPS_PJUMP R_MIPS = 35 R_MIPS_RELGOT R_MIPS = 36 R_MIPS_JALR R_MIPS = 37 R_MIPS_TLS_DTPMOD32 R_MIPS = 38 /* Module number 32 bit */ R_MIPS_TLS_DTPREL32 R_MIPS = 39 /* Module-relative offset 32 bit */ R_MIPS_TLS_DTPMOD64 R_MIPS = 40 /* Module number 64 bit */ R_MIPS_TLS_DTPREL64 R_MIPS = 41 /* Module-relative offset 64 bit */ R_MIPS_TLS_GD R_MIPS = 42 /* 16 bit GOT offset for GD */ R_MIPS_TLS_LDM R_MIPS = 43 /* 16 bit GOT offset for LDM */ R_MIPS_TLS_DTPREL_HI16 R_MIPS = 44 /* Module-relative offset, high 16 bits */ R_MIPS_TLS_DTPREL_LO16 R_MIPS = 45 /* Module-relative offset, low 16 bits */ R_MIPS_TLS_GOTTPREL R_MIPS = 46 /* 16 bit GOT offset for IE */ R_MIPS_TLS_TPREL32 R_MIPS = 47 /* TP-relative offset, 32 bit */ R_MIPS_TLS_TPREL64 R_MIPS = 48 /* TP-relative offset, 64 bit */ R_MIPS_TLS_TPREL_HI16 R_MIPS = 49 /* TP-relative offset, high 16 bits */ R_MIPS_TLS_TPREL_LO16 R_MIPS = 50 /* TP-relative offset, low 16 bits */ ) ``` ### func (R\_MIPS) GoString 1.6 ``` func (i R_MIPS) GoString() string ``` ### func (R\_MIPS) String 1.6 ``` func (i R_MIPS) String() string ``` type R\_PPC ----------- Relocation types for PowerPC. Values that are shared by both R\_PPC and R\_PPC64 are prefixed with R\_POWERPC\_ in the ELF standard. For the R\_PPC type, the relevant shared relocations have been renamed with the prefix R\_PPC\_. The original name follows the value in a comment. 
``` type R_PPC int ``` ``` const ( R_PPC_NONE R_PPC = 0 // R_POWERPC_NONE R_PPC_ADDR32 R_PPC = 1 // R_POWERPC_ADDR32 R_PPC_ADDR24 R_PPC = 2 // R_POWERPC_ADDR24 R_PPC_ADDR16 R_PPC = 3 // R_POWERPC_ADDR16 R_PPC_ADDR16_LO R_PPC = 4 // R_POWERPC_ADDR16_LO R_PPC_ADDR16_HI R_PPC = 5 // R_POWERPC_ADDR16_HI R_PPC_ADDR16_HA R_PPC = 6 // R_POWERPC_ADDR16_HA R_PPC_ADDR14 R_PPC = 7 // R_POWERPC_ADDR14 R_PPC_ADDR14_BRTAKEN R_PPC = 8 // R_POWERPC_ADDR14_BRTAKEN R_PPC_ADDR14_BRNTAKEN R_PPC = 9 // R_POWERPC_ADDR14_BRNTAKEN R_PPC_REL24 R_PPC = 10 // R_POWERPC_REL24 R_PPC_REL14 R_PPC = 11 // R_POWERPC_REL14 R_PPC_REL14_BRTAKEN R_PPC = 12 // R_POWERPC_REL14_BRTAKEN R_PPC_REL14_BRNTAKEN R_PPC = 13 // R_POWERPC_REL14_BRNTAKEN R_PPC_GOT16 R_PPC = 14 // R_POWERPC_GOT16 R_PPC_GOT16_LO R_PPC = 15 // R_POWERPC_GOT16_LO R_PPC_GOT16_HI R_PPC = 16 // R_POWERPC_GOT16_HI R_PPC_GOT16_HA R_PPC = 17 // R_POWERPC_GOT16_HA R_PPC_PLTREL24 R_PPC = 18 R_PPC_COPY R_PPC = 19 // R_POWERPC_COPY R_PPC_GLOB_DAT R_PPC = 20 // R_POWERPC_GLOB_DAT R_PPC_JMP_SLOT R_PPC = 21 // R_POWERPC_JMP_SLOT R_PPC_RELATIVE R_PPC = 22 // R_POWERPC_RELATIVE R_PPC_LOCAL24PC R_PPC = 23 R_PPC_UADDR32 R_PPC = 24 // R_POWERPC_UADDR32 R_PPC_UADDR16 R_PPC = 25 // R_POWERPC_UADDR16 R_PPC_REL32 R_PPC = 26 // R_POWERPC_REL32 R_PPC_PLT32 R_PPC = 27 // R_POWERPC_PLT32 R_PPC_PLTREL32 R_PPC = 28 // R_POWERPC_PLTREL32 R_PPC_PLT16_LO R_PPC = 29 // R_POWERPC_PLT16_LO R_PPC_PLT16_HI R_PPC = 30 // R_POWERPC_PLT16_HI R_PPC_PLT16_HA R_PPC = 31 // R_POWERPC_PLT16_HA R_PPC_SDAREL16 R_PPC = 32 R_PPC_SECTOFF R_PPC = 33 // R_POWERPC_SECTOFF R_PPC_SECTOFF_LO R_PPC = 34 // R_POWERPC_SECTOFF_LO R_PPC_SECTOFF_HI R_PPC = 35 // R_POWERPC_SECTOFF_HI R_PPC_SECTOFF_HA R_PPC = 36 // R_POWERPC_SECTOFF_HA R_PPC_TLS R_PPC = 67 // R_POWERPC_TLS R_PPC_DTPMOD32 R_PPC = 68 // R_POWERPC_DTPMOD32 R_PPC_TPREL16 R_PPC = 69 // R_POWERPC_TPREL16 R_PPC_TPREL16_LO R_PPC = 70 // R_POWERPC_TPREL16_LO R_PPC_TPREL16_HI R_PPC = 71 // R_POWERPC_TPREL16_HI R_PPC_TPREL16_HA R_PPC = 72 // R_POWERPC_TPREL16_HA R_PPC_TPREL32 R_PPC = 73 // R_POWERPC_TPREL32 R_PPC_DTPREL16 R_PPC = 74 // R_POWERPC_DTPREL16 R_PPC_DTPREL16_LO R_PPC = 75 // R_POWERPC_DTPREL16_LO R_PPC_DTPREL16_HI R_PPC = 76 // R_POWERPC_DTPREL16_HI R_PPC_DTPREL16_HA R_PPC = 77 // R_POWERPC_DTPREL16_HA R_PPC_DTPREL32 R_PPC = 78 // R_POWERPC_DTPREL32 R_PPC_GOT_TLSGD16 R_PPC = 79 // R_POWERPC_GOT_TLSGD16 R_PPC_GOT_TLSGD16_LO R_PPC = 80 // R_POWERPC_GOT_TLSGD16_LO R_PPC_GOT_TLSGD16_HI R_PPC = 81 // R_POWERPC_GOT_TLSGD16_HI R_PPC_GOT_TLSGD16_HA R_PPC = 82 // R_POWERPC_GOT_TLSGD16_HA R_PPC_GOT_TLSLD16 R_PPC = 83 // R_POWERPC_GOT_TLSLD16 R_PPC_GOT_TLSLD16_LO R_PPC = 84 // R_POWERPC_GOT_TLSLD16_LO R_PPC_GOT_TLSLD16_HI R_PPC = 85 // R_POWERPC_GOT_TLSLD16_HI R_PPC_GOT_TLSLD16_HA R_PPC = 86 // R_POWERPC_GOT_TLSLD16_HA R_PPC_GOT_TPREL16 R_PPC = 87 // R_POWERPC_GOT_TPREL16 R_PPC_GOT_TPREL16_LO R_PPC = 88 // R_POWERPC_GOT_TPREL16_LO R_PPC_GOT_TPREL16_HI R_PPC = 89 // R_POWERPC_GOT_TPREL16_HI R_PPC_GOT_TPREL16_HA R_PPC = 90 // R_POWERPC_GOT_TPREL16_HA R_PPC_EMB_NADDR32 R_PPC = 101 R_PPC_EMB_NADDR16 R_PPC = 102 R_PPC_EMB_NADDR16_LO R_PPC = 103 R_PPC_EMB_NADDR16_HI R_PPC = 104 R_PPC_EMB_NADDR16_HA R_PPC = 105 R_PPC_EMB_SDAI16 R_PPC = 106 R_PPC_EMB_SDA2I16 R_PPC = 107 R_PPC_EMB_SDA2REL R_PPC = 108 R_PPC_EMB_SDA21 R_PPC = 109 R_PPC_EMB_MRKREF R_PPC = 110 R_PPC_EMB_RELSEC16 R_PPC = 111 R_PPC_EMB_RELST_LO R_PPC = 112 R_PPC_EMB_RELST_HI R_PPC = 113 R_PPC_EMB_RELST_HA R_PPC = 114 R_PPC_EMB_BIT_FLD R_PPC = 115 R_PPC_EMB_RELSDA R_PPC = 116 ) ``` ### func (R\_PPC) GoString ``` 
func (i R_PPC) GoString() string ``` ### func (R\_PPC) String ``` func (i R_PPC) String() string ``` type R\_PPC64 1.5 ----------------- Relocation types for 64-bit PowerPC or Power Architecture processors. Values that are shared by both R\_PPC and R\_PPC64 are prefixed with R\_POWERPC\_ in the ELF standard. For the R\_PPC64 type, the relevant shared relocations have been renamed with the prefix R\_PPC64\_. The original name follows the value in a comment. ``` type R_PPC64 int ``` ``` const ( R_PPC64_NONE R_PPC64 = 0 // R_POWERPC_NONE R_PPC64_ADDR32 R_PPC64 = 1 // R_POWERPC_ADDR32 R_PPC64_ADDR24 R_PPC64 = 2 // R_POWERPC_ADDR24 R_PPC64_ADDR16 R_PPC64 = 3 // R_POWERPC_ADDR16 R_PPC64_ADDR16_LO R_PPC64 = 4 // R_POWERPC_ADDR16_LO R_PPC64_ADDR16_HI R_PPC64 = 5 // R_POWERPC_ADDR16_HI R_PPC64_ADDR16_HA R_PPC64 = 6 // R_POWERPC_ADDR16_HA R_PPC64_ADDR14 R_PPC64 = 7 // R_POWERPC_ADDR14 R_PPC64_ADDR14_BRTAKEN R_PPC64 = 8 // R_POWERPC_ADDR14_BRTAKEN R_PPC64_ADDR14_BRNTAKEN R_PPC64 = 9 // R_POWERPC_ADDR14_BRNTAKEN R_PPC64_REL24 R_PPC64 = 10 // R_POWERPC_REL24 R_PPC64_REL14 R_PPC64 = 11 // R_POWERPC_REL14 R_PPC64_REL14_BRTAKEN R_PPC64 = 12 // R_POWERPC_REL14_BRTAKEN R_PPC64_REL14_BRNTAKEN R_PPC64 = 13 // R_POWERPC_REL14_BRNTAKEN R_PPC64_GOT16 R_PPC64 = 14 // R_POWERPC_GOT16 R_PPC64_GOT16_LO R_PPC64 = 15 // R_POWERPC_GOT16_LO R_PPC64_GOT16_HI R_PPC64 = 16 // R_POWERPC_GOT16_HI R_PPC64_GOT16_HA R_PPC64 = 17 // R_POWERPC_GOT16_HA R_PPC64_COPY R_PPC64 = 19 // R_POWERPC_COPY R_PPC64_GLOB_DAT R_PPC64 = 20 // R_POWERPC_GLOB_DAT R_PPC64_JMP_SLOT R_PPC64 = 21 // R_POWERPC_JMP_SLOT R_PPC64_RELATIVE R_PPC64 = 22 // R_POWERPC_RELATIVE R_PPC64_UADDR32 R_PPC64 = 24 // R_POWERPC_UADDR32 R_PPC64_UADDR16 R_PPC64 = 25 // R_POWERPC_UADDR16 R_PPC64_REL32 R_PPC64 = 26 // R_POWERPC_REL32 R_PPC64_PLT32 R_PPC64 = 27 // R_POWERPC_PLT32 R_PPC64_PLTREL32 R_PPC64 = 28 // R_POWERPC_PLTREL32 R_PPC64_PLT16_LO R_PPC64 = 29 // R_POWERPC_PLT16_LO R_PPC64_PLT16_HI R_PPC64 = 30 // R_POWERPC_PLT16_HI R_PPC64_PLT16_HA R_PPC64 = 31 // R_POWERPC_PLT16_HA R_PPC64_SECTOFF R_PPC64 = 33 // R_POWERPC_SECTOFF R_PPC64_SECTOFF_LO R_PPC64 = 34 // R_POWERPC_SECTOFF_LO R_PPC64_SECTOFF_HI R_PPC64 = 35 // R_POWERPC_SECTOFF_HI R_PPC64_SECTOFF_HA R_PPC64 = 36 // R_POWERPC_SECTOFF_HA R_PPC64_REL30 R_PPC64 = 37 // R_POWERPC_ADDR30 R_PPC64_ADDR64 R_PPC64 = 38 R_PPC64_ADDR16_HIGHER R_PPC64 = 39 R_PPC64_ADDR16_HIGHERA R_PPC64 = 40 R_PPC64_ADDR16_HIGHEST R_PPC64 = 41 R_PPC64_ADDR16_HIGHESTA R_PPC64 = 42 R_PPC64_UADDR64 R_PPC64 = 43 R_PPC64_REL64 R_PPC64 = 44 R_PPC64_PLT64 R_PPC64 = 45 R_PPC64_PLTREL64 R_PPC64 = 46 R_PPC64_TOC16 R_PPC64 = 47 R_PPC64_TOC16_LO R_PPC64 = 48 R_PPC64_TOC16_HI R_PPC64 = 49 R_PPC64_TOC16_HA R_PPC64 = 50 R_PPC64_TOC R_PPC64 = 51 R_PPC64_PLTGOT16 R_PPC64 = 52 R_PPC64_PLTGOT16_LO R_PPC64 = 53 R_PPC64_PLTGOT16_HI R_PPC64 = 54 R_PPC64_PLTGOT16_HA R_PPC64 = 55 R_PPC64_ADDR16_DS R_PPC64 = 56 R_PPC64_ADDR16_LO_DS R_PPC64 = 57 R_PPC64_GOT16_DS R_PPC64 = 58 R_PPC64_GOT16_LO_DS R_PPC64 = 59 R_PPC64_PLT16_LO_DS R_PPC64 = 60 R_PPC64_SECTOFF_DS R_PPC64 = 61 R_PPC64_SECTOFF_LO_DS R_PPC64 = 62 R_PPC64_TOC16_DS R_PPC64 = 63 R_PPC64_TOC16_LO_DS R_PPC64 = 64 R_PPC64_PLTGOT16_DS R_PPC64 = 65 R_PPC64_PLTGOT_LO_DS R_PPC64 = 66 R_PPC64_TLS R_PPC64 = 67 // R_POWERPC_TLS R_PPC64_DTPMOD64 R_PPC64 = 68 // R_POWERPC_DTPMOD64 R_PPC64_TPREL16 R_PPC64 = 69 // R_POWERPC_TPREL16 R_PPC64_TPREL16_LO R_PPC64 = 70 // R_POWERPC_TPREL16_LO R_PPC64_TPREL16_HI R_PPC64 = 71 // R_POWERPC_TPREL16_HI R_PPC64_TPREL16_HA R_PPC64 = 72 // R_POWERPC_TPREL16_HA R_PPC64_TPREL64 R_PPC64 = 
73 // R_POWERPC_TPREL64 R_PPC64_DTPREL16 R_PPC64 = 74 // R_POWERPC_DTPREL16 R_PPC64_DTPREL16_LO R_PPC64 = 75 // R_POWERPC_DTPREL16_LO R_PPC64_DTPREL16_HI R_PPC64 = 76 // R_POWERPC_DTPREL16_HI R_PPC64_DTPREL16_HA R_PPC64 = 77 // R_POWERPC_DTPREL16_HA R_PPC64_DTPREL64 R_PPC64 = 78 // R_POWERPC_DTPREL64 R_PPC64_GOT_TLSGD16 R_PPC64 = 79 // R_POWERPC_GOT_TLSGD16 R_PPC64_GOT_TLSGD16_LO R_PPC64 = 80 // R_POWERPC_GOT_TLSGD16_LO R_PPC64_GOT_TLSGD16_HI R_PPC64 = 81 // R_POWERPC_GOT_TLSGD16_HI R_PPC64_GOT_TLSGD16_HA R_PPC64 = 82 // R_POWERPC_GOT_TLSGD16_HA R_PPC64_GOT_TLSLD16 R_PPC64 = 83 // R_POWERPC_GOT_TLSLD16 R_PPC64_GOT_TLSLD16_LO R_PPC64 = 84 // R_POWERPC_GOT_TLSLD16_LO R_PPC64_GOT_TLSLD16_HI R_PPC64 = 85 // R_POWERPC_GOT_TLSLD16_HI R_PPC64_GOT_TLSLD16_HA R_PPC64 = 86 // R_POWERPC_GOT_TLSLD16_HA R_PPC64_GOT_TPREL16_DS R_PPC64 = 87 // R_POWERPC_GOT_TPREL16_DS R_PPC64_GOT_TPREL16_LO_DS R_PPC64 = 88 // R_POWERPC_GOT_TPREL16_LO_DS R_PPC64_GOT_TPREL16_HI R_PPC64 = 89 // R_POWERPC_GOT_TPREL16_HI R_PPC64_GOT_TPREL16_HA R_PPC64 = 90 // R_POWERPC_GOT_TPREL16_HA R_PPC64_GOT_DTPREL16_DS R_PPC64 = 91 // R_POWERPC_GOT_DTPREL16_DS R_PPC64_GOT_DTPREL16_LO_DS R_PPC64 = 92 // R_POWERPC_GOT_DTPREL16_LO_DS R_PPC64_GOT_DTPREL16_HI R_PPC64 = 93 // R_POWERPC_GOT_DTPREL16_HI R_PPC64_GOT_DTPREL16_HA R_PPC64 = 94 // R_POWERPC_GOT_DTPREL16_HA R_PPC64_TPREL16_DS R_PPC64 = 95 R_PPC64_TPREL16_LO_DS R_PPC64 = 96 R_PPC64_TPREL16_HIGHER R_PPC64 = 97 R_PPC64_TPREL16_HIGHERA R_PPC64 = 98 R_PPC64_TPREL16_HIGHEST R_PPC64 = 99 R_PPC64_TPREL16_HIGHESTA R_PPC64 = 100 R_PPC64_DTPREL16_DS R_PPC64 = 101 R_PPC64_DTPREL16_LO_DS R_PPC64 = 102 R_PPC64_DTPREL16_HIGHER R_PPC64 = 103 R_PPC64_DTPREL16_HIGHERA R_PPC64 = 104 R_PPC64_DTPREL16_HIGHEST R_PPC64 = 105 R_PPC64_DTPREL16_HIGHESTA R_PPC64 = 106 R_PPC64_TLSGD R_PPC64 = 107 R_PPC64_TLSLD R_PPC64 = 108 R_PPC64_TOCSAVE R_PPC64 = 109 R_PPC64_ADDR16_HIGH R_PPC64 = 110 R_PPC64_ADDR16_HIGHA R_PPC64 = 111 R_PPC64_TPREL16_HIGH R_PPC64 = 112 R_PPC64_TPREL16_HIGHA R_PPC64 = 113 R_PPC64_DTPREL16_HIGH R_PPC64 = 114 R_PPC64_DTPREL16_HIGHA R_PPC64 = 115 R_PPC64_REL24_NOTOC R_PPC64 = 116 R_PPC64_ADDR64_LOCAL R_PPC64 = 117 R_PPC64_ENTRY R_PPC64 = 118 R_PPC64_PLTSEQ R_PPC64 = 119 R_PPC64_PLTCALL R_PPC64 = 120 R_PPC64_PLTSEQ_NOTOC R_PPC64 = 121 R_PPC64_PLTCALL_NOTOC R_PPC64 = 122 R_PPC64_PCREL_OPT R_PPC64 = 123 R_PPC64_D34 R_PPC64 = 128 R_PPC64_D34_LO R_PPC64 = 129 R_PPC64_D34_HI30 R_PPC64 = 130 R_PPC64_D34_HA30 R_PPC64 = 131 R_PPC64_PCREL34 R_PPC64 = 132 R_PPC64_GOT_PCREL34 R_PPC64 = 133 R_PPC64_PLT_PCREL34 R_PPC64 = 134 R_PPC64_PLT_PCREL34_NOTOC R_PPC64 = 135 R_PPC64_ADDR16_HIGHER34 R_PPC64 = 136 R_PPC64_ADDR16_HIGHERA34 R_PPC64 = 137 R_PPC64_ADDR16_HIGHEST34 R_PPC64 = 138 R_PPC64_ADDR16_HIGHESTA34 R_PPC64 = 139 R_PPC64_REL16_HIGHER34 R_PPC64 = 140 R_PPC64_REL16_HIGHERA34 R_PPC64 = 141 R_PPC64_REL16_HIGHEST34 R_PPC64 = 142 R_PPC64_REL16_HIGHESTA34 R_PPC64 = 143 R_PPC64_D28 R_PPC64 = 144 R_PPC64_PCREL28 R_PPC64 = 145 R_PPC64_TPREL34 R_PPC64 = 146 R_PPC64_DTPREL34 R_PPC64 = 147 R_PPC64_GOT_TLSGD_PCREL34 R_PPC64 = 148 R_PPC64_GOT_TLSLD_PCREL34 R_PPC64 = 149 R_PPC64_GOT_TPREL_PCREL34 R_PPC64 = 150 R_PPC64_GOT_DTPREL_PCREL34 R_PPC64 = 151 R_PPC64_REL16_HIGH R_PPC64 = 240 R_PPC64_REL16_HIGHA R_PPC64 = 241 R_PPC64_REL16_HIGHER R_PPC64 = 242 R_PPC64_REL16_HIGHERA R_PPC64 = 243 R_PPC64_REL16_HIGHEST R_PPC64 = 244 R_PPC64_REL16_HIGHESTA R_PPC64 = 245 R_PPC64_REL16DX_HA R_PPC64 = 246 // R_POWERPC_REL16DX_HA R_PPC64_JMP_IREL R_PPC64 = 247 R_PPC64_IRELATIVE R_PPC64 = 248 // R_POWERPC_IRELATIVE R_PPC64_REL16 R_PPC64 = 
249 // R_POWERPC_REL16 R_PPC64_REL16_LO R_PPC64 = 250 // R_POWERPC_REL16_LO R_PPC64_REL16_HI R_PPC64 = 251 // R_POWERPC_REL16_HI R_PPC64_REL16_HA R_PPC64 = 252 // R_POWERPC_REL16_HA R_PPC64_GNU_VTINHERIT R_PPC64 = 253 R_PPC64_GNU_VTENTRY R_PPC64 = 254 ) ``` ### func (R\_PPC64) GoString 1.5 ``` func (i R_PPC64) GoString() string ``` ### func (R\_PPC64) String 1.5 ``` func (i R_PPC64) String() string ``` type R\_RISCV 1.11 ------------------ Relocation types for RISC-V processors. ``` type R_RISCV int ``` ``` const ( R_RISCV_NONE R_RISCV = 0 /* No relocation. */ R_RISCV_32 R_RISCV = 1 /* Add 32 bit zero extended symbol value */ R_RISCV_64 R_RISCV = 2 /* Add 64 bit symbol value. */ R_RISCV_RELATIVE R_RISCV = 3 /* Add load address of shared object. */ R_RISCV_COPY R_RISCV = 4 /* Copy data from shared object. */ R_RISCV_JUMP_SLOT R_RISCV = 5 /* Set GOT entry to code address. */ R_RISCV_TLS_DTPMOD32 R_RISCV = 6 /* 32 bit ID of module containing symbol */ R_RISCV_TLS_DTPMOD64 R_RISCV = 7 /* ID of module containing symbol */ R_RISCV_TLS_DTPREL32 R_RISCV = 8 /* 32 bit relative offset in TLS block */ R_RISCV_TLS_DTPREL64 R_RISCV = 9 /* Relative offset in TLS block */ R_RISCV_TLS_TPREL32 R_RISCV = 10 /* 32 bit relative offset in static TLS block */ R_RISCV_TLS_TPREL64 R_RISCV = 11 /* Relative offset in static TLS block */ R_RISCV_BRANCH R_RISCV = 16 /* PC-relative branch */ R_RISCV_JAL R_RISCV = 17 /* PC-relative jump */ R_RISCV_CALL R_RISCV = 18 /* PC-relative call */ R_RISCV_CALL_PLT R_RISCV = 19 /* PC-relative call (PLT) */ R_RISCV_GOT_HI20 R_RISCV = 20 /* PC-relative GOT reference */ R_RISCV_TLS_GOT_HI20 R_RISCV = 21 /* PC-relative TLS IE GOT offset */ R_RISCV_TLS_GD_HI20 R_RISCV = 22 /* PC-relative TLS GD reference */ R_RISCV_PCREL_HI20 R_RISCV = 23 /* PC-relative reference */ R_RISCV_PCREL_LO12_I R_RISCV = 24 /* PC-relative reference */ R_RISCV_PCREL_LO12_S R_RISCV = 25 /* PC-relative reference */ R_RISCV_HI20 R_RISCV = 26 /* Absolute address */ R_RISCV_LO12_I R_RISCV = 27 /* Absolute address */ R_RISCV_LO12_S R_RISCV = 28 /* Absolute address */ R_RISCV_TPREL_HI20 R_RISCV = 29 /* TLS LE thread offset */ R_RISCV_TPREL_LO12_I R_RISCV = 30 /* TLS LE thread offset */ R_RISCV_TPREL_LO12_S R_RISCV = 31 /* TLS LE thread offset */ R_RISCV_TPREL_ADD R_RISCV = 32 /* TLS LE thread usage */ R_RISCV_ADD8 R_RISCV = 33 /* 8-bit label addition */ R_RISCV_ADD16 R_RISCV = 34 /* 16-bit label addition */ R_RISCV_ADD32 R_RISCV = 35 /* 32-bit label addition */ R_RISCV_ADD64 R_RISCV = 36 /* 64-bit label addition */ R_RISCV_SUB8 R_RISCV = 37 /* 8-bit label subtraction */ R_RISCV_SUB16 R_RISCV = 38 /* 16-bit label subtraction */ R_RISCV_SUB32 R_RISCV = 39 /* 32-bit label subtraction */ R_RISCV_SUB64 R_RISCV = 40 /* 64-bit label subtraction */ R_RISCV_GNU_VTINHERIT R_RISCV = 41 /* GNU C++ vtable hierarchy */ R_RISCV_GNU_VTENTRY R_RISCV = 42 /* GNU C++ vtable member usage */ R_RISCV_ALIGN R_RISCV = 43 /* Alignment statement */ R_RISCV_RVC_BRANCH R_RISCV = 44 /* PC-relative branch offset */ R_RISCV_RVC_JUMP R_RISCV = 45 /* PC-relative jump offset */ R_RISCV_RVC_LUI R_RISCV = 46 /* Absolute address */ R_RISCV_GPREL_I R_RISCV = 47 /* GP-relative reference */ R_RISCV_GPREL_S R_RISCV = 48 /* GP-relative reference */ R_RISCV_TPREL_I R_RISCV = 49 /* TP-relative TLS LE load */ R_RISCV_TPREL_S R_RISCV = 50 /* TP-relative TLS LE store */ R_RISCV_RELAX R_RISCV = 51 /* Instruction pair can be relaxed */ R_RISCV_SUB6 R_RISCV = 52 /* Local label subtraction */ R_RISCV_SET6 R_RISCV = 53 /* Local label subtraction */ R_RISCV_SET8 
R_RISCV = 54 /* Local label subtraction */ R_RISCV_SET16 R_RISCV = 55 /* Local label subtraction */ R_RISCV_SET32 R_RISCV = 56 /* Local label subtraction */ R_RISCV_32_PCREL R_RISCV = 57 /* 32-bit PC relative */ ) ``` ### func (R\_RISCV) GoString 1.11 ``` func (i R_RISCV) GoString() string ``` ### func (R\_RISCV) String 1.11 ``` func (i R_RISCV) String() string ``` type R\_SPARC ------------- Relocation types for SPARC. ``` type R_SPARC int ``` ``` const ( R_SPARC_NONE R_SPARC = 0 R_SPARC_8 R_SPARC = 1 R_SPARC_16 R_SPARC = 2 R_SPARC_32 R_SPARC = 3 R_SPARC_DISP8 R_SPARC = 4 R_SPARC_DISP16 R_SPARC = 5 R_SPARC_DISP32 R_SPARC = 6 R_SPARC_WDISP30 R_SPARC = 7 R_SPARC_WDISP22 R_SPARC = 8 R_SPARC_HI22 R_SPARC = 9 R_SPARC_22 R_SPARC = 10 R_SPARC_13 R_SPARC = 11 R_SPARC_LO10 R_SPARC = 12 R_SPARC_GOT10 R_SPARC = 13 R_SPARC_GOT13 R_SPARC = 14 R_SPARC_GOT22 R_SPARC = 15 R_SPARC_PC10 R_SPARC = 16 R_SPARC_PC22 R_SPARC = 17 R_SPARC_WPLT30 R_SPARC = 18 R_SPARC_COPY R_SPARC = 19 R_SPARC_GLOB_DAT R_SPARC = 20 R_SPARC_JMP_SLOT R_SPARC = 21 R_SPARC_RELATIVE R_SPARC = 22 R_SPARC_UA32 R_SPARC = 23 R_SPARC_PLT32 R_SPARC = 24 R_SPARC_HIPLT22 R_SPARC = 25 R_SPARC_LOPLT10 R_SPARC = 26 R_SPARC_PCPLT32 R_SPARC = 27 R_SPARC_PCPLT22 R_SPARC = 28 R_SPARC_PCPLT10 R_SPARC = 29 R_SPARC_10 R_SPARC = 30 R_SPARC_11 R_SPARC = 31 R_SPARC_64 R_SPARC = 32 R_SPARC_OLO10 R_SPARC = 33 R_SPARC_HH22 R_SPARC = 34 R_SPARC_HM10 R_SPARC = 35 R_SPARC_LM22 R_SPARC = 36 R_SPARC_PC_HH22 R_SPARC = 37 R_SPARC_PC_HM10 R_SPARC = 38 R_SPARC_PC_LM22 R_SPARC = 39 R_SPARC_WDISP16 R_SPARC = 40 R_SPARC_WDISP19 R_SPARC = 41 R_SPARC_GLOB_JMP R_SPARC = 42 R_SPARC_7 R_SPARC = 43 R_SPARC_5 R_SPARC = 44 R_SPARC_6 R_SPARC = 45 R_SPARC_DISP64 R_SPARC = 46 R_SPARC_PLT64 R_SPARC = 47 R_SPARC_HIX22 R_SPARC = 48 R_SPARC_LOX10 R_SPARC = 49 R_SPARC_H44 R_SPARC = 50 R_SPARC_M44 R_SPARC = 51 R_SPARC_L44 R_SPARC = 52 R_SPARC_REGISTER R_SPARC = 53 R_SPARC_UA64 R_SPARC = 54 R_SPARC_UA16 R_SPARC = 55 ) ``` ### func (R\_SPARC) GoString ``` func (i R_SPARC) GoString() string ``` ### func (R\_SPARC) String ``` func (i R_SPARC) String() string ``` type R\_X86\_64 --------------- Relocation types for x86-64. ``` type R_X86_64 int ``` ``` const ( R_X86_64_NONE R_X86_64 = 0 /* No relocation. */ R_X86_64_64 R_X86_64 = 1 /* Add 64 bit symbol value. */ R_X86_64_PC32 R_X86_64 = 2 /* PC-relative 32 bit signed sym value. */ R_X86_64_GOT32 R_X86_64 = 3 /* PC-relative 32 bit GOT offset. */ R_X86_64_PLT32 R_X86_64 = 4 /* PC-relative 32 bit PLT offset. */ R_X86_64_COPY R_X86_64 = 5 /* Copy data from shared object. */ R_X86_64_GLOB_DAT R_X86_64 = 6 /* Set GOT entry to data address. */ R_X86_64_JMP_SLOT R_X86_64 = 7 /* Set GOT entry to code address. */ R_X86_64_RELATIVE R_X86_64 = 8 /* Add load address of shared object. */ R_X86_64_GOTPCREL R_X86_64 = 9 /* Add 32 bit signed pcrel offset to GOT. 
*/ R_X86_64_32 R_X86_64 = 10 /* Add 32 bit zero extended symbol value */ R_X86_64_32S R_X86_64 = 11 /* Add 32 bit sign extended symbol value */ R_X86_64_16 R_X86_64 = 12 /* Add 16 bit zero extended symbol value */ R_X86_64_PC16 R_X86_64 = 13 /* Add 16 bit signed extended pc relative symbol value */ R_X86_64_8 R_X86_64 = 14 /* Add 8 bit zero extended symbol value */ R_X86_64_PC8 R_X86_64 = 15 /* Add 8 bit signed extended pc relative symbol value */ R_X86_64_DTPMOD64 R_X86_64 = 16 /* ID of module containing symbol */ R_X86_64_DTPOFF64 R_X86_64 = 17 /* Offset in TLS block */ R_X86_64_TPOFF64 R_X86_64 = 18 /* Offset in static TLS block */ R_X86_64_TLSGD R_X86_64 = 19 /* PC relative offset to GD GOT entry */ R_X86_64_TLSLD R_X86_64 = 20 /* PC relative offset to LD GOT entry */ R_X86_64_DTPOFF32 R_X86_64 = 21 /* Offset in TLS block */ R_X86_64_GOTTPOFF R_X86_64 = 22 /* PC relative offset to IE GOT entry */ R_X86_64_TPOFF32 R_X86_64 = 23 /* Offset in static TLS block */ R_X86_64_PC64 R_X86_64 = 24 /* PC relative 64-bit sign extended symbol value. */ R_X86_64_GOTOFF64 R_X86_64 = 25 R_X86_64_GOTPC32 R_X86_64 = 26 R_X86_64_GOT64 R_X86_64 = 27 R_X86_64_GOTPCREL64 R_X86_64 = 28 R_X86_64_GOTPC64 R_X86_64 = 29 R_X86_64_GOTPLT64 R_X86_64 = 30 R_X86_64_PLTOFF64 R_X86_64 = 31 R_X86_64_SIZE32 R_X86_64 = 32 R_X86_64_SIZE64 R_X86_64 = 33 R_X86_64_GOTPC32_TLSDESC R_X86_64 = 34 R_X86_64_TLSDESC_CALL R_X86_64 = 35 R_X86_64_TLSDESC R_X86_64 = 36 R_X86_64_IRELATIVE R_X86_64 = 37 R_X86_64_RELATIVE64 R_X86_64 = 38 R_X86_64_PC32_BND R_X86_64 = 39 R_X86_64_PLT32_BND R_X86_64 = 40 R_X86_64_GOTPCRELX R_X86_64 = 41 R_X86_64_REX_GOTPCRELX R_X86_64 = 42 ) ``` ### func (R\_X86\_64) GoString ``` func (i R_X86_64) GoString() string ``` ### func (R\_X86\_64) String ``` func (i R_X86_64) String() string ``` type Rel32 ---------- ELF32 Relocations that don't need an addend field. ``` type Rel32 struct { Off uint32 /* Location to be relocated. */ Info uint32 /* Relocation type and symbol index. */ } ``` type Rel64 ---------- ELF64 relocations that don't need an addend field. ``` type Rel64 struct { Off uint64 /* Location to be relocated. */ Info uint64 /* Relocation type and symbol index. */ } ``` type Rela32 ----------- ELF32 Relocations that need an addend field. ``` type Rela32 struct { Off uint32 /* Location to be relocated. */ Info uint32 /* Relocation type and symbol index. */ Addend int32 /* Addend. */ } ``` type Rela64 ----------- ELF64 relocations that need an addend field. ``` type Rela64 struct { Off uint64 /* Location to be relocated. */ Info uint64 /* Relocation type and symbol index. */ Addend int64 /* Addend. */ } ``` type Section ------------ A Section represents a single section in an ELF file. ``` type Section struct { SectionHeader // Embed ReaderAt for ReadAt method. // Do not embed SectionReader directly // to avoid having Read and Seek. // If a client wants Read and Seek it must use // Open() to avoid fighting over the seek offset // with other clients. // // ReaderAt may be nil if the section is not easily available // in a random-access form. For example, a compressed section // may have a nil ReaderAt. io.ReaderAt // contains filtered or unexported fields } ``` ### func (\*Section) Data ``` func (s *Section) Data() ([]byte, error) ``` Data reads and returns the contents of the ELF section. Even if the section is stored compressed in the ELF file, Data returns uncompressed data. For an SHT\_NOBITS section, Data always returns a non-nil error. 
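The relocation constants above, the Rel/Rela entry layouts, and Section.Data come together when inspecting a relocation section by hand. The following is a minimal sketch, assuming a hypothetical x86-64 relocatable object `main.o` that contains a `.rela.text` section; it reads the section with Data and unpacks each Rela64 Info field using the package's R_SYM64 and R_TYPE64 helpers.

```
package main

import (
	"bytes"
	"debug/elf"
	"encoding/binary"
	"fmt"
	"log"
)

func main() {
	// Hypothetical input: an x86-64 relocatable object with a .rela.text section.
	f, err := elf.Open("main.o")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	sec := f.Section(".rela.text")
	if sec == nil {
		log.Fatal("no .rela.text section")
	}

	// Data returns the section contents, decompressed if necessary.
	data, err := sec.Data()
	if err != nil {
		log.Fatal(err)
	}

	// Each entry is a Rela64; Info packs the symbol index and relocation type.
	r := bytes.NewReader(data)
	for {
		var rela elf.Rela64
		if err := binary.Read(r, f.ByteOrder, &rela); err != nil {
			break // io.EOF once all entries have been read
		}
		typ := elf.R_X86_64(elf.R_TYPE64(rela.Info))
		sym := elf.R_SYM64(rela.Info)
		fmt.Printf("off=%#x sym=%d type=%s addend=%d\n", rela.Off, sym, typ, rela.Addend)
	}
}
```

Open, documented next, provides the same contents as a streaming io.ReadSeeker when loading the whole section into memory with Data is not desirable.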
### func (\*Section) Open ``` func (s *Section) Open() io.ReadSeeker ``` Open returns a new ReadSeeker reading the ELF section. Even if the section is stored compressed in the ELF file, the ReadSeeker reads uncompressed data. For an SHT\_NOBITS section, all calls to the opened reader will return a non-nil error. type Section32 -------------- ELF32 Section header. ``` type Section32 struct { Name uint32 /* Section name (index into the section header string table). */ Type uint32 /* Section type. */ Flags uint32 /* Section flags. */ Addr uint32 /* Address in memory image. */ Off uint32 /* Offset in file. */ Size uint32 /* Size in bytes. */ Link uint32 /* Index of a related section. */ Info uint32 /* Depends on section type. */ Addralign uint32 /* Alignment in bytes. */ Entsize uint32 /* Size of each entry in section. */ } ``` type Section64 -------------- ELF64 Section header. ``` type Section64 struct { Name uint32 /* Section name (index into the section header string table). */ Type uint32 /* Section type. */ Flags uint64 /* Section flags. */ Addr uint64 /* Address in memory image. */ Off uint64 /* Offset in file. */ Size uint64 /* Size in bytes. */ Link uint32 /* Index of a related section. */ Info uint32 /* Depends on section type. */ Addralign uint64 /* Alignment in bytes. */ Entsize uint64 /* Size of each entry in section. */ } ``` type SectionFlag ---------------- Section flags. ``` type SectionFlag uint32 ``` ``` const ( SHF_WRITE SectionFlag = 0x1 /* Section contains writable data. */ SHF_ALLOC SectionFlag = 0x2 /* Section occupies memory. */ SHF_EXECINSTR SectionFlag = 0x4 /* Section contains instructions. */ SHF_MERGE SectionFlag = 0x10 /* Section may be merged. */ SHF_STRINGS SectionFlag = 0x20 /* Section contains strings. */ SHF_INFO_LINK SectionFlag = 0x40 /* sh_info holds section index. */ SHF_LINK_ORDER SectionFlag = 0x80 /* Special ordering requirements. */ SHF_OS_NONCONFORMING SectionFlag = 0x100 /* OS-specific processing required. */ SHF_GROUP SectionFlag = 0x200 /* Member of section group. */ SHF_TLS SectionFlag = 0x400 /* Section contains TLS data. */ SHF_COMPRESSED SectionFlag = 0x800 /* Section is compressed. */ SHF_MASKOS SectionFlag = 0x0ff00000 /* OS-specific semantics. */ SHF_MASKPROC SectionFlag = 0xf0000000 /* Processor-specific semantics. */ ) ``` ### func (SectionFlag) GoString ``` func (i SectionFlag) GoString() string ``` ### func (SectionFlag) String ``` func (i SectionFlag) String() string ``` type SectionHeader ------------------ A SectionHeader represents a single ELF section header. ``` type SectionHeader struct { Name string Type SectionType Flags SectionFlag Addr uint64 Offset uint64 Size uint64 Link uint32 Info uint32 Addralign uint64 Entsize uint64 // FileSize is the size of this section in the file in bytes. // If a section is compressed, FileSize is the size of the // compressed data, while Size (above) is the size of the // uncompressed data. FileSize uint64 // Go 1.6 } ``` type SectionIndex ----------------- Special section indices. ``` type SectionIndex int ``` ``` const ( SHN_UNDEF SectionIndex = 0 /* Undefined, missing, irrelevant. */ SHN_LORESERVE SectionIndex = 0xff00 /* First of reserved range. */ SHN_LOPROC SectionIndex = 0xff00 /* First processor-specific. */ SHN_HIPROC SectionIndex = 0xff1f /* Last processor-specific. */ SHN_LOOS SectionIndex = 0xff20 /* First operating system-specific. */ SHN_HIOS SectionIndex = 0xff3f /* Last operating system-specific. */ SHN_ABS SectionIndex = 0xfff1 /* Absolute values. 
*/ SHN_COMMON SectionIndex = 0xfff2 /* Common data. */ SHN_XINDEX SectionIndex = 0xffff /* Escape; index stored elsewhere. */ SHN_HIRESERVE SectionIndex = 0xffff /* Last of reserved range. */ ) ``` ### func (SectionIndex) GoString ``` func (i SectionIndex) GoString() string ``` ### func (SectionIndex) String ``` func (i SectionIndex) String() string ``` type SectionType ---------------- Section type. ``` type SectionType uint32 ``` ``` const ( SHT_NULL SectionType = 0 /* inactive */ SHT_PROGBITS SectionType = 1 /* program defined information */ SHT_SYMTAB SectionType = 2 /* symbol table section */ SHT_STRTAB SectionType = 3 /* string table section */ SHT_RELA SectionType = 4 /* relocation section with addends */ SHT_HASH SectionType = 5 /* symbol hash table section */ SHT_DYNAMIC SectionType = 6 /* dynamic section */ SHT_NOTE SectionType = 7 /* note section */ SHT_NOBITS SectionType = 8 /* no space section */ SHT_REL SectionType = 9 /* relocation section - no addends */ SHT_SHLIB SectionType = 10 /* reserved - purpose unknown */ SHT_DYNSYM SectionType = 11 /* dynamic symbol table section */ SHT_INIT_ARRAY SectionType = 14 /* Initialization function pointers. */ SHT_FINI_ARRAY SectionType = 15 /* Termination function pointers. */ SHT_PREINIT_ARRAY SectionType = 16 /* Pre-initialization function ptrs. */ SHT_GROUP SectionType = 17 /* Section group. */ SHT_SYMTAB_SHNDX SectionType = 18 /* Section indexes (see SHN_XINDEX). */ SHT_LOOS SectionType = 0x60000000 /* First of OS specific semantics */ SHT_GNU_ATTRIBUTES SectionType = 0x6ffffff5 /* GNU object attributes */ SHT_GNU_HASH SectionType = 0x6ffffff6 /* GNU hash table */ SHT_GNU_LIBLIST SectionType = 0x6ffffff7 /* GNU prelink library list */ SHT_GNU_VERDEF SectionType = 0x6ffffffd /* GNU version definition section */ SHT_GNU_VERNEED SectionType = 0x6ffffffe /* GNU version needs section */ SHT_GNU_VERSYM SectionType = 0x6fffffff /* GNU version symbol table */ SHT_HIOS SectionType = 0x6fffffff /* Last of OS specific semantics */ SHT_LOPROC SectionType = 0x70000000 /* reserved range for processor */ SHT_MIPS_ABIFLAGS SectionType = 0x7000002a /* .MIPS.abiflags */ SHT_HIPROC SectionType = 0x7fffffff /* specific section header types */ SHT_LOUSER SectionType = 0x80000000 /* reserved range for application */ SHT_HIUSER SectionType = 0xffffffff /* specific indexes */ ) ``` ### func (SectionType) GoString ``` func (i SectionType) GoString() string ``` ### func (SectionType) String ``` func (i SectionType) String() string ``` type Sym32 ---------- ELF32 Symbol. ``` type Sym32 struct { Name uint32 Value uint32 Size uint32 Info uint8 Other uint8 Shndx uint16 } ``` type Sym64 ---------- ELF64 symbol table entries. ``` type Sym64 struct { Name uint32 /* String table index of name. */ Info uint8 /* Type and binding information. */ Other uint8 /* Reserved (not used). */ Shndx uint16 /* Section index of symbol. */ Value uint64 /* Symbol value. */ Size uint64 /* Size of associated object. */ } ``` type SymBind ------------ Symbol Binding - ELFNN\_ST\_BIND - st\_info ``` type SymBind int ``` ``` const ( STB_LOCAL SymBind = 0 /* Local symbol */ STB_GLOBAL SymBind = 1 /* Global symbol */ STB_WEAK SymBind = 2 /* like global - lower precedence */ STB_LOOS SymBind = 10 /* Reserved range for operating system */ STB_HIOS SymBind = 12 /* specific semantics. */ STB_LOPROC SymBind = 13 /* reserved range for processor */ STB_HIPROC SymBind = 15 /* specific semantics. 
*/ ) ``` ### func ST\_BIND ``` func ST_BIND(info uint8) SymBind ``` ### func (SymBind) GoString ``` func (i SymBind) GoString() string ``` ### func (SymBind) String ``` func (i SymBind) String() string ``` type SymType ------------ Symbol type - ELFNN\_ST\_TYPE - st\_info ``` type SymType int ``` ``` const ( STT_NOTYPE SymType = 0 /* Unspecified type. */ STT_OBJECT SymType = 1 /* Data object. */ STT_FUNC SymType = 2 /* Function. */ STT_SECTION SymType = 3 /* Section. */ STT_FILE SymType = 4 /* Source file. */ STT_COMMON SymType = 5 /* Uninitialized common block. */ STT_TLS SymType = 6 /* TLS object. */ STT_LOOS SymType = 10 /* Reserved range for operating system */ STT_HIOS SymType = 12 /* specific semantics. */ STT_LOPROC SymType = 13 /* reserved range for processor */ STT_HIPROC SymType = 15 /* specific semantics. */ ) ``` ### func ST\_TYPE ``` func ST_TYPE(info uint8) SymType ``` ### func (SymType) GoString ``` func (i SymType) GoString() string ``` ### func (SymType) String ``` func (i SymType) String() string ``` type SymVis ----------- Symbol visibility - ELFNN\_ST\_VISIBILITY - st\_other ``` type SymVis int ``` ``` const ( STV_DEFAULT SymVis = 0x0 /* Default visibility (see binding). */ STV_INTERNAL SymVis = 0x1 /* Special meaning in relocatable objects. */ STV_HIDDEN SymVis = 0x2 /* Not visible. */ STV_PROTECTED SymVis = 0x3 /* Visible but not preemptible. */ ) ``` ### func ST\_VISIBILITY ``` func ST_VISIBILITY(other uint8) SymVis ``` ### func (SymVis) GoString ``` func (i SymVis) GoString() string ``` ### func (SymVis) String ``` func (i SymVis) String() string ``` type Symbol ----------- A Symbol represents an entry in an ELF symbol table section. ``` type Symbol struct { Name string Info, Other byte Section SectionIndex Value, Size uint64 // Version and Library are present only for the dynamic symbol // table. Version string // Go 1.13 Library string // Go 1.13 } ``` type Type --------- Type is found in Header.Type. ``` type Type uint16 ``` ``` const ( ET_NONE Type = 0 /* Unknown type. */ ET_REL Type = 1 /* Relocatable. */ ET_EXEC Type = 2 /* Executable. */ ET_DYN Type = 3 /* Shared object. */ ET_CORE Type = 4 /* Core file. */ ET_LOOS Type = 0xfe00 /* First operating system specific. */ ET_HIOS Type = 0xfeff /* Last operating system-specific. */ ET_LOPROC Type = 0xff00 /* First processor-specific. */ ET_HIPROC Type = 0xffff /* Last processor-specific. */ ) ``` ### func (Type) GoString ``` func (i Type) GoString() string ``` ### func (Type) String ``` func (i Type) String() string ``` type Version ------------ Version is found in Header.Ident[EI\_VERSION] and Header.Version. ``` type Version byte ``` ``` const ( EV_NONE Version = 0 EV_CURRENT Version = 1 ) ``` ### func (Version) GoString ``` func (i Version) GoString() string ``` ### func (Version) String ``` func (i Version) String() string ```
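To see how Symbol, SymBind, SymType, and SymVis fit together in practice, here is a minimal sketch (assuming a hypothetical ELF binary at `/bin/ls`) that lists the symbol table and decodes each entry's Info and Other bytes with ST_BIND, ST_TYPE, and ST_VISIBILITY.

```
package main

import (
	"debug/elf"
	"fmt"
	"log"
)

func main() {
	// Hypothetical input binary; substitute any ELF file of interest.
	f, err := elf.Open("/bin/ls")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Symbols returns the entries of the symbol table section
	// (it returns ErrNoSymbols if the file has been stripped).
	syms, err := f.Symbols()
	if err != nil {
		log.Fatal(err)
	}
	for _, s := range syms {
		// Info packs binding and type; Other carries visibility.
		bind := elf.ST_BIND(s.Info)
		typ := elf.ST_TYPE(s.Info)
		vis := elf.ST_VISIBILITY(s.Other)
		fmt.Printf("%-30s %-8s %-12s %s\n", s.Name, bind, typ, vis)
	}
}
```

DynamicSymbols can be used the same way for the dynamic symbol table, in which case the Version and Library fields of Symbol are also populated.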
spring_boot Documentation Overview Documentation Overview ====================== This section provides a brief overview of Spring Boot reference documentation. It serves as a map for the rest of the document. The latest copy of this document is available at [docs.spring.io/spring-boot/docs/current/reference/](https://docs.spring.io/spring-boot/docs/current/reference/). 1. First Steps --------------- If you are getting started with Spring Boot or 'Spring' in general, start with [the following topics](getting-started#getting-started): * **From scratch:** [Overview](getting-started#getting-started.introducing-spring-boot) | [Requirements](getting-started#getting-started.system-requirements) | [Installation](getting-started#getting-started.installing) * **Tutorial:** [Part 1](getting-started#getting-started.first-application) | [Part 2](getting-started#getting-started.first-application.code) * **Running your example:** [Part 1](getting-started#getting-started.first-application.run) | [Part 2](getting-started#getting-started.first-application.executable-jar) 2. Upgrading From an Earlier Version ------------------------------------- You should always ensure that you are running a [supported version](https://github.com/spring-projects/spring-boot/wiki/Supported-Versions) of Spring Boot. Depending on the version that you are upgrading to, you can find some additional tips here: * **From 1.x:** [Upgrading from 1.x](upgrading#upgrading.from-1x) * **To a new feature release:** [Upgrading to New Feature Release](upgrading#upgrading.to-feature) * **Spring Boot CLI:** [Upgrading the Spring Boot CLI](upgrading#upgrading.cli) 3. Developing with Spring Boot ------------------------------- Ready to actually start using Spring Boot? [We have you covered](using#using): * **Build systems:** [Maven](using#using.build-systems.maven) | [Gradle](using#using.build-systems.gradle) | [Ant](using#using.build-systems.ant) | [Starters](using#using.build-systems.starters) * **Best practices:** [Code Structure](using#using.structuring-your-code) | [@Configuration](using#using.configuration-classes) | [@EnableAutoConfiguration](using#using.auto-configuration) | [Beans and Dependency Injection](using#using.spring-beans-and-dependency-injection) * **Running your code:** [IDE](using#using.running-your-application.from-an-ide) | [Packaged](using#using.running-your-application.as-a-packaged-application) | [Maven](using#using.running-your-application.with-the-maven-plugin) | [Gradle](using#using.running-your-application.with-the-gradle-plugin) * **Packaging your app:** [Production jars](using#using.packaging-for-production) * **Spring Boot CLI:** [Using the CLI](cli#cli) 4. Learning About Spring Boot Features --------------------------------------- Need more details about Spring Boot’s core features? [The following content is for you](features#features): * **Spring Application:** [SpringApplication](features#features.spring-application) * **External Configuration:** [External Configuration](features#features.external-config) * **Profiles:** [Profiles](features#features.profiles) * **Logging:** [Logging](features#features.logging) 5. 
Web ------- If you develop Spring Boot web applications, take a look at the following content: * **Servlet Web Applications:** [Spring MVC, Jersey, Embedded Servlet Containers](web#web.servlet) * **Reactive Web Applications:** [Spring Webflux, Embedded Servlet Containers](web#web.reactive) * **Graceful Shutdown:** [Graceful Shutdown](web#web.graceful-shutdown) * **Spring Security:** [Default Security Configuration, Auto-configuration for OAuth2, SAML](web#web.security) * **Spring Session:** [Auto-configuration for Spring Session](web#web.spring-session) * **Spring HATEOAS:** [Auto-configuration for Spring HATEOAS](web#web.spring-hateoas) 6. Data -------- If your application deals with a datastore, you can see how to configure that here: * **SQL:** [Configuring a SQL Datastore, Embedded Database support, Connection pools, and more.](data#data.sql) * **NOSQL:** [Auto-configuration for NOSQL stores such as Redis, MongoDB, Neo4j, and others.](data#data.nosql) 7. Messaging ------------- If your application uses any messaging protocol, see one or more of the following sections: * **JMS:** [Auto-configuration for ActiveMQ and Artemis, Sending and Receiving messages through JMS](messaging#messaging.jms) * **AMQP:** [Auto-configuration for RabbitMQ](messaging#messaging.amqp) * **Kafka:** [Auto-configuration for Spring Kafka](messaging#messaging.kafka) * **RSocket:** [Auto-configuration for Spring Framework’s RSocket Support](messaging#messaging.rsocket) * **Spring Integration:** [Auto-configuration for Spring Integration](messaging#messaging.spring-integration) 8. IO ------ If your application needs IO capabilities, see one or more of the following sections: * **Caching:** [Caching support EhCache, Hazelcast, Infinispan and more](io#io.caching) * **Quartz:** [Quartz Scheduling](io#io.quartz) * **Mail:** [Sending Email](io#io.email) * **Validation:** [JSR-303 Validation](io#io.validation) * **REST Clients:** [Calling REST Services with RestTemplate and WebClient](io#io.rest-client) * **Webservices:** [Auto-configuration for Spring Web Services](io#io.webservices) * **JTA:** [Distributed Transactions with JTA](io#io.jta) 9. Container Images -------------------- Spring Boot provides first-class support for building efficient container images. You can read more about it here: * **Efficient Container Images:** [Tips to optimize container images such as Docker images](container-images#container-images.efficient-images) * **Dockerfiles:** [Building container images using dockerfiles](container-images#container-images.dockerfiles) * **Cloud Native Buildpacks:** [Support for Cloud Native Buildpacks with Maven and Gradle](container-images#container-images.buildpacks) 10. 
Advanced Topics -------------------- Finally, we have a few topics for more advanced users: * **Spring Boot Applications Deployment:** [Cloud Deployment](deployment#deployment.cloud) | [OS Service](deployment#deployment.installing.nix-services) * **Build tool plugins:** [Maven](build-tool-plugins#build-tool-plugins.maven) | [Gradle](build-tool-plugins#build-tool-plugins.gradle) * **Appendix:** [Application Properties](application-properties#appendix.application-properties) | [Configuration Metadata](configuration-metadata#appendix.configuration-metadata) | [Auto-configuration Classes](auto-configuration-classes#appendix.auto-configuration-classes) | [Test Auto-configuration Annotations](test-auto-configuration#appendix.test-auto-configuration) | [Executable Jars](executable-jar#appendix.executable-jar) | [Dependency Versions](dependency-versions#appendix.dependency-versions) spring_boot Spring Boot Reference Documentation Spring Boot Reference Documentation =================================== This document is also available as [a single HTML page](https://docs.spring.io/spring-boot/docs/2.7.0/reference/htmlsingle/) and as [a PDF](https://docs.spring.io/spring-boot/docs/2.7.0/reference/pdf/spring-boot-reference.pdf). The reference documentation consists of the following sections: | | | | --- | --- | | [Legal](https://docs.spring.io/spring-boot/docs/2.7.0/reference/html/legal.html#legal) | Legal information. | | [Getting Help](getting-help#getting-help) | Resources for getting help. | | [Documentation Overview](documentation#documentation) | About the Documentation, First Steps, and more. | | [Getting Started](getting-started#getting-started) | Introducing Spring Boot, System Requirements, Servlet Containers, Installing Spring Boot, and Developing Your First Spring Boot Application | | [Upgrading Spring Boot Applications](upgrading#upgrading) | Upgrading from 1.x, Upgrading to a new feature release, and Upgrading the Spring Boot CLI. | | [Using Spring Boot](using#using) | Build Systems, Structuring Your Code, Configuration, Spring Beans and Dependency Injection, DevTools, and more. | | [Core Features](features#features) | Profiles, Logging, Security, Caching, Spring Integration, Testing, and more. | | [Web](web#web) | Servlet Web, Reactive Web, GraphQL, Embedded Container Support, Graceful Shutdown, and more. | | [Data](data#data) | SQL and NOSQL data access. | | [IO](io#io) | Caching, Quartz Scheduler, REST clients, Sending email, Spring Web Services, and more. | | [Messaging](messaging#messaging) | JMS, AMQP, Apache Kafka, RSocket, WebSocket, and Spring Integration. | | [Container Images](container-images#container-images) | Efficient container images and Building container images with Dockerfiles and Cloud Native Buildpacks. | | [Production-ready Features](actuator#actuator) | Monitoring, Metrics, Auditing, and more. | | [Deploying Spring Boot Applications](deployment#deployment) | Deploying to the Cloud, and Installing as a Unix application. | | [Spring Boot CLI](cli#cli) | Installing the CLI, Using the CLI, Configuring the CLI, and more. | | [Build Tool Plugins](build-tool-plugins#build-tool-plugins) | Maven Plugin, Gradle Plugin, Antlib, and more. | | [“How-to” Guides](howto#howto) | Application Development, Configuration, Embedded Servers, Data Access, and many more. 
| The reference documentation has the following appendices: | | | | --- | --- | | [Application Properties](application-properties#appendix.application-properties) | Common application properties that you can use to configure your application. | | [Configuration Metadata](configuration-metadata#appendix.configuration-metadata) | Metadata that you can use to describe configuration properties. | | [Auto-configuration Classes](auto-configuration-classes#appendix.auto-configuration-classes) | Auto-configuration classes provided by Spring Boot. | | [Test Auto-configuration Annotations](test-auto-configuration#appendix.test-auto-configuration) | Test auto-configuration annotations that you can use to test slices of your application. | | [Executable Jars](executable-jar#appendix.executable-jar) | Spring Boot’s executable jars, their launchers, and their format. | | [Dependency Versions](dependency-versions#appendix.dependency-versions) | Details of the dependencies that are managed by Spring Boot. | spring_boot Common Application Properties Common Application Properties ============================= Various properties can be specified inside your `application.properties` file, inside your `application.yml` file, or as command line switches. This appendix provides a list of common Spring Boot properties and references to the underlying classes that consume them. | | | | --- | --- | | | Spring Boot provides various conversion mechanism with advanced value formatting, make sure to review [the properties conversion section](features#features.external-config.typesafe-configuration-properties.conversion). | | | | | --- | --- | | | Property contributions can come from additional jar files on your classpath, so you should not consider this an exhaustive list. Also, you can define your own properties. | 1. Core Properties ------------------- | Name | Description | Default Value | | --- | --- | --- | | [`debug`](#application-properties.core.debug) | Enable debug logs. | `false` | | [`info.*`](#application-properties.core.info) | Arbitrary properties to add to the info endpoint. | | | [`logging.charset.console`](#application-properties.core.logging.charset.console) | Charset to use for console output. | | | [`logging.charset.file`](#application-properties.core.logging.charset.file) | Charset to use for file output. | | | [`logging.config`](#application-properties.core.logging.config) | Location of the logging configuration file. For instance, `classpath:logback.xml` for Logback. | | | [`logging.exception-conversion-word`](#application-properties.core.logging.exception-conversion-word) | Conversion word used when logging exceptions. | `%wEx` | | [`logging.file.name`](#application-properties.core.logging.file.name) | Log file name (for instance, `myapp.log`). Names can be an exact location or relative to the current directory. | | | [`logging.file.path`](#application-properties.core.logging.file.path) | Location of the log file. For instance, `/var/log`. | | | [`logging.group.*`](#application-properties.core.logging.group) | Log groups to quickly change multiple loggers at the same time. For instance, `logging.group.db=org.hibernate,org.springframework.jdbc`. | | | [`logging.level.*`](#application-properties.core.logging.level) | Log levels severity mapping. For instance, `logging.level.org.springframework=DEBUG`. | | | [`logging.log4j2.config.override`](#application-properties.core.logging.log4j2.config.override) | Overriding configuration files used to create a composite configuration. 
| | | [`logging.logback.rollingpolicy.clean-history-on-start`](#application-properties.core.logging.logback.rollingpolicy.clean-history-on-start) | Whether to clean the archive log files on startup. | `false` | | [`logging.logback.rollingpolicy.file-name-pattern`](#application-properties.core.logging.logback.rollingpolicy.file-name-pattern) | Pattern for rolled-over log file names. | `${LOG_FILE}.%d{yyyy-MM-dd}.%i.gz` | | [`logging.logback.rollingpolicy.max-file-size`](#application-properties.core.logging.logback.rollingpolicy.max-file-size) | Maximum log file size. | `10MB` | | [`logging.logback.rollingpolicy.max-history`](#application-properties.core.logging.logback.rollingpolicy.max-history) | Maximum number of archive log files to keep. | `7` | | [`logging.logback.rollingpolicy.total-size-cap`](#application-properties.core.logging.logback.rollingpolicy.total-size-cap) | Total size of log backups to be kept. | `0B` | | [`logging.pattern.console`](#application-properties.core.logging.pattern.console) | Appender pattern for output to the console. Supported only with the default Logback setup. | `%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}` | | [`logging.pattern.dateformat`](#application-properties.core.logging.pattern.dateformat) | Appender pattern for log date format. Supported only with the default Logback setup. | `yyyy-MM-dd HH:mm:ss.SSS` | | [`logging.pattern.file`](#application-properties.core.logging.pattern.file) | Appender pattern for output to a file. Supported only with the default Logback setup. | `%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}} ${LOG_LEVEL_PATTERN:-%5p} ${PID:- } --- [%t] %-40.40logger{39} : %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}` | | [`logging.pattern.level`](#application-properties.core.logging.pattern.level) | Appender pattern for log level. Supported only with the default Logback setup. | `%5p` | | [`logging.register-shutdown-hook`](#application-properties.core.logging.register-shutdown-hook) | Register a shutdown hook for the logging system when it is initialized. Disabled automatically when deployed as a war file. | `true` | | [`spring.aop.auto`](#application-properties.core.spring.aop.auto) | Add @EnableAspectJAutoProxy. | `true` | | [`spring.aop.proxy-target-class`](#application-properties.core.spring.aop.proxy-target-class) | Whether subclass-based (CGLIB) proxies are to be created (true), as opposed to standard Java interface-based proxies (false). | `true` | | [`spring.application.admin.enabled`](#application-properties.core.spring.application.admin.enabled) | Whether to enable admin features for the application. | `false` | | [`spring.application.admin.jmx-name`](#application-properties.core.spring.application.admin.jmx-name) | JMX name of the application admin MBean. | `org.springframework.boot:type=Admin,name=SpringApplication` | | [`spring.application.name`](#application-properties.core.spring.application.name) | Application name. | | | [`spring.autoconfigure.exclude`](#application-properties.core.spring.autoconfigure.exclude) | Auto-configuration classes to exclude. | | | [`spring.banner.charset`](#application-properties.core.spring.banner.charset) | Banner file encoding. | `UTF-8` | | [`spring.banner.image.bitdepth`](#application-properties.core.spring.banner.image.bitdepth) | Bit depth to use for ANSI colors. 
Supported values are 4 (16 color) or 8 (256 color). | `4` | | [`spring.banner.image.height`](#application-properties.core.spring.banner.image.height) | Height of the banner image in chars (default based on image height). | | | [`spring.banner.image.invert`](#application-properties.core.spring.banner.image.invert) | Whether images should be inverted for dark terminal themes. | `false` | | [`spring.banner.image.location`](#application-properties.core.spring.banner.image.location) | Banner image file location (jpg or png can also be used). | `classpath:banner.gif` | | [`spring.banner.image.margin`](#application-properties.core.spring.banner.image.margin) | Left hand image margin in chars. | `2` | | [`spring.banner.image.pixelmode`](#application-properties.core.spring.banner.image.pixelmode) | Pixel mode to use when rendering the image. | `TEXT` | | [`spring.banner.image.width`](#application-properties.core.spring.banner.image.width) | Width of the banner image in chars. | `76` | | [`spring.banner.location`](#application-properties.core.spring.banner.location) | Banner text resource location. | `classpath:banner.txt` | | [`spring.beaninfo.ignore`](#application-properties.core.spring.beaninfo.ignore) | Whether to skip search of BeanInfo classes. | `true` | | [`spring.codec.log-request-details`](#application-properties.core.spring.codec.log-request-details) | Whether to log form data at DEBUG level, and headers at TRACE level. | `false` | | [`spring.codec.max-in-memory-size`](#application-properties.core.spring.codec.max-in-memory-size) | Limit on the number of bytes that can be buffered whenever the input stream needs to be aggregated. This applies only to the auto-configured WebFlux server and WebClient instances. By default this is not set, in which case individual codec defaults apply. Most codecs are limited to 256K by default. | | | [`spring.config.activate.on-cloud-platform`](#application-properties.core.spring.config.activate.on-cloud-platform) | Required cloud platform for the document to be included. | | | [`spring.config.activate.on-profile`](#application-properties.core.spring.config.activate.on-profile) | Profile expressions that should match for the document to be included. | | | [`spring.config.additional-location`](#application-properties.core.spring.config.additional-location) | Config file locations used in addition to the defaults. | | | [`spring.config.import`](#application-properties.core.spring.config.import) | Import additional config data. | | | [`spring.config.location`](#application-properties.core.spring.config.location) | Config file locations that replace the defaults. | | | [`spring.config.name`](#application-properties.core.spring.config.name) | Config file name. | `application` | | [`spring.config.use-legacy-processing`](#application-properties.core.spring.config.use-legacy-processing) | Whether to enable configuration data processing legacy mode. | `false` | | [`spring.info.build.encoding`](#application-properties.core.spring.info.build.encoding) | File encoding. | `UTF-8` | | [`spring.info.build.location`](#application-properties.core.spring.info.build.location) | Location of the generated build-info.properties file. | `classpath:META-INF/build-info.properties` | | [`spring.info.git.encoding`](#application-properties.core.spring.info.git.encoding) | File encoding. | `UTF-8` | | [`spring.info.git.location`](#application-properties.core.spring.info.git.location) | Location of the generated git.properties file. 
| `classpath:git.properties` | | [`spring.jmx.default-domain`](#application-properties.core.spring.jmx.default-domain) | JMX domain name. | | | [`spring.jmx.enabled`](#application-properties.core.spring.jmx.enabled) | Expose management beans to the JMX domain. | `false` | | [`spring.jmx.server`](#application-properties.core.spring.jmx.server) | MBeanServer bean name. | `mbeanServer` | | [`spring.jmx.unique-names`](#application-properties.core.spring.jmx.unique-names) | Whether unique runtime object names should be ensured. | `false` | | [`spring.lifecycle.timeout-per-shutdown-phase`](#application-properties.core.spring.lifecycle.timeout-per-shutdown-phase) | Timeout for the shutdown of any phase (group of SmartLifecycle beans with the same 'phase' value). | `30s` | | [`spring.main.allow-bean-definition-overriding`](#application-properties.core.spring.main.allow-bean-definition-overriding) | Whether bean definition overriding, by registering a definition with the same name as an existing definition, is allowed. | `false` | | [`spring.main.allow-circular-references`](#application-properties.core.spring.main.allow-circular-references) | Whether to allow circular references between beans and automatically try to resolve them. | `false` | | [`spring.main.banner-mode`](#application-properties.core.spring.main.banner-mode) | Mode used to display the banner when the application runs. | `console` | | [`spring.main.cloud-platform`](#application-properties.core.spring.main.cloud-platform) | Override the Cloud Platform auto-detection. | | | [`spring.main.lazy-initialization`](#application-properties.core.spring.main.lazy-initialization) | Whether initialization should be performed lazily. | `false` | | [`spring.main.log-startup-info`](#application-properties.core.spring.main.log-startup-info) | Whether to log information about the application when it starts. | `true` | | [`spring.main.register-shutdown-hook`](#application-properties.core.spring.main.register-shutdown-hook) | Whether the application should have a shutdown hook registered. | `true` | | [`spring.main.sources`](#application-properties.core.spring.main.sources) | Sources (class names, package names, or XML resource locations) to include in the ApplicationContext. | | | [`spring.main.web-application-type`](#application-properties.core.spring.main.web-application-type) | Flag to explicitly request a specific type of web application. If not set, auto-detected based on the classpath. | | | [`spring.mandatory-file-encoding`](#application-properties.core.spring.mandatory-file-encoding) | Expected character encoding the application must use. | | | [`spring.messages.always-use-message-format`](#application-properties.core.spring.messages.always-use-message-format) | Whether to always apply the MessageFormat rules, parsing even messages without arguments. | `false` | | [`spring.messages.basename`](#application-properties.core.spring.messages.basename) | Comma-separated list of basenames (essentially a fully-qualified classpath location), each following the ResourceBundle convention with relaxed support for slash based locations. If it doesn't contain a package qualifier (such as "org.mypackage"), it will be resolved from the classpath root. | `messages` | | [`spring.messages.cache-duration`](#application-properties.core.spring.messages.cache-duration) | Loaded resource bundle files cache duration. When not set, bundles are cached forever. If a duration suffix is not specified, seconds will be used. 
| | | [`spring.messages.encoding`](#application-properties.core.spring.messages.encoding) | Message bundles encoding. | `UTF-8` | | [`spring.messages.fallback-to-system-locale`](#application-properties.core.spring.messages.fallback-to-system-locale) | Whether to fall back to the system Locale if no files for a specific Locale have been found. if this is turned off, the only fallback will be the default file (e.g. "messages.properties" for basename "messages"). | `true` | | [`spring.messages.use-code-as-default-message`](#application-properties.core.spring.messages.use-code-as-default-message) | Whether to use the message code as the default message instead of throwing a "NoSuchMessageException". Recommended during development only. | `false` | | [`spring.output.ansi.enabled`](#application-properties.core.spring.output.ansi.enabled) | Configures the ANSI output. | `detect` | | [`spring.pid.fail-on-write-error`](#application-properties.core.spring.pid.fail-on-write-error) | Fails if ApplicationPidFileWriter is used but it cannot write the PID file. | | | [`spring.pid.file`](#application-properties.core.spring.pid.file) | Location of the PID file to write (if ApplicationPidFileWriter is used). | | | [`spring.profiles.active`](#application-properties.core.spring.profiles.active) | Comma-separated list of active profiles. Can be overridden by a command line switch. | | | [`spring.profiles.default`](#application-properties.core.spring.profiles.default) | Name of the profile to enable if no profile is active. | `default` | | [`spring.profiles.group.*`](#application-properties.core.spring.profiles.group) | Profile groups to define a logical name for a related group of profiles. | | | [`spring.profiles.include`](#application-properties.core.spring.profiles.include) | Unconditionally activate the specified comma-separated list of profiles (or list of profiles if using YAML). | | | [`spring.quartz.auto-startup`](#application-properties.core.spring.quartz.auto-startup) | Whether to automatically start the scheduler after initialization. | `true` | | [`spring.quartz.jdbc.comment-prefix`](#application-properties.core.spring.quartz.jdbc.comment-prefix) | Prefixes for single-line comments in SQL initialization scripts. | `[#, --]` | | [`spring.quartz.jdbc.initialize-schema`](#application-properties.core.spring.quartz.jdbc.initialize-schema) | Database schema initialization mode. | `embedded` | | [`spring.quartz.jdbc.platform`](#application-properties.core.spring.quartz.jdbc.platform) | Platform to use in initialization scripts if the @@platform@@ placeholder is used. Auto-detected by default. | | | [`spring.quartz.jdbc.schema`](#application-properties.core.spring.quartz.jdbc.schema) | Path to the SQL file to use to initialize the database schema. | `classpath:org/quartz/impl/jdbcjobstore/tables_@@platform@@.sql` | | [`spring.quartz.job-store-type`](#application-properties.core.spring.quartz.job-store-type) | Quartz job store type. | `memory` | | [`spring.quartz.overwrite-existing-jobs`](#application-properties.core.spring.quartz.overwrite-existing-jobs) | Whether configured jobs should overwrite existing job definitions. | `false` | | [`spring.quartz.properties.*`](#application-properties.core.spring.quartz.properties) | Additional Quartz Scheduler properties. | | | [`spring.quartz.scheduler-name`](#application-properties.core.spring.quartz.scheduler-name) | Name of the scheduler. 
| `quartzScheduler` | | [`spring.quartz.startup-delay`](#application-properties.core.spring.quartz.startup-delay) | Delay after which the scheduler is started once initialization completes. Setting this property makes sense if no jobs should be run before the entire application has started up. | `0s` | | [`spring.quartz.wait-for-jobs-to-complete-on-shutdown`](#application-properties.core.spring.quartz.wait-for-jobs-to-complete-on-shutdown) | Whether to wait for running jobs to complete on shutdown. | `false` | | [`spring.reactor.debug-agent.enabled`](#application-properties.core.spring.reactor.debug-agent.enabled) | Whether the Reactor Debug Agent should be enabled when reactor-tools is present. | `true` | | [`spring.task.execution.pool.allow-core-thread-timeout`](#application-properties.core.spring.task.execution.pool.allow-core-thread-timeout) | Whether core threads are allowed to time out. This enables dynamic growing and shrinking of the pool. | `true` | | [`spring.task.execution.pool.core-size`](#application-properties.core.spring.task.execution.pool.core-size) | Core number of threads. | `8` | | [`spring.task.execution.pool.keep-alive`](#application-properties.core.spring.task.execution.pool.keep-alive) | Time limit for which threads may remain idle before being terminated. | `60s` | | [`spring.task.execution.pool.max-size`](#application-properties.core.spring.task.execution.pool.max-size) | Maximum allowed number of threads. If tasks are filling up the queue, the pool can expand up to that size to accommodate the load. Ignored if the queue is unbounded. | | | [`spring.task.execution.pool.queue-capacity`](#application-properties.core.spring.task.execution.pool.queue-capacity) | Queue capacity. An unbounded capacity does not increase the pool and therefore ignores the "max-size" property. | | | [`spring.task.execution.shutdown.await-termination`](#application-properties.core.spring.task.execution.shutdown.await-termination) | Whether the executor should wait for scheduled tasks to complete on shutdown. | `false` | | [`spring.task.execution.shutdown.await-termination-period`](#application-properties.core.spring.task.execution.shutdown.await-termination-period) | Maximum time the executor should wait for remaining tasks to complete. | | | [`spring.task.execution.thread-name-prefix`](#application-properties.core.spring.task.execution.thread-name-prefix) | Prefix to use for the names of newly created threads. | `task-` | | [`spring.task.scheduling.pool.size`](#application-properties.core.spring.task.scheduling.pool.size) | Maximum allowed number of threads. | `1` | | [`spring.task.scheduling.shutdown.await-termination`](#application-properties.core.spring.task.scheduling.shutdown.await-termination) | Whether the executor should wait for scheduled tasks to complete on shutdown. | `false` | | [`spring.task.scheduling.shutdown.await-termination-period`](#application-properties.core.spring.task.scheduling.shutdown.await-termination-period) | Maximum time the executor should wait for remaining tasks to complete. | | | [`spring.task.scheduling.thread-name-prefix`](#application-properties.core.spring.task.scheduling.thread-name-prefix) | Prefix to use for the names of newly created threads. | `scheduling-` | | [`trace`](#application-properties.core.trace) | Enable trace logs. | `false` |
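The core settings listed above are typically supplied through `application.properties` (or its YAML equivalent). The following is a minimal, illustrative sketch: the property keys and quoted defaults come from the table above, while the profile name and the pool sizes are made-up values chosen only for the example.

```properties
# Illustrative values; keys are taken from the Core Properties table above.
# Activate a hypothetical "prod" profile and keep the default console banner.
spring.profiles.active=prod
spring.main.banner-mode=console

# Bound the default task executor (core-size defaults to 8, prefix to "task-").
spring.task.execution.pool.core-size=8
spring.task.execution.pool.max-size=16
spring.task.execution.pool.queue-capacity=100
spring.task.execution.thread-name-prefix=task-

# Allow each SmartLifecycle phase up to 30 seconds on shutdown (the default).
spring.lifecycle.timeout-per-shutdown-phase=30s
```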
2. Cache Properties -------------------- | Name | Description | Default Value | | --- | --- | --- | | [`spring.cache.cache-names`](#application-properties.cache.spring.cache.cache-names) | Comma-separated list of cache names to create if supported by the underlying cache manager. Usually, this disables the ability to create additional caches on-the-fly. | | | [`spring.cache.caffeine.spec`](#application-properties.cache.spring.cache.caffeine.spec) | The spec to use to create caches. See CaffeineSpec for more details on the spec format. | | | [`spring.cache.couchbase.expiration`](#application-properties.cache.spring.cache.couchbase.expiration) | Entry expiration. By default the entries never expire. Note that this value is ultimately converted to seconds. | | | [`spring.cache.ehcache.config`](#application-properties.cache.spring.cache.ehcache.config) | The location of the configuration file to use to initialize EhCache. | | | [`spring.cache.infinispan.config`](#application-properties.cache.spring.cache.infinispan.config) | The location of the configuration file to use to initialize Infinispan. | | | [`spring.cache.jcache.config`](#application-properties.cache.spring.cache.jcache.config) | The location of the configuration file to use to initialize the cache manager. The configuration file is dependent on the underlying cache implementation. | | | [`spring.cache.jcache.provider`](#application-properties.cache.spring.cache.jcache.provider) | Fully qualified name of the CachingProvider implementation to use to retrieve the JSR-107 compliant cache manager. Needed only if more than one JSR-107 implementation is available on the classpath. | | | [`spring.cache.redis.cache-null-values`](#application-properties.cache.spring.cache.redis.cache-null-values) | Allow caching null values. | `true` | | [`spring.cache.redis.enable-statistics`](#application-properties.cache.spring.cache.redis.enable-statistics) | Whether to enable cache statistics. | `false` | | [`spring.cache.redis.key-prefix`](#application-properties.cache.spring.cache.redis.key-prefix) | Key prefix. | | | [`spring.cache.redis.time-to-live`](#application-properties.cache.spring.cache.redis.time-to-live) | Entry expiration. By default the entries never expire. | | | [`spring.cache.redis.use-key-prefix`](#application-properties.cache.spring.cache.redis.use-key-prefix) | Whether to use the key prefix when writing to Redis. | `true` | | [`spring.cache.type`](#application-properties.cache.spring.cache.type) | Cache type. By default, auto-detected according to the environment. | | 3. Mail Properties ------------------- | Name | Description | Default Value | | --- | --- | --- | | [`spring.mail.default-encoding`](#application-properties.mail.spring.mail.default-encoding) | Default MimeMessage encoding. | `UTF-8` | | [`spring.mail.host`](#application-properties.mail.spring.mail.host) | SMTP server host. For instance, 'smtp.example.com'. | | | [`spring.mail.jndi-name`](#application-properties.mail.spring.mail.jndi-name) | Session JNDI name. When set, takes precedence over other Session settings. | | | [`spring.mail.password`](#application-properties.mail.spring.mail.password) | Login password of the SMTP server. | | | [`spring.mail.port`](#application-properties.mail.spring.mail.port) | SMTP server port. | | | [`spring.mail.properties.*`](#application-properties.mail.spring.mail.properties) | Additional JavaMail Session properties. | | | [`spring.mail.protocol`](#application-properties.mail.spring.mail.protocol) | Protocol used by the SMTP server.
| `smtp` | | [`spring.mail.test-connection`](#application-properties.mail.spring.mail.test-connection) | Whether to test that the mail server is available on startup. | `false` | | [`spring.mail.username`](#application-properties.mail.spring.mail.username) | Login user of the SMTP server. | | | [`spring.sendgrid.api-key`](#application-properties.mail.spring.sendgrid.api-key) | SendGrid API key. | | | [`spring.sendgrid.proxy.host`](#application-properties.mail.spring.sendgrid.proxy.host) | SendGrid proxy host. | | | [`spring.sendgrid.proxy.port`](#application-properties.mail.spring.sendgrid.proxy.port) | SendGrid proxy port. | | 4. JSON Properties ------------------- | Name | Description | Default Value | | --- | --- | --- | | [`spring.gson.date-format`](#application-properties.json.spring.gson.date-format) | Format to use when serializing Date objects. | | | [`spring.gson.disable-html-escaping`](#application-properties.json.spring.gson.disable-html-escaping) | Whether to disable the escaping of HTML characters such as '<', '>', etc. | | | [`spring.gson.disable-inner-class-serialization`](#application-properties.json.spring.gson.disable-inner-class-serialization) | Whether to exclude inner classes during serialization. | | | [`spring.gson.enable-complex-map-key-serialization`](#application-properties.json.spring.gson.enable-complex-map-key-serialization) | Whether to enable serialization of complex map keys (i.e. non-primitives). | | | [`spring.gson.exclude-fields-without-expose-annotation`](#application-properties.json.spring.gson.exclude-fields-without-expose-annotation) | Whether to exclude all fields from consideration for serialization or deserialization that do not have the "Expose" annotation. | | | [`spring.gson.field-naming-policy`](#application-properties.json.spring.gson.field-naming-policy) | Naming policy that should be applied to an object's field during serialization and deserialization. | | | [`spring.gson.generate-non-executable-json`](#application-properties.json.spring.gson.generate-non-executable-json) | Whether to generate non-executable JSON by prefixing the output with some special text. | | | [`spring.gson.lenient`](#application-properties.json.spring.gson.lenient) | Whether to be lenient about parsing JSON that doesn't conform to RFC 4627. | | | [`spring.gson.long-serialization-policy`](#application-properties.json.spring.gson.long-serialization-policy) | Serialization policy for Long and long types. | | | [`spring.gson.pretty-printing`](#application-properties.json.spring.gson.pretty-printing) | Whether to output serialized JSON that fits in a page for pretty printing. | | | [`spring.gson.serialize-nulls`](#application-properties.json.spring.gson.serialize-nulls) | Whether to serialize null fields. | | | [`spring.jackson.constructor-detector`](#application-properties.json.spring.jackson.constructor-detector) | Strategy to use to auto-detect constructor, and in particular behavior with single-argument constructors. | `default` | | [`spring.jackson.date-format`](#application-properties.json.spring.jackson.date-format) | Date format string or a fully-qualified date format class name. For instance, 'yyyy-MM-dd HH:mm:ss'. | | | [`spring.jackson.default-leniency`](#application-properties.json.spring.jackson.default-leniency) | Global default setting (if any) for leniency. | | | [`spring.jackson.default-property-inclusion`](#application-properties.json.spring.jackson.default-property-inclusion) | Controls the inclusion of properties during serialization. 
Configured with one of the values in Jackson's JsonInclude.Include enumeration. | | | [`spring.jackson.deserialization.*`](#application-properties.json.spring.jackson.deserialization) | Jackson on/off features that affect the way Java objects are deserialized. | | | [`spring.jackson.generator.*`](#application-properties.json.spring.jackson.generator) | Jackson on/off features for generators. | | | [`spring.jackson.locale`](#application-properties.json.spring.jackson.locale) | Locale used for formatting. | | | [`spring.jackson.mapper.*`](#application-properties.json.spring.jackson.mapper) | Jackson general purpose on/off features. | | | [`spring.jackson.parser.*`](#application-properties.json.spring.jackson.parser) | Jackson on/off features for parsers. | | | [`spring.jackson.property-naming-strategy`](#application-properties.json.spring.jackson.property-naming-strategy) | One of the constants on Jackson's PropertyNamingStrategies. Can also be a fully-qualified class name of a PropertyNamingStrategy implementation. | | | [`spring.jackson.serialization.*`](#application-properties.json.spring.jackson.serialization) | Jackson on/off features that affect the way Java objects are serialized. | | | [`spring.jackson.time-zone`](#application-properties.json.spring.jackson.time-zone) | Time zone used when formatting dates. For instance, "America/Los\_Angeles" or "GMT+10". | | | [`spring.jackson.visibility.*`](#application-properties.json.spring.jackson.visibility) | Jackson visibility thresholds that can be used to limit which methods (and fields) are auto-detected. | | 5. Data Properties ------------------- | Name | Description | Default Value | | --- | --- | --- | | [`spring.couchbase.connection-string`](#application-properties.data.spring.couchbase.connection-string) | Connection string used to locate the Couchbase cluster. | | | [`spring.couchbase.env.io.idle-http-connection-timeout`](#application-properties.data.spring.couchbase.env.io.idle-http-connection-timeout) | Length of time an HTTP connection may remain idle before it is closed and removed from the pool. | `4500ms` | | [`spring.couchbase.env.io.max-endpoints`](#application-properties.data.spring.couchbase.env.io.max-endpoints) | Maximum number of sockets per node. | `12` | | [`spring.couchbase.env.io.min-endpoints`](#application-properties.data.spring.couchbase.env.io.min-endpoints) | Minimum number of sockets per node. | `1` | | [`spring.couchbase.env.ssl.enabled`](#application-properties.data.spring.couchbase.env.ssl.enabled) | Whether to enable SSL support. Enabled automatically if a "keyStore" is provided unless specified otherwise. | | | [`spring.couchbase.env.ssl.key-store`](#application-properties.data.spring.couchbase.env.ssl.key-store) | Path to the JVM key store that holds the certificates. | | | [`spring.couchbase.env.ssl.key-store-password`](#application-properties.data.spring.couchbase.env.ssl.key-store-password) | Password used to access the key store. | | | [`spring.couchbase.env.timeouts.analytics`](#application-properties.data.spring.couchbase.env.timeouts.analytics) | Timeout for the analytics service. | `75s` | | [`spring.couchbase.env.timeouts.connect`](#application-properties.data.spring.couchbase.env.timeouts.connect) | Bucket connect timeout. | `10s` | | [`spring.couchbase.env.timeouts.disconnect`](#application-properties.data.spring.couchbase.env.timeouts.disconnect) | Bucket disconnect timeout. 
| `10s` | | [`spring.couchbase.env.timeouts.key-value`](#application-properties.data.spring.couchbase.env.timeouts.key-value) | Timeout for operations on a specific key-value. | `2500ms` | | [`spring.couchbase.env.timeouts.key-value-durable`](#application-properties.data.spring.couchbase.env.timeouts.key-value-durable) | Timeout for operations on a specific key-value with a durability level. | `10s` | | [`spring.couchbase.env.timeouts.management`](#application-properties.data.spring.couchbase.env.timeouts.management) | Timeout for the management operations. | `75s` | | [`spring.couchbase.env.timeouts.query`](#application-properties.data.spring.couchbase.env.timeouts.query) | N1QL query operations timeout. | `75s` | | [`spring.couchbase.env.timeouts.search`](#application-properties.data.spring.couchbase.env.timeouts.search) | Timeout for the search service. | `75s` | | [`spring.couchbase.env.timeouts.view`](#application-properties.data.spring.couchbase.env.timeouts.view) | Regular and geospatial view operations timeout. | `75s` | | [`spring.couchbase.password`](#application-properties.data.spring.couchbase.password) | Cluster password. | | | [`spring.couchbase.username`](#application-properties.data.spring.couchbase.username) | Cluster username. | | | [`spring.dao.exceptiontranslation.enabled`](#application-properties.data.spring.dao.exceptiontranslation.enabled) | Whether to enable the PersistenceExceptionTranslationPostProcessor. | `true` | | [`spring.data.cassandra.compression`](#application-properties.data.spring.data.cassandra.compression) | Compression supported by the Cassandra binary protocol. | `none` | | [`spring.data.cassandra.config`](#application-properties.data.spring.data.cassandra.config) | Location of the configuration file to use. | | | [`spring.data.cassandra.connection.connect-timeout`](#application-properties.data.spring.data.cassandra.connection.connect-timeout) | Timeout to use when establishing driver connections. | `5s` | | [`spring.data.cassandra.connection.init-query-timeout`](#application-properties.data.spring.data.cassandra.connection.init-query-timeout) | Timeout to use for internal queries that run as part of the initialization process, just after a connection is opened. | `5s` | | [`spring.data.cassandra.contact-points`](#application-properties.data.spring.data.cassandra.contact-points) | Cluster node addresses in the form 'host:port', or a simple 'host' to use the configured port. | `[127.0.0.1:9042]` | | [`spring.data.cassandra.controlconnection.timeout`](#application-properties.data.spring.data.cassandra.controlconnection.timeout) | Timeout to use for control queries. | `5s` | | [`spring.data.cassandra.keyspace-name`](#application-properties.data.spring.data.cassandra.keyspace-name) | Keyspace name to use. | | | [`spring.data.cassandra.local-datacenter`](#application-properties.data.spring.data.cassandra.local-datacenter) | Datacenter that is considered "local". Contact points should be from this datacenter. | | | [`spring.data.cassandra.password`](#application-properties.data.spring.data.cassandra.password) | Login password of the server. | | | [`spring.data.cassandra.pool.heartbeat-interval`](#application-properties.data.spring.data.cassandra.pool.heartbeat-interval) | Heartbeat interval after which a message is sent on an idle connection to make sure it's still alive. | `30s` | | [`spring.data.cassandra.pool.idle-timeout`](#application-properties.data.spring.data.cassandra.pool.idle-timeout) | Idle timeout before an idle connection is removed. 
| `5s` | | [`spring.data.cassandra.port`](#application-properties.data.spring.data.cassandra.port) | Port to use if a contact point does not specify one. | `9042` | | [`spring.data.cassandra.repositories.type`](#application-properties.data.spring.data.cassandra.repositories.type) | Type of Cassandra repositories to enable. | `auto` | | [`spring.data.cassandra.request.consistency`](#application-properties.data.spring.data.cassandra.request.consistency) | Queries consistency level. | | | [`spring.data.cassandra.request.page-size`](#application-properties.data.spring.data.cassandra.request.page-size) | How many rows will be retrieved simultaneously in a single network round-trip. | `5000` | | [`spring.data.cassandra.request.serial-consistency`](#application-properties.data.spring.data.cassandra.request.serial-consistency) | Queries serial consistency level. | | | [`spring.data.cassandra.request.throttler.drain-interval`](#application-properties.data.spring.data.cassandra.request.throttler.drain-interval) | How often the throttler attempts to dequeue requests. Set this high enough that each attempt will process multiple entries in the queue, but not delay requests too much. | | | [`spring.data.cassandra.request.throttler.max-concurrent-requests`](#application-properties.data.spring.data.cassandra.request.throttler.max-concurrent-requests) | Maximum number of requests that are allowed to execute in parallel. | `0` | | [`spring.data.cassandra.request.throttler.max-queue-size`](#application-properties.data.spring.data.cassandra.request.throttler.max-queue-size) | Maximum number of requests that can be enqueued when the throttling threshold is exceeded. | `0` | | [`spring.data.cassandra.request.throttler.max-requests-per-second`](#application-properties.data.spring.data.cassandra.request.throttler.max-requests-per-second) | Maximum allowed request rate. | `0` | | [`spring.data.cassandra.request.throttler.type`](#application-properties.data.spring.data.cassandra.request.throttler.type) | Request throttling type. | `none` | | [`spring.data.cassandra.request.timeout`](#application-properties.data.spring.data.cassandra.request.timeout) | How long the driver waits for a request to complete. | `2s` | | [`spring.data.cassandra.schema-action`](#application-properties.data.spring.data.cassandra.schema-action) | Schema action to take at startup. | `none` | | [`spring.data.cassandra.session-name`](#application-properties.data.spring.data.cassandra.session-name) | Name of the Cassandra session. | | | [`spring.data.cassandra.ssl`](#application-properties.data.spring.data.cassandra.ssl) | Enable SSL support. | `false` | | [`spring.data.cassandra.username`](#application-properties.data.spring.data.cassandra.username) | Login user of the server. | | | [`spring.data.couchbase.auto-index`](#application-properties.data.spring.data.couchbase.auto-index) | Automatically create views and indexes. Use the meta-data provided by "@ViewIndexed", "@N1qlPrimaryIndexed" and "@N1qlSecondaryIndexed". | `false` | | [`spring.data.couchbase.bucket-name`](#application-properties.data.spring.data.couchbase.bucket-name) | Name of the bucket to connect to. | | | [`spring.data.couchbase.field-naming-strategy`](#application-properties.data.spring.data.couchbase.field-naming-strategy) | Fully qualified name of the FieldNamingStrategy to use. | | | [`spring.data.couchbase.repositories.type`](#application-properties.data.spring.data.couchbase.repositories.type) | Type of Couchbase repositories to enable. 
| `auto` | | [`spring.data.couchbase.scope-name`](#application-properties.data.spring.data.couchbase.scope-name) | Name of the scope used for all collection access. | | | [`spring.data.couchbase.type-key`](#application-properties.data.spring.data.couchbase.type-key) | Name of the field that stores the type information for complex types when using "MappingCouchbaseConverter". | `_class` | | [`spring.data.elasticsearch.repositories.enabled`](#application-properties.data.spring.data.elasticsearch.repositories.enabled) | Whether to enable Elasticsearch repositories. | `true` | | [`spring.data.jdbc.repositories.enabled`](#application-properties.data.spring.data.jdbc.repositories.enabled) | Whether to enable JDBC repositories. | `true` | | [`spring.data.jpa.repositories.bootstrap-mode`](#application-properties.data.spring.data.jpa.repositories.bootstrap-mode) | Bootstrap mode for JPA repositories. | `default` | | [`spring.data.jpa.repositories.enabled`](#application-properties.data.spring.data.jpa.repositories.enabled) | Whether to enable JPA repositories. | `true` | | [`spring.data.ldap.repositories.enabled`](#application-properties.data.spring.data.ldap.repositories.enabled) | Whether to enable LDAP repositories. | `true` | | [`spring.data.mongodb.authentication-database`](#application-properties.data.spring.data.mongodb.authentication-database) | Authentication database name. | | | [`spring.data.mongodb.auto-index-creation`](#application-properties.data.spring.data.mongodb.auto-index-creation) | Whether to enable auto-index creation. | | | [`spring.data.mongodb.database`](#application-properties.data.spring.data.mongodb.database) | Database name. | | | [`spring.data.mongodb.field-naming-strategy`](#application-properties.data.spring.data.mongodb.field-naming-strategy) | Fully qualified name of the FieldNamingStrategy to use. | | | [`spring.data.mongodb.gridfs.bucket`](#application-properties.data.spring.data.mongodb.gridfs.bucket) | GridFS bucket name. | | | [`spring.data.mongodb.gridfs.database`](#application-properties.data.spring.data.mongodb.gridfs.database) | GridFS database name. | | | [`spring.data.mongodb.host`](#application-properties.data.spring.data.mongodb.host) | Mongo server host. Cannot be set with URI. | | | [`spring.data.mongodb.password`](#application-properties.data.spring.data.mongodb.password) | Login password of the mongo server. Cannot be set with URI. | | | [`spring.data.mongodb.port`](#application-properties.data.spring.data.mongodb.port) | Mongo server port. Cannot be set with URI. | | | [`spring.data.mongodb.replica-set-name`](#application-properties.data.spring.data.mongodb.replica-set-name) | Required replica set name for the cluster. Cannot be set with URI. | | | [`spring.data.mongodb.repositories.type`](#application-properties.data.spring.data.mongodb.repositories.type) | Type of Mongo repositories to enable. | `auto` | | [`spring.data.mongodb.uri`](#application-properties.data.spring.data.mongodb.uri) | Mongo database URI. Overrides host, port, username, password, and database. | `mongodb://localhost/test` | | [`spring.data.mongodb.username`](#application-properties.data.spring.data.mongodb.username) | Login user of the mongo server. Cannot be set with URI. | | | [`spring.data.mongodb.uuid-representation`](#application-properties.data.spring.data.mongodb.uuid-representation) | Representation to use when converting a UUID to a BSON binary value. 
| `java-legacy` | | [`spring.data.neo4j.database`](#application-properties.data.spring.data.neo4j.database) | Database name to use. By default, the server decides the default database to use. | | | [`spring.data.neo4j.repositories.type`](#application-properties.data.spring.data.neo4j.repositories.type) | Type of Neo4j repositories to enable. | `auto` | | [`spring.data.r2dbc.repositories.enabled`](#application-properties.data.spring.data.r2dbc.repositories.enabled) | Whether to enable R2DBC repositories. | `true` | | [`spring.data.redis.repositories.enabled`](#application-properties.data.spring.data.redis.repositories.enabled) | Whether to enable Redis repositories. | `true` | | [`spring.data.rest.base-path`](#application-properties.data.spring.data.rest.base-path) | Base path to be used by Spring Data REST to expose repository resources. | | | [`spring.data.rest.default-media-type`](#application-properties.data.spring.data.rest.default-media-type) | Content type to use as a default when none is specified. | | | [`spring.data.rest.default-page-size`](#application-properties.data.spring.data.rest.default-page-size) | Default size of pages. | | | [`spring.data.rest.detection-strategy`](#application-properties.data.spring.data.rest.detection-strategy) | Strategy to use to determine which repositories get exposed. | `default` | | [`spring.data.rest.enable-enum-translation`](#application-properties.data.spring.data.rest.enable-enum-translation) | Whether to enable enum value translation through the Spring Data REST default resource bundle. | | | [`spring.data.rest.limit-param-name`](#application-properties.data.spring.data.rest.limit-param-name) | Name of the URL query string parameter that indicates how many results to return at once. | | | [`spring.data.rest.max-page-size`](#application-properties.data.spring.data.rest.max-page-size) | Maximum size of pages. | | | [`spring.data.rest.page-param-name`](#application-properties.data.spring.data.rest.page-param-name) | Name of the URL query string parameter that indicates what page to return. | | | [`spring.data.rest.return-body-on-create`](#application-properties.data.spring.data.rest.return-body-on-create) | Whether to return a response body after creating an entity. | | | [`spring.data.rest.return-body-on-update`](#application-properties.data.spring.data.rest.return-body-on-update) | Whether to return a response body after updating an entity. | | | [`spring.data.rest.sort-param-name`](#application-properties.data.spring.data.rest.sort-param-name) | Name of the URL query string parameter that indicates what direction to sort results. | | | [`spring.data.solr.host`](#application-properties.data.spring.data.solr.host) | Solr host. Ignored if "zk-host" is set. | `http://127.0.0.1:8983/solr` | | [`spring.data.solr.zk-host`](#application-properties.data.spring.data.solr.zk-host) | ZooKeeper host address in the form HOST:PORT. | | | [`spring.data.web.pageable.default-page-size`](#application-properties.data.spring.data.web.pageable.default-page-size) | Default page size. | `20` | | [`spring.data.web.pageable.max-page-size`](#application-properties.data.spring.data.web.pageable.max-page-size) | Maximum page size to be accepted. | `2000` | | [`spring.data.web.pageable.one-indexed-parameters`](#application-properties.data.spring.data.web.pageable.one-indexed-parameters) | Whether to expose and assume 1-based page number indexes. Defaults to "false", meaning a page number of 0 in the request equals the first page. 
| `false` | | [`spring.data.web.pageable.page-parameter`](#application-properties.data.spring.data.web.pageable.page-parameter) | Page index parameter name. | `page` | | [`spring.data.web.pageable.prefix`](#application-properties.data.spring.data.web.pageable.prefix) | General prefix to be prepended to the page number and page size parameters. | | | [`spring.data.web.pageable.qualifier-delimiter`](#application-properties.data.spring.data.web.pageable.qualifier-delimiter) | Delimiter to be used between the qualifier and the actual page number and size properties. | `_` | | [`spring.data.web.pageable.size-parameter`](#application-properties.data.spring.data.web.pageable.size-parameter) | Page size parameter name. | `size` | | [`spring.data.web.sort.sort-parameter`](#application-properties.data.spring.data.web.sort.sort-parameter) | Sort parameter name. | `sort` | | [`spring.datasource.dbcp2.abandoned-usage-tracking` `spring.datasource.dbcp2.access-to-underlying-connection-allowed` `spring.datasource.dbcp2.auto-commit-on-return` `spring.datasource.dbcp2.cache-state` `spring.datasource.dbcp2.clear-statement-pool-on-return` `spring.datasource.dbcp2.connection-factory-class-name` `spring.datasource.dbcp2.connection-init-sqls` `spring.datasource.dbcp2.default-auto-commit` `spring.datasource.dbcp2.default-catalog` `spring.datasource.dbcp2.default-query-timeout` `spring.datasource.dbcp2.default-read-only` `spring.datasource.dbcp2.default-schema` `spring.datasource.dbcp2.default-transaction-isolation` `spring.datasource.dbcp2.disconnection-sql-codes` `spring.datasource.dbcp2.driver` `spring.datasource.dbcp2.driver-class-name` `spring.datasource.dbcp2.eviction-policy-class-name` `spring.datasource.dbcp2.fast-fail-validation` `spring.datasource.dbcp2.initial-size` `spring.datasource.dbcp2.jmx-name` `spring.datasource.dbcp2.lifo` `spring.datasource.dbcp2.log-abandoned` `spring.datasource.dbcp2.log-expired-connections` `spring.datasource.dbcp2.login-timeout` `spring.datasource.dbcp2.max-conn-lifetime-millis` `spring.datasource.dbcp2.max-idle` `spring.datasource.dbcp2.max-open-prepared-statements` `spring.datasource.dbcp2.max-total` `spring.datasource.dbcp2.max-wait-millis` `spring.datasource.dbcp2.min-evictable-idle-time-millis` `spring.datasource.dbcp2.min-idle` `spring.datasource.dbcp2.num-tests-per-eviction-run` `spring.datasource.dbcp2.password` `spring.datasource.dbcp2.pool-prepared-statements` `spring.datasource.dbcp2.remove-abandoned-on-borrow` `spring.datasource.dbcp2.remove-abandoned-on-maintenance` `spring.datasource.dbcp2.remove-abandoned-timeout` `spring.datasource.dbcp2.rollback-on-return` `spring.datasource.dbcp2.soft-min-evictable-idle-time-millis` `spring.datasource.dbcp2.test-on-borrow` `spring.datasource.dbcp2.test-on-create` `spring.datasource.dbcp2.test-on-return` `spring.datasource.dbcp2.test-while-idle` `spring.datasource.dbcp2.time-between-eviction-runs-millis` `spring.datasource.dbcp2.url` `spring.datasource.dbcp2.username` `spring.datasource.dbcp2.validation-query` `spring.datasource.dbcp2.validation-query-timeout`](#application-properties.data.spring.datasource.dbcp2) | Commons DBCP2 specific settings bound to an instance of DBCP2's BasicDataSource | | | [`spring.datasource.driver-class-name`](#application-properties.data.spring.datasource.driver-class-name) | Fully qualified name of the JDBC driver. Auto-detected based on the URL by default. 
| | | [`spring.datasource.embedded-database-connection`](#application-properties.data.spring.datasource.embedded-database-connection) | Connection details for an embedded database. Defaults to the most suitable embedded database that is available on the classpath. | | | [`spring.datasource.generate-unique-name`](#application-properties.data.spring.datasource.generate-unique-name) | Whether to generate a random datasource name. | `true` | | [`spring.datasource.hikari.allow-pool-suspension` `spring.datasource.hikari.auto-commit` `spring.datasource.hikari.catalog` `spring.datasource.hikari.connection-init-sql` `spring.datasource.hikari.connection-test-query` `spring.datasource.hikari.connection-timeout` `spring.datasource.hikari.data-source-class-name` `spring.datasource.hikari.data-source-j-n-d-i` `spring.datasource.hikari.data-source-properties` `spring.datasource.hikari.driver-class-name` `spring.datasource.hikari.exception-override-class-name` `spring.datasource.hikari.health-check-properties` `spring.datasource.hikari.health-check-registry` `spring.datasource.hikari.idle-timeout` `spring.datasource.hikari.initialization-fail-timeout` `spring.datasource.hikari.isolate-internal-queries` `spring.datasource.hikari.jdbc-url` `spring.datasource.hikari.keepalive-time` `spring.datasource.hikari.leak-detection-threshold` `spring.datasource.hikari.login-timeout` `spring.datasource.hikari.max-lifetime` `spring.datasource.hikari.maximum-pool-size` `spring.datasource.hikari.metric-registry` `spring.datasource.hikari.metrics-tracker-factory` `spring.datasource.hikari.minimum-idle` `spring.datasource.hikari.password` `spring.datasource.hikari.pool-name` `spring.datasource.hikari.read-only` `spring.datasource.hikari.register-mbeans` `spring.datasource.hikari.scheduled-executor` `spring.datasource.hikari.schema` `spring.datasource.hikari.transaction-isolation` `spring.datasource.hikari.username` `spring.datasource.hikari.validation-timeout`](#application-properties.data.spring.datasource.hikari) | Hikari specific settings bound to an instance of Hikari's HikariDataSource | | | [`spring.datasource.jndi-name`](#application-properties.data.spring.datasource.jndi-name) | JNDI location of the datasource. Class, url, username and password are ignored when set. | | | [`spring.datasource.name`](#application-properties.data.spring.datasource.name) | Datasource name to use if "generate-unique-name" is false. Defaults to "testdb" when using an embedded database, otherwise null. 
| | | [`spring.datasource.oracleucp.abandoned-connection-timeout` `spring.datasource.oracleucp.connection-factory-class-name` `spring.datasource.oracleucp.connection-factory-properties` `spring.datasource.oracleucp.connection-harvest-max-count` `spring.datasource.oracleucp.connection-harvest-trigger-count` `spring.datasource.oracleucp.connection-labeling-high-cost` `spring.datasource.oracleucp.connection-pool-name` `spring.datasource.oracleucp.connection-properties` `spring.datasource.oracleucp.connection-repurpose-threshold` `spring.datasource.oracleucp.connection-validation-timeout` `spring.datasource.oracleucp.connection-wait-timeout` `spring.datasource.oracleucp.data-source-name` `spring.datasource.oracleucp.database-name` `spring.datasource.oracleucp.description` `spring.datasource.oracleucp.fast-connection-failover-enabled` `spring.datasource.oracleucp.high-cost-connection-reuse-threshold` `spring.datasource.oracleucp.inactive-connection-timeout` `spring.datasource.oracleucp.initial-pool-size` `spring.datasource.oracleucp.login-timeout` `spring.datasource.oracleucp.max-connection-reuse-count` `spring.datasource.oracleucp.max-connection-reuse-time` `spring.datasource.oracleucp.max-connections-per-shard` `spring.datasource.oracleucp.max-idle-time` `spring.datasource.oracleucp.max-pool-size` `spring.datasource.oracleucp.max-statements` `spring.datasource.oracleucp.min-pool-size` `spring.datasource.oracleucp.network-protocol` `spring.datasource.oracleucp.o-n-s-configuration` `spring.datasource.oracleucp.pdb-roles` `spring.datasource.oracleucp.port-number` `spring.datasource.oracleucp.property-cycle` `spring.datasource.oracleucp.query-timeout` `spring.datasource.oracleucp.read-only-instance-allowed` `spring.datasource.oracleucp.role-name` `spring.datasource.oracleucp.s-q-l-for-validate-connection` `spring.datasource.oracleucp.seconds-to-trust-idle-connection` `spring.datasource.oracleucp.server-name` `spring.datasource.oracleucp.sharding-mode` `spring.datasource.oracleucp.time-to-live-connection-timeout` `spring.datasource.oracleucp.timeout-check-interval` `spring.datasource.oracleucp.u-r-l` `spring.datasource.oracleucp.user` `spring.datasource.oracleucp.validate-connection-on-borrow`](#application-properties.data.spring.datasource.oracleucp) | Oracle UCP specific settings bound to an instance of Oracle UCP's PoolDataSource | | | [`spring.datasource.password`](#application-properties.data.spring.datasource.password) | Login password of the database. 
| | | [`spring.datasource.tomcat.abandon-when-percentage-full` `spring.datasource.tomcat.access-to-underlying-connection-allowed` `spring.datasource.tomcat.alternate-username-allowed` `spring.datasource.tomcat.commit-on-return` `spring.datasource.tomcat.connection-properties` `spring.datasource.tomcat.data-source` `spring.datasource.tomcat.data-source-j-n-d-i` `spring.datasource.tomcat.db-properties` `spring.datasource.tomcat.default-auto-commit` `spring.datasource.tomcat.default-catalog` `spring.datasource.tomcat.default-read-only` `spring.datasource.tomcat.default-transaction-isolation` `spring.datasource.tomcat.driver-class-name` `spring.datasource.tomcat.fair-queue` `spring.datasource.tomcat.ignore-exception-on-pre-load` `spring.datasource.tomcat.init-s-q-l` `spring.datasource.tomcat.initial-size` `spring.datasource.tomcat.jdbc-interceptors` `spring.datasource.tomcat.jmx-enabled` `spring.datasource.tomcat.log-abandoned` `spring.datasource.tomcat.log-validation-errors` `spring.datasource.tomcat.login-timeout` `spring.datasource.tomcat.max-active` `spring.datasource.tomcat.max-age` `spring.datasource.tomcat.max-idle` `spring.datasource.tomcat.max-wait` `spring.datasource.tomcat.min-evictable-idle-time-millis` `spring.datasource.tomcat.min-idle` `spring.datasource.tomcat.name` `spring.datasource.tomcat.num-tests-per-eviction-run` `spring.datasource.tomcat.password` `spring.datasource.tomcat.propagate-interrupt-state` `spring.datasource.tomcat.remove-abandoned` `spring.datasource.tomcat.remove-abandoned-timeout` `spring.datasource.tomcat.rollback-on-return` `spring.datasource.tomcat.suspect-timeout` `spring.datasource.tomcat.test-on-borrow` `spring.datasource.tomcat.test-on-connect` `spring.datasource.tomcat.test-on-return` `spring.datasource.tomcat.test-while-idle` `spring.datasource.tomcat.time-between-eviction-runs-millis` `spring.datasource.tomcat.url` `spring.datasource.tomcat.use-disposable-connection-facade` `spring.datasource.tomcat.use-equals` `spring.datasource.tomcat.use-lock` `spring.datasource.tomcat.use-statement-facade` `spring.datasource.tomcat.username` `spring.datasource.tomcat.validation-interval` `spring.datasource.tomcat.validation-query` `spring.datasource.tomcat.validation-query-timeout` `spring.datasource.tomcat.validator-class-name`](#application-properties.data.spring.datasource.tomcat) | Tomcat datasource specific settings bound to an instance of Tomcat JDBC's DataSource | | | [`spring.datasource.type`](#application-properties.data.spring.datasource.type) | Fully qualified name of the connection pool implementation to use. By default, it is auto-detected from the classpath. | | | [`spring.datasource.url`](#application-properties.data.spring.datasource.url) | JDBC URL of the database. | | | [`spring.datasource.username`](#application-properties.data.spring.datasource.username) | Login username of the database. | | | [`spring.datasource.xa.data-source-class-name`](#application-properties.data.spring.datasource.xa.data-source-class-name) | XA datasource fully qualified name. | | | [`spring.datasource.xa.properties.*`](#application-properties.data.spring.datasource.xa.properties) | Properties to pass to the XA data source. | | | [`spring.elasticsearch.connection-timeout`](#application-properties.data.spring.elasticsearch.connection-timeout) | Connection timeout used when communicating with Elasticsearch. | `1s` | | [`spring.elasticsearch.password`](#application-properties.data.spring.elasticsearch.password) | Password for authentication with Elasticsearch. 
| | | [`spring.elasticsearch.path-prefix`](#application-properties.data.spring.elasticsearch.path-prefix) | Prefix added to the path of every request sent to Elasticsearch. | | | [`spring.elasticsearch.restclient.sniffer.delay-after-failure`](#application-properties.data.spring.elasticsearch.restclient.sniffer.delay-after-failure) | Delay of a sniff execution scheduled after a failure. | `1m` | | [`spring.elasticsearch.restclient.sniffer.interval`](#application-properties.data.spring.elasticsearch.restclient.sniffer.interval) | Interval between consecutive ordinary sniff executions. | `5m` | | [`spring.elasticsearch.socket-timeout`](#application-properties.data.spring.elasticsearch.socket-timeout) | Socket timeout used when communicating with Elasticsearch. | `30s` | | [`spring.elasticsearch.uris`](#application-properties.data.spring.elasticsearch.uris) | Comma-separated list of the Elasticsearch instances to use. | `[http://localhost:9200]` | | [`spring.elasticsearch.username`](#application-properties.data.spring.elasticsearch.username) | Username for authentication with Elasticsearch. | | | [`spring.elasticsearch.webclient.max-in-memory-size`](#application-properties.data.spring.elasticsearch.webclient.max-in-memory-size) | Limit on the number of bytes that can be buffered whenever the input stream needs to be aggregated. | | | [`spring.h2.console.enabled`](#application-properties.data.spring.h2.console.enabled) | Whether to enable the console. | `false` | | [`spring.h2.console.path`](#application-properties.data.spring.h2.console.path) | Path at which the console is available. | `/h2-console` | | [`spring.h2.console.settings.trace`](#application-properties.data.spring.h2.console.settings.trace) | Whether to enable trace output. | `false` | | [`spring.h2.console.settings.web-admin-password`](#application-properties.data.spring.h2.console.settings.web-admin-password) | Password to access preferences and tools of H2 Console. | | | [`spring.h2.console.settings.web-allow-others`](#application-properties.data.spring.h2.console.settings.web-allow-others) | Whether to enable remote access. | `false` | | [`spring.influx.password`](#application-properties.data.spring.influx.password) | Login password. | | | [`spring.influx.url`](#application-properties.data.spring.influx.url) | URL of the InfluxDB instance to which to connect. | | | [`spring.influx.user`](#application-properties.data.spring.influx.user) | Login user. | | | [`spring.jdbc.template.fetch-size`](#application-properties.data.spring.jdbc.template.fetch-size) | Number of rows that should be fetched from the database when more rows are needed. Use -1 to use the JDBC driver's default configuration. | `-1` | | [`spring.jdbc.template.max-rows`](#application-properties.data.spring.jdbc.template.max-rows) | Maximum number of rows. Use -1 to use the JDBC driver's default configuration. | `-1` | | [`spring.jdbc.template.query-timeout`](#application-properties.data.spring.jdbc.template.query-timeout) | Query timeout. Default is to use the JDBC driver's default configuration. If a duration suffix is not specified, seconds will be used. | | | [`spring.jooq.sql-dialect`](#application-properties.data.spring.jooq.sql-dialect) | SQL dialect to use. Auto-detected by default. | | | [`spring.jpa.database`](#application-properties.data.spring.jpa.database) | Target database to operate on, auto-detected by default. Can be alternatively set using the "databasePlatform" property. 
| | | [`spring.jpa.database-platform`](#application-properties.data.spring.jpa.database-platform) | Name of the target database to operate on, auto-detected by default. Can be alternatively set using the "Database" enum. | | | [`spring.jpa.defer-datasource-initialization`](#application-properties.data.spring.jpa.defer-datasource-initialization) | | `false` | | [`spring.jpa.generate-ddl`](#application-properties.data.spring.jpa.generate-ddl) | Whether to initialize the schema on startup. | `false` | | [`spring.jpa.hibernate.ddl-auto`](#application-properties.data.spring.jpa.hibernate.ddl-auto) | DDL mode. This is actually a shortcut for the "hibernate.hbm2ddl.auto" property. Defaults to "create-drop" when using an embedded database and no schema manager was detected. Otherwise, defaults to "none". | | | [`spring.jpa.hibernate.naming.implicit-strategy`](#application-properties.data.spring.jpa.hibernate.naming.implicit-strategy) | Fully qualified name of the implicit naming strategy. | | | [`spring.jpa.hibernate.naming.physical-strategy`](#application-properties.data.spring.jpa.hibernate.naming.physical-strategy) | Fully qualified name of the physical naming strategy. | | | [`spring.jpa.hibernate.use-new-id-generator-mappings`](#application-properties.data.spring.jpa.hibernate.use-new-id-generator-mappings) | Whether to use Hibernate's newer IdentifierGenerator for AUTO, TABLE and SEQUENCE. This is actually a shortcut for the "hibernate.id.new\_generator\_mappings" property. When not specified will default to "true". | | | [`spring.jpa.mapping-resources`](#application-properties.data.spring.jpa.mapping-resources) | Mapping resources (equivalent to "mapping-file" entries in persistence.xml). | | | [`spring.jpa.open-in-view`](#application-properties.data.spring.jpa.open-in-view) | Register OpenEntityManagerInViewInterceptor. Binds a JPA EntityManager to the thread for the entire processing of the request. | `true` | | [`spring.jpa.properties.*`](#application-properties.data.spring.jpa.properties) | Additional native properties to set on the JPA provider. | | | [`spring.jpa.show-sql`](#application-properties.data.spring.jpa.show-sql) | Whether to enable logging of SQL statements. | `false` | | [`spring.ldap.anonymous-read-only`](#application-properties.data.spring.ldap.anonymous-read-only) | Whether read-only operations should use an anonymous environment. Disabled by default unless a username is set. | | | [`spring.ldap.base`](#application-properties.data.spring.ldap.base) | Base suffix from which all operations should originate. | | | [`spring.ldap.base-environment.*`](#application-properties.data.spring.ldap.base-environment) | LDAP specification settings. | | | [`spring.ldap.embedded.base-dn`](#application-properties.data.spring.ldap.embedded.base-dn) | List of base DNs. | | | [`spring.ldap.embedded.credential.password`](#application-properties.data.spring.ldap.embedded.credential.password) | Embedded LDAP password. | | | [`spring.ldap.embedded.credential.username`](#application-properties.data.spring.ldap.embedded.credential.username) | Embedded LDAP username. | | | [`spring.ldap.embedded.ldif`](#application-properties.data.spring.ldap.embedded.ldif) | Schema (LDIF) script resource reference. | `classpath:schema.ldif` | | [`spring.ldap.embedded.port`](#application-properties.data.spring.ldap.embedded.port) | Embedded LDAP port. 
| `0` | | [`spring.ldap.embedded.validation.enabled`](#application-properties.data.spring.ldap.embedded.validation.enabled) | Whether to enable LDAP schema validation. | `true` | | [`spring.ldap.embedded.validation.schema`](#application-properties.data.spring.ldap.embedded.validation.schema) | Path to the custom schema. | | | [`spring.ldap.password`](#application-properties.data.spring.ldap.password) | Login password of the server. | | | [`spring.ldap.template.ignore-name-not-found-exception`](#application-properties.data.spring.ldap.template.ignore-name-not-found-exception) | Whether NameNotFoundException should be ignored in searches via the LdapTemplate. | `false` | | [`spring.ldap.template.ignore-partial-result-exception`](#application-properties.data.spring.ldap.template.ignore-partial-result-exception) | Whether PartialResultException should be ignored in searches via the LdapTemplate. | `false` | | [`spring.ldap.template.ignore-size-limit-exceeded-exception`](#application-properties.data.spring.ldap.template.ignore-size-limit-exceeded-exception) | Whether SizeLimitExceededException should be ignored in searches via the LdapTemplate. | `true` | | [`spring.ldap.urls`](#application-properties.data.spring.ldap.urls) | LDAP URLs of the server. | | | [`spring.ldap.username`](#application-properties.data.spring.ldap.username) | Login username of the server. | | | [`spring.mongodb.embedded.storage.database-dir`](#application-properties.data.spring.mongodb.embedded.storage.database-dir) | Directory used for data storage. | | | [`spring.mongodb.embedded.storage.oplog-size`](#application-properties.data.spring.mongodb.embedded.storage.oplog-size) | Maximum size of the oplog. | | | [`spring.mongodb.embedded.storage.repl-set-name`](#application-properties.data.spring.mongodb.embedded.storage.repl-set-name) | Name of the replica set. | | | [`spring.mongodb.embedded.version`](#application-properties.data.spring.mongodb.embedded.version) | Version of Mongo to use. | | | [`spring.neo4j.authentication.kerberos-ticket`](#application-properties.data.spring.neo4j.authentication.kerberos-ticket) | Kerberos ticket for connecting to the database. Mutual exclusive with a given username. | | | [`spring.neo4j.authentication.password`](#application-properties.data.spring.neo4j.authentication.password) | Login password of the server. | | | [`spring.neo4j.authentication.realm`](#application-properties.data.spring.neo4j.authentication.realm) | Realm to connect to. | | | [`spring.neo4j.authentication.username`](#application-properties.data.spring.neo4j.authentication.username) | Login user of the server. | | | [`spring.neo4j.connection-timeout`](#application-properties.data.spring.neo4j.connection-timeout) | Timeout for borrowing connections from the pool. | `30s` | | [`spring.neo4j.max-transaction-retry-time`](#application-properties.data.spring.neo4j.max-transaction-retry-time) | Maximum time transactions are allowed to retry. | `30s` | | [`spring.neo4j.pool.connection-acquisition-timeout`](#application-properties.data.spring.neo4j.pool.connection-acquisition-timeout) | Acquisition of new connections will be attempted for at most configured timeout. | `60s` | | [`spring.neo4j.pool.idle-time-before-connection-test`](#application-properties.data.spring.neo4j.pool.idle-time-before-connection-test) | Pooled connections that have been idle in the pool for longer than this threshold will be tested before they are used again. 
| | | [`spring.neo4j.pool.log-leaked-sessions`](#application-properties.data.spring.neo4j.pool.log-leaked-sessions) | Whether to log leaked sessions. | `false` | | [`spring.neo4j.pool.max-connection-lifetime`](#application-properties.data.spring.neo4j.pool.max-connection-lifetime) | Pooled connections older than this threshold will be closed and removed from the pool. | `1h` | | [`spring.neo4j.pool.max-connection-pool-size`](#application-properties.data.spring.neo4j.pool.max-connection-pool-size) | Maximum amount of connections in the connection pool towards a single database. | `100` | | [`spring.neo4j.pool.metrics-enabled`](#application-properties.data.spring.neo4j.pool.metrics-enabled) | Whether to enable metrics. | `false` | | [`spring.neo4j.security.cert-file`](#application-properties.data.spring.neo4j.security.cert-file) | Path to the file that holds the trusted certificates. | | | [`spring.neo4j.security.encrypted`](#application-properties.data.spring.neo4j.security.encrypted) | Whether the driver should use encrypted traffic. | `false` | | [`spring.neo4j.security.hostname-verification-enabled`](#application-properties.data.spring.neo4j.security.hostname-verification-enabled) | Whether hostname verification is required. | `true` | | [`spring.neo4j.security.trust-strategy`](#application-properties.data.spring.neo4j.security.trust-strategy) | Trust strategy to use. | `trust-system-ca-signed-certificates` | | [`spring.neo4j.uri`](#application-properties.data.spring.neo4j.uri) | URI used by the driver. | `bolt://localhost:7687` | | [`spring.r2dbc.generate-unique-name`](#application-properties.data.spring.r2dbc.generate-unique-name) | Whether to generate a random database name. Ignore any configured name when enabled. | `false` | | [`spring.r2dbc.name`](#application-properties.data.spring.r2dbc.name) | Database name. Set if no name is specified in the url. Default to "testdb" when using an embedded database. | | | [`spring.r2dbc.password`](#application-properties.data.spring.r2dbc.password) | Login password of the database. Set if no password is specified in the url. | | | [`spring.r2dbc.pool.enabled`](#application-properties.data.spring.r2dbc.pool.enabled) | Whether pooling is enabled. Requires r2dbc-pool. | `true` | | [`spring.r2dbc.pool.initial-size`](#application-properties.data.spring.r2dbc.pool.initial-size) | Initial connection pool size. | `10` | | [`spring.r2dbc.pool.max-acquire-time`](#application-properties.data.spring.r2dbc.pool.max-acquire-time) | Maximum time to acquire a connection from the pool. By default, wait indefinitely. | | | [`spring.r2dbc.pool.max-create-connection-time`](#application-properties.data.spring.r2dbc.pool.max-create-connection-time) | Maximum time to wait to create a new connection. By default, wait indefinitely. | | | [`spring.r2dbc.pool.max-idle-time`](#application-properties.data.spring.r2dbc.pool.max-idle-time) | Maximum amount of time that a connection is allowed to sit idle in the pool. | `30m` | | [`spring.r2dbc.pool.max-life-time`](#application-properties.data.spring.r2dbc.pool.max-life-time) | Maximum lifetime of a connection in the pool. By default, connections have an infinite lifetime. | | | [`spring.r2dbc.pool.max-size`](#application-properties.data.spring.r2dbc.pool.max-size) | Maximal connection pool size. | `10` | | [`spring.r2dbc.pool.validation-depth`](#application-properties.data.spring.r2dbc.pool.validation-depth) | Validation depth. 
| `local` | | [`spring.r2dbc.pool.validation-query`](#application-properties.data.spring.r2dbc.pool.validation-query) | Validation query. | | | [`spring.r2dbc.properties.*`](#application-properties.data.spring.r2dbc.properties) | Additional R2DBC options. | | | [`spring.r2dbc.url`](#application-properties.data.spring.r2dbc.url) | R2DBC URL of the database. database name, username, password and pooling options specified in the url take precedence over individual options. | | | [`spring.r2dbc.username`](#application-properties.data.spring.r2dbc.username) | Login username of the database. Set if no username is specified in the url. | | | [`spring.redis.client-name`](#application-properties.data.spring.redis.client-name) | Client name to be set on connections with CLIENT SETNAME. | | | [`spring.redis.client-type`](#application-properties.data.spring.redis.client-type) | Type of client to use. By default, auto-detected according to the classpath. | | | [`spring.redis.cluster.max-redirects`](#application-properties.data.spring.redis.cluster.max-redirects) | Maximum number of redirects to follow when executing commands across the cluster. | | | [`spring.redis.cluster.nodes`](#application-properties.data.spring.redis.cluster.nodes) | Comma-separated list of "host:port" pairs to bootstrap from. This represents an "initial" list of cluster nodes and is required to have at least one entry. | | | [`spring.redis.connect-timeout`](#application-properties.data.spring.redis.connect-timeout) | Connection timeout. | | | [`spring.redis.database`](#application-properties.data.spring.redis.database) | Database index used by the connection factory. | `0` | | [`spring.redis.host`](#application-properties.data.spring.redis.host) | Redis server host. | `localhost` | | [`spring.redis.jedis.pool.enabled`](#application-properties.data.spring.redis.jedis.pool.enabled) | Whether to enable the pool. Enabled automatically if "commons-pool2" is available. With Jedis, pooling is implicitly enabled in sentinel mode and this setting only applies to single node setup. | | | [`spring.redis.jedis.pool.max-active`](#application-properties.data.spring.redis.jedis.pool.max-active) | Maximum number of connections that can be allocated by the pool at a given time. Use a negative value for no limit. | `8` | | [`spring.redis.jedis.pool.max-idle`](#application-properties.data.spring.redis.jedis.pool.max-idle) | Maximum number of "idle" connections in the pool. Use a negative value to indicate an unlimited number of idle connections. | `8` | | [`spring.redis.jedis.pool.max-wait`](#application-properties.data.spring.redis.jedis.pool.max-wait) | Maximum amount of time a connection allocation should block before throwing an exception when the pool is exhausted. Use a negative value to block indefinitely. | `-1ms` | | [`spring.redis.jedis.pool.min-idle`](#application-properties.data.spring.redis.jedis.pool.min-idle) | Target for the minimum number of idle connections to maintain in the pool. This setting only has an effect if both it and time between eviction runs are positive. | `0` | | [`spring.redis.jedis.pool.time-between-eviction-runs`](#application-properties.data.spring.redis.jedis.pool.time-between-eviction-runs) | Time between runs of the idle object evictor thread. When positive, the idle object evictor thread starts, otherwise no idle object eviction is performed. 
| | | [`spring.redis.lettuce.cluster.refresh.adaptive`](#application-properties.data.spring.redis.lettuce.cluster.refresh.adaptive) | Whether adaptive topology refreshing using all available refresh triggers should be used. | `false` | | [`spring.redis.lettuce.cluster.refresh.dynamic-refresh-sources`](#application-properties.data.spring.redis.lettuce.cluster.refresh.dynamic-refresh-sources) | Whether to discover and query all cluster nodes for obtaining the cluster topology. When set to false, only the initial seed nodes are used as sources for topology discovery. | `true` | | [`spring.redis.lettuce.cluster.refresh.period`](#application-properties.data.spring.redis.lettuce.cluster.refresh.period) | Cluster topology refresh period. | | | [`spring.redis.lettuce.pool.enabled`](#application-properties.data.spring.redis.lettuce.pool.enabled) | Whether to enable the pool. Enabled automatically if "commons-pool2" is available. With Jedis, pooling is implicitly enabled in sentinel mode and this setting only applies to single node setup. | | | [`spring.redis.lettuce.pool.max-active`](#application-properties.data.spring.redis.lettuce.pool.max-active) | Maximum number of connections that can be allocated by the pool at a given time. Use a negative value for no limit. | `8` | | [`spring.redis.lettuce.pool.max-idle`](#application-properties.data.spring.redis.lettuce.pool.max-idle) | Maximum number of "idle" connections in the pool. Use a negative value to indicate an unlimited number of idle connections. | `8` | | [`spring.redis.lettuce.pool.max-wait`](#application-properties.data.spring.redis.lettuce.pool.max-wait) | Maximum amount of time a connection allocation should block before throwing an exception when the pool is exhausted. Use a negative value to block indefinitely. | `-1ms` | | [`spring.redis.lettuce.pool.min-idle`](#application-properties.data.spring.redis.lettuce.pool.min-idle) | Target for the minimum number of idle connections to maintain in the pool. This setting only has an effect if both it and time between eviction runs are positive. | `0` | | [`spring.redis.lettuce.pool.time-between-eviction-runs`](#application-properties.data.spring.redis.lettuce.pool.time-between-eviction-runs) | Time between runs of the idle object evictor thread. When positive, the idle object evictor thread starts, otherwise no idle object eviction is performed. | | | [`spring.redis.lettuce.shutdown-timeout`](#application-properties.data.spring.redis.lettuce.shutdown-timeout) | Shutdown timeout. | `100ms` | | [`spring.redis.password`](#application-properties.data.spring.redis.password) | Login password of the redis server. | | | [`spring.redis.port`](#application-properties.data.spring.redis.port) | Redis server port. | `6379` | | [`spring.redis.sentinel.master`](#application-properties.data.spring.redis.sentinel.master) | Name of the Redis server. | | | [`spring.redis.sentinel.nodes`](#application-properties.data.spring.redis.sentinel.nodes) | Comma-separated list of "host:port" pairs. | | | [`spring.redis.sentinel.password`](#application-properties.data.spring.redis.sentinel.password) | Password for authenticating with sentinel(s). | | | [`spring.redis.sentinel.username`](#application-properties.data.spring.redis.sentinel.username) | Login username for authenticating with sentinel(s). | | | [`spring.redis.ssl`](#application-properties.data.spring.redis.ssl) | Whether to enable SSL support. | `false` | | [`spring.redis.timeout`](#application-properties.data.spring.redis.timeout) | Read timeout. 
| | | [`spring.redis.url`](#application-properties.data.spring.redis.url) | Connection URL. Overrides host, port, and password. User is ignored. Example: redis://user:password@example.com:6379 | | | [`spring.redis.username`](#application-properties.data.spring.redis.username) | Login username of the redis server. | |
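
For orientation, here is a minimal `application.properties` sketch combining a few of the data properties listed above. The URL, credentials, and pool sizes are illustrative placeholders, not recommended values:

```properties
# R2DBC connection and pool settings (pooling requires r2dbc-pool on the classpath)
spring.r2dbc.url=r2dbc:postgresql://localhost:5432/mydb
spring.r2dbc.username=myuser
spring.r2dbc.password=secret
spring.r2dbc.pool.enabled=true
spring.r2dbc.pool.initial-size=10
spring.r2dbc.pool.max-size=10
spring.r2dbc.pool.max-idle-time=30m

# Redis connection with Lettuce pooling (requires commons-pool2)
spring.redis.host=localhost
spring.redis.port=6379
spring.redis.lettuce.pool.max-active=8
spring.redis.lettuce.pool.min-idle=0
```
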

6. Transaction Properties
--------------------------

| Name | Description | Default Value | | --- | --- | --- | | [`spring.jta.atomikos.connectionfactory.borrow-connection-timeout`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.borrow-connection-timeout) | Timeout, in seconds, for borrowing connections from the pool. | `30` | | [`spring.jta.atomikos.connectionfactory.ignore-session-transacted-flag`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.ignore-session-transacted-flag) | Whether to ignore the transacted flag when creating session. | `true` | | [`spring.jta.atomikos.connectionfactory.local-transaction-mode`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.local-transaction-mode) | Whether local transactions are desired. | `false` | | [`spring.jta.atomikos.connectionfactory.maintenance-interval`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.maintenance-interval) | Time, in seconds, between runs of the pool's maintenance thread. | `60` | | [`spring.jta.atomikos.connectionfactory.max-idle-time`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.max-idle-time) | Time, in seconds, after which connections are cleaned up from the pool. | `60` | | [`spring.jta.atomikos.connectionfactory.max-lifetime`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.max-lifetime) | Time, in seconds, that a connection can be pooled for before being destroyed. 0 denotes no limit. | `0` | | [`spring.jta.atomikos.connectionfactory.max-pool-size`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.max-pool-size) | Maximum size of the pool. | `1` | | [`spring.jta.atomikos.connectionfactory.min-pool-size`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.min-pool-size) | Minimum size of the pool. | `1` | | [`spring.jta.atomikos.connectionfactory.reap-timeout`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.reap-timeout) | Reap timeout, in seconds, for borrowed connections. 0 denotes no limit. | `0` | | [`spring.jta.atomikos.connectionfactory.unique-resource-name`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.unique-resource-name) | Unique name used to identify the resource during recovery. | `jmsConnectionFactory` | | [`spring.jta.atomikos.connectionfactory.xa-connection-factory-class-name`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.xa-connection-factory-class-name) | Vendor-specific implementation of XAConnectionFactory. | | | [`spring.jta.atomikos.connectionfactory.xa-properties`](#application-properties.transaction.spring.jta.atomikos.connectionfactory.xa-properties) | Vendor-specific XA properties. | | | [`spring.jta.atomikos.datasource.borrow-connection-timeout`](#application-properties.transaction.spring.jta.atomikos.datasource.borrow-connection-timeout) | Timeout, in seconds, for borrowing connections from the pool. | `30` | | [`spring.jta.atomikos.datasource.concurrent-connection-validation`](#application-properties.transaction.spring.jta.atomikos.datasource.concurrent-connection-validation) | Whether to use concurrent connection validation. | `true` | | [`spring.jta.atomikos.datasource.default-isolation-level`](#application-properties.transaction.spring.jta.atomikos.datasource.default-isolation-level) | Default isolation level of connections provided by the pool. | | | [`spring.jta.atomikos.datasource.login-timeout`](#application-properties.transaction.spring.jta.atomikos.datasource.login-timeout) | Timeout, in seconds, for establishing a database connection. | `0` | | [`spring.jta.atomikos.datasource.maintenance-interval`](#application-properties.transaction.spring.jta.atomikos.datasource.maintenance-interval) | Time, in seconds, between runs of the pool's maintenance thread. | `60` | | [`spring.jta.atomikos.datasource.max-idle-time`](#application-properties.transaction.spring.jta.atomikos.datasource.max-idle-time) | Time, in seconds, after which connections are cleaned up from the pool. | `60` | | [`spring.jta.atomikos.datasource.max-lifetime`](#application-properties.transaction.spring.jta.atomikos.datasource.max-lifetime) | Time, in seconds, that a connection can be pooled for before being destroyed. 0 denotes no limit. | `0` | | [`spring.jta.atomikos.datasource.max-pool-size`](#application-properties.transaction.spring.jta.atomikos.datasource.max-pool-size) | Maximum size of the pool. | `1` | | [`spring.jta.atomikos.datasource.min-pool-size`](#application-properties.transaction.spring.jta.atomikos.datasource.min-pool-size) | Minimum size of the pool. | `1` | | [`spring.jta.atomikos.datasource.reap-timeout`](#application-properties.transaction.spring.jta.atomikos.datasource.reap-timeout) | Reap timeout, in seconds, for borrowed connections. 0 denotes no limit. | `0` | | [`spring.jta.atomikos.datasource.test-query`](#application-properties.transaction.spring.jta.atomikos.datasource.test-query) | SQL query or statement used to validate a connection before returning it. | | | [`spring.jta.atomikos.datasource.unique-resource-name`](#application-properties.transaction.spring.jta.atomikos.datasource.unique-resource-name) | Unique name used to identify the resource during recovery. | `dataSource` | | [`spring.jta.atomikos.datasource.xa-data-source-class-name`](#application-properties.transaction.spring.jta.atomikos.datasource.xa-data-source-class-name) | Vendor-specific implementation of XADataSource. | | | [`spring.jta.atomikos.datasource.xa-properties`](#application-properties.transaction.spring.jta.atomikos.datasource.xa-properties) | Vendor-specific XA properties. | | | [`spring.jta.atomikos.properties.allow-sub-transactions`](#application-properties.transaction.spring.jta.atomikos.properties.allow-sub-transactions) | Specify whether sub-transactions are allowed. | `true` | | [`spring.jta.atomikos.properties.checkpoint-interval`](#application-properties.transaction.spring.jta.atomikos.properties.checkpoint-interval) | Interval between checkpoints, expressed as the number of log writes between two checkpoints. A checkpoint reduces the log file size at the expense of adding some overhead in the runtime. | `500` | | [`spring.jta.atomikos.properties.default-jta-timeout`](#application-properties.transaction.spring.jta.atomikos.properties.default-jta-timeout) | Default timeout for JTA transactions.
| `10000ms` | | [`spring.jta.atomikos.properties.default-max-wait-time-on-shutdown`](#application-properties.transaction.spring.jta.atomikos.properties.default-max-wait-time-on-shutdown) | How long should normal shutdown (no-force) wait for transactions to complete. | | | [`spring.jta.atomikos.properties.enable-logging`](#application-properties.transaction.spring.jta.atomikos.properties.enable-logging) | Whether to enable disk logging. | `true` | | [`spring.jta.atomikos.properties.force-shutdown-on-vm-exit`](#application-properties.transaction.spring.jta.atomikos.properties.force-shutdown-on-vm-exit) | Whether a VM shutdown should trigger forced shutdown of the transaction core. | `false` | | [`spring.jta.atomikos.properties.log-base-dir`](#application-properties.transaction.spring.jta.atomikos.properties.log-base-dir) | Directory in which the log files should be stored. Defaults to the current working directory. | | | [`spring.jta.atomikos.properties.log-base-name`](#application-properties.transaction.spring.jta.atomikos.properties.log-base-name) | Transactions log file base name. | `tmlog` | | [`spring.jta.atomikos.properties.max-actives`](#application-properties.transaction.spring.jta.atomikos.properties.max-actives) | Maximum number of active transactions. | `50` | | [`spring.jta.atomikos.properties.max-timeout`](#application-properties.transaction.spring.jta.atomikos.properties.max-timeout) | Maximum timeout that can be allowed for transactions. | `300000ms` | | [`spring.jta.atomikos.properties.recovery.delay`](#application-properties.transaction.spring.jta.atomikos.properties.recovery.delay) | Delay between two recovery scans. | `10000ms` | | [`spring.jta.atomikos.properties.recovery.forget-orphaned-log-entries-delay`](#application-properties.transaction.spring.jta.atomikos.properties.recovery.forget-orphaned-log-entries-delay) | Delay after which recovery can cleanup pending ('orphaned') log entries. | `86400000ms` | | [`spring.jta.atomikos.properties.recovery.max-retries`](#application-properties.transaction.spring.jta.atomikos.properties.recovery.max-retries) | Number of retry attempts to commit the transaction before throwing an exception. | `5` | | [`spring.jta.atomikos.properties.recovery.retry-interval`](#application-properties.transaction.spring.jta.atomikos.properties.recovery.retry-interval) | Delay between retry attempts. | `10000ms` | | [`spring.jta.atomikos.properties.serial-jta-transactions`](#application-properties.transaction.spring.jta.atomikos.properties.serial-jta-transactions) | Whether sub-transactions should be joined when possible. | `true` | | [`spring.jta.atomikos.properties.service`](#application-properties.transaction.spring.jta.atomikos.properties.service) | Transaction manager implementation that should be started. | | | [`spring.jta.atomikos.properties.threaded-two-phase-commit`](#application-properties.transaction.spring.jta.atomikos.properties.threaded-two-phase-commit) | Whether to use different (and concurrent) threads for two-phase commit on the participating resources. | `false` | | [`spring.jta.atomikos.properties.transaction-manager-unique-name`](#application-properties.transaction.spring.jta.atomikos.properties.transaction-manager-unique-name) | The transaction manager's unique name. Defaults to the machine's IP address. If you plan to run more than one transaction manager against one database you must set this property to a unique value. 
| | | [`spring.jta.enabled`](#application-properties.transaction.spring.jta.enabled) | Whether to enable JTA support. | `true` | | [`spring.jta.log-dir`](#application-properties.transaction.spring.jta.log-dir) | Transaction logs directory. | | | [`spring.jta.transaction-manager-id`](#application-properties.transaction.spring.jta.transaction-manager-id) | Transaction manager unique identifier. | | | [`spring.transaction.default-timeout`](#application-properties.transaction.spring.transaction.default-timeout) | Default transaction timeout. If a duration suffix is not specified, seconds will be used. | | | [`spring.transaction.rollback-on-commit-failure`](#application-properties.transaction.spring.transaction.rollback-on-commit-failure) | Whether to roll back on commit failures. | |
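
As a rough `application.properties` sketch, the Atomikos datasource settings above might be combined as follows. The XADataSource class, the `serverName`/`databaseName` XA properties, and the pool sizes are placeholders for whatever your JDBC vendor and environment require:

```properties
# JTA via Atomikos: register an XA-capable datasource and tune its pool
spring.jta.enabled=true
spring.jta.atomikos.datasource.unique-resource-name=dataSource
spring.jta.atomikos.datasource.xa-data-source-class-name=org.postgresql.xa.PGXADataSource
spring.jta.atomikos.datasource.xa-properties.serverName=localhost
spring.jta.atomikos.datasource.xa-properties.databaseName=mydb
spring.jta.atomikos.datasource.max-pool-size=10
spring.jta.atomikos.datasource.borrow-connection-timeout=30
spring.jta.atomikos.properties.default-jta-timeout=10000ms
spring.jta.atomikos.properties.max-timeout=300000ms
```
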

7. Data Migration Properties
-----------------------------

| Name | Description | Default Value | | --- | --- | --- | | [`spring.flyway.baseline-description`](#application-properties.data-migration.spring.flyway.baseline-description) | Description to tag an existing schema with when applying a baseline. | `<< Flyway Baseline >>` | | [`spring.flyway.baseline-migration-prefix`](#application-properties.data-migration.spring.flyway.baseline-migration-prefix) | Filename prefix for baseline migrations. Requires Flyway Teams. | `B` | | [`spring.flyway.baseline-on-migrate`](#application-properties.data-migration.spring.flyway.baseline-on-migrate) | Whether to automatically call baseline when migrating a non-empty schema. | `false` | | [`spring.flyway.baseline-version`](#application-properties.data-migration.spring.flyway.baseline-version) | Version to tag an existing schema with when executing baseline. | `1` | | [`spring.flyway.batch`](#application-properties.data-migration.spring.flyway.batch) | Whether to batch SQL statements when executing them. Requires Flyway Teams. | | | [`spring.flyway.cherry-pick`](#application-properties.data-migration.spring.flyway.cherry-pick) | Migrations that Flyway should consider when migrating or undoing. When empty all available migrations are considered. Requires Flyway Teams. | | | [`spring.flyway.clean-disabled`](#application-properties.data-migration.spring.flyway.clean-disabled) | Whether to disable cleaning of the database. | `false` | | [`spring.flyway.clean-on-validation-error`](#application-properties.data-migration.spring.flyway.clean-on-validation-error) | Whether to automatically call clean when a validation error occurs. | `false` | | [`spring.flyway.connect-retries`](#application-properties.data-migration.spring.flyway.connect-retries) | Maximum number of retries when attempting to connect to the database. | `0` | | [`spring.flyway.connect-retries-interval`](#application-properties.data-migration.spring.flyway.connect-retries-interval) | Maximum time between retries when attempting to connect to the database. If a duration suffix is not specified, seconds will be used. | `120` | | [`spring.flyway.create-schemas`](#application-properties.data-migration.spring.flyway.create-schemas) | Whether Flyway should attempt to create the schemas specified in the schemas property. | `true` | | [`spring.flyway.default-schema`](#application-properties.data-migration.spring.flyway.default-schema) | Default schema name managed by Flyway (case-sensitive). | | | [`spring.flyway.detect-encoding`](#application-properties.data-migration.spring.flyway.detect-encoding) | Whether to attempt to automatically detect SQL migration file encoding. Requires Flyway Teams. | | | [`spring.flyway.driver-class-name`](#application-properties.data-migration.spring.flyway.driver-class-name) | Fully qualified name of the JDBC driver. Auto-detected based on the URL by default. | | | [`spring.flyway.enabled`](#application-properties.data-migration.spring.flyway.enabled) | Whether to enable Flyway. | `true` | | [`spring.flyway.encoding`](#application-properties.data-migration.spring.flyway.encoding) | Encoding of SQL migrations. | `UTF-8` | | [`spring.flyway.error-overrides`](#application-properties.data-migration.spring.flyway.error-overrides) | Rules for the built-in error handling to override specific SQL states and error codes. Requires Flyway Teams. | | | [`spring.flyway.fail-on-missing-locations`](#application-properties.data-migration.spring.flyway.fail-on-missing-locations) | Whether to fail if a location of migration scripts doesn't exist. | `false` | | [`spring.flyway.group`](#application-properties.data-migration.spring.flyway.group) | Whether to group all pending migrations together in the same transaction when applying them. | `false` | | [`spring.flyway.ignore-migration-patterns`](#application-properties.data-migration.spring.flyway.ignore-migration-patterns) | Ignore migrations that match this comma-separated list of patterns when validating migrations. Requires Flyway Teams. | | | [`spring.flyway.init-sqls`](#application-properties.data-migration.spring.flyway.init-sqls) | SQL statements to execute to initialize a connection immediately after obtaining it. | | | [`spring.flyway.installed-by`](#application-properties.data-migration.spring.flyway.installed-by) | Username recorded in the schema history table as having applied the migration. | | | [`spring.flyway.jdbc-properties.*`](#application-properties.data-migration.spring.flyway.jdbc-properties) | Properties to pass to the JDBC driver. Requires Flyway Teams. | | | [`spring.flyway.kerberos-config-file`](#application-properties.data-migration.spring.flyway.kerberos-config-file) | Path of the Kerberos config file. Requires Flyway Teams. | | | [`spring.flyway.license-key`](#application-properties.data-migration.spring.flyway.license-key) | License key for Flyway Teams. | | | [`spring.flyway.locations`](#application-properties.data-migration.spring.flyway.locations) | Locations of migration scripts. Can contain the special "{vendor}" placeholder to use vendor-specific locations. | `[classpath:db/migration]` | | [`spring.flyway.lock-retry-count`](#application-properties.data-migration.spring.flyway.lock-retry-count) | Maximum number of retries when trying to obtain a lock. | `50` | | [`spring.flyway.mixed`](#application-properties.data-migration.spring.flyway.mixed) | Whether to allow mixing transactional and non-transactional statements within the same migration. | `false` | | [`spring.flyway.oracle-kerberos-cache-file`](#application-properties.data-migration.spring.flyway.oracle-kerberos-cache-file) | Path of the Oracle Kerberos cache file. Requires Flyway Teams. | | | [`spring.flyway.oracle-sqlplus`](#application-properties.data-migration.spring.flyway.oracle-sqlplus) | Whether to enable support for Oracle SQL\*Plus commands. Requires Flyway Teams. | | | [`spring.flyway.oracle-sqlplus-warn`](#application-properties.data-migration.spring.flyway.oracle-sqlplus-warn) | Whether to issue a warning rather than an error when a not-yet-supported Oracle SQL\*Plus statement is encountered. Requires Flyway Teams.
| | | [`spring.flyway.oracle-wallet-location`](#application-properties.data-migration.spring.flyway.oracle-wallet-location) | Location of the Oracle Wallet, used to sign-in to the database automatically. Requires Flyway Teams. | | | [`spring.flyway.out-of-order`](#application-properties.data-migration.spring.flyway.out-of-order) | Whether to allow migrations to be run out of order. | `false` | | [`spring.flyway.output-query-results`](#application-properties.data-migration.spring.flyway.output-query-results) | Whether Flyway should output a table with the results of queries when executing migrations. Requires Flyway Teams. | | | [`spring.flyway.password`](#application-properties.data-migration.spring.flyway.password) | Login password of the database to migrate. | | | [`spring.flyway.placeholder-prefix`](#application-properties.data-migration.spring.flyway.placeholder-prefix) | Prefix of placeholders in migration scripts. | `${` | | [`spring.flyway.placeholder-replacement`](#application-properties.data-migration.spring.flyway.placeholder-replacement) | Perform placeholder replacement in migration scripts. | `true` | | [`spring.flyway.placeholder-separator`](#application-properties.data-migration.spring.flyway.placeholder-separator) | Separator of default placeholders. | `:` | | [`spring.flyway.placeholder-suffix`](#application-properties.data-migration.spring.flyway.placeholder-suffix) | Suffix of placeholders in migration scripts. | `}` | | [`spring.flyway.placeholders.*`](#application-properties.data-migration.spring.flyway.placeholders) | Placeholders and their replacements to apply to sql migration scripts. | | | [`spring.flyway.repeatable-sql-migration-prefix`](#application-properties.data-migration.spring.flyway.repeatable-sql-migration-prefix) | File name prefix for repeatable SQL migrations. | `R` | | [`spring.flyway.schemas`](#application-properties.data-migration.spring.flyway.schemas) | Scheme names managed by Flyway (case-sensitive). | | | [`spring.flyway.script-placeholder-prefix`](#application-properties.data-migration.spring.flyway.script-placeholder-prefix) | Prefix of placeholders in migration scripts. | `FP__` | | [`spring.flyway.script-placeholder-suffix`](#application-properties.data-migration.spring.flyway.script-placeholder-suffix) | Suffix of placeholders in migration scripts. | `__` | | [`spring.flyway.skip-default-callbacks`](#application-properties.data-migration.spring.flyway.skip-default-callbacks) | Whether to skip default callbacks. If true, only custom callbacks are used. | `false` | | [`spring.flyway.skip-default-resolvers`](#application-properties.data-migration.spring.flyway.skip-default-resolvers) | Whether to skip default resolvers. If true, only custom resolvers are used. | `false` | | [`spring.flyway.skip-executing-migrations`](#application-properties.data-migration.spring.flyway.skip-executing-migrations) | Whether Flyway should skip executing the contents of the migrations and only update the schema history table. Requires Flyway teams. | | | [`spring.flyway.sql-migration-prefix`](#application-properties.data-migration.spring.flyway.sql-migration-prefix) | File name prefix for SQL migrations. | `V` | | [`spring.flyway.sql-migration-separator`](#application-properties.data-migration.spring.flyway.sql-migration-separator) | File name separator for SQL migrations. | `__` | | [`spring.flyway.sql-migration-suffixes`](#application-properties.data-migration.spring.flyway.sql-migration-suffixes) | File name suffix for SQL migrations. 
| `[.sql]` | | [`spring.flyway.sql-server-kerberos-login-file`](#application-properties.data-migration.spring.flyway.sql-server-kerberos-login-file) | Path to the SQL Server Kerberos login file. Requires Flyway Teams. | | | [`spring.flyway.stream`](#application-properties.data-migration.spring.flyway.stream) | Whether to stream SQL migrations when executing them. Requires Flyway Teams. | | | [`spring.flyway.table`](#application-properties.data-migration.spring.flyway.table) | Name of the schema history table that will be used by Flyway. | `flyway_schema_history` | | [`spring.flyway.tablespace`](#application-properties.data-migration.spring.flyway.tablespace) | Tablespace in which the schema history table is created. Ignored when using a database that does not support tablespaces. Defaults to the default tablespace of the connection used by Flyway. | | | [`spring.flyway.target`](#application-properties.data-migration.spring.flyway.target) | Target version up to which migrations should be considered. | | | [`spring.flyway.url`](#application-properties.data-migration.spring.flyway.url) | JDBC url of the database to migrate. If not set, the primary configured data source is used. | | | [`spring.flyway.user`](#application-properties.data-migration.spring.flyway.user) | Login user of the database to migrate. | | | [`spring.flyway.validate-migration-naming`](#application-properties.data-migration.spring.flyway.validate-migration-naming) | Whether to validate migrations and callbacks whose scripts do not obey the correct naming convention. | `false` | | [`spring.flyway.validate-on-migrate`](#application-properties.data-migration.spring.flyway.validate-on-migrate) | Whether to automatically call validate when performing a migration. | `true` | | [`spring.liquibase.change-log`](#application-properties.data-migration.spring.liquibase.change-log) | Change log configuration path. | `classpath:/db/changelog/db.changelog-master.yaml` | | [`spring.liquibase.clear-checksums`](#application-properties.data-migration.spring.liquibase.clear-checksums) | Whether to clear all checksums in the current changelog, so they will be recalculated upon the next update. | `false` | | [`spring.liquibase.contexts`](#application-properties.data-migration.spring.liquibase.contexts) | Comma-separated list of runtime contexts to use. | | | [`spring.liquibase.database-change-log-lock-table`](#application-properties.data-migration.spring.liquibase.database-change-log-lock-table) | Name of table to use for tracking concurrent Liquibase usage. | `DATABASECHANGELOGLOCK` | | [`spring.liquibase.database-change-log-table`](#application-properties.data-migration.spring.liquibase.database-change-log-table) | Name of table to use for tracking change history. | `DATABASECHANGELOG` | | [`spring.liquibase.default-schema`](#application-properties.data-migration.spring.liquibase.default-schema) | Default database schema. | | | [`spring.liquibase.driver-class-name`](#application-properties.data-migration.spring.liquibase.driver-class-name) | Fully qualified name of the JDBC driver. Auto-detected based on the URL by default. | | | [`spring.liquibase.drop-first`](#application-properties.data-migration.spring.liquibase.drop-first) | Whether to first drop the database schema. | `false` | | [`spring.liquibase.enabled`](#application-properties.data-migration.spring.liquibase.enabled) | Whether to enable Liquibase support. 
| `true` | | [`spring.liquibase.labels`](#application-properties.data-migration.spring.liquibase.labels) | Comma-separated list of runtime labels to use. | | | [`spring.liquibase.liquibase-schema`](#application-properties.data-migration.spring.liquibase.liquibase-schema) | Schema to use for Liquibase objects. | | | [`spring.liquibase.liquibase-tablespace`](#application-properties.data-migration.spring.liquibase.liquibase-tablespace) | Tablespace to use for Liquibase objects. | | | [`spring.liquibase.parameters.*`](#application-properties.data-migration.spring.liquibase.parameters) | Change log parameters. | | | [`spring.liquibase.password`](#application-properties.data-migration.spring.liquibase.password) | Login password of the database to migrate. | | | [`spring.liquibase.rollback-file`](#application-properties.data-migration.spring.liquibase.rollback-file) | File to which rollback SQL is written when an update is performed. | | | [`spring.liquibase.tag`](#application-properties.data-migration.spring.liquibase.tag) | Tag name to use when applying database changes. Can also be used with "rollbackFile" to generate a rollback script for all existing changes associated with that tag. | | | [`spring.liquibase.test-rollback-on-update`](#application-properties.data-migration.spring.liquibase.test-rollback-on-update) | Whether rollback should be tested before update is performed. | `false` | | [`spring.liquibase.url`](#application-properties.data-migration.spring.liquibase.url) | JDBC URL of the database to migrate. If not set, the primary configured data source is used. | | | [`spring.liquibase.user`](#application-properties.data-migration.spring.liquibase.user) | Login user of the database to migrate. | | | [`spring.sql.init.continue-on-error`](#application-properties.data-migration.spring.sql.init.continue-on-error) | Whether initialization should continue when an error occurs. | `false` | | [`spring.sql.init.data-locations`](#application-properties.data-migration.spring.sql.init.data-locations) | Locations of the data (DML) scripts to apply to the database. | | | [`spring.sql.init.encoding`](#application-properties.data-migration.spring.sql.init.encoding) | Encoding of the schema and data scripts. | | | [`spring.sql.init.mode`](#application-properties.data-migration.spring.sql.init.mode) | Mode to apply when determining whether initialization should be performed. | `embedded` | | [`spring.sql.init.password`](#application-properties.data-migration.spring.sql.init.password) | Password of the database to use when applying initialization scripts (if different). | | | [`spring.sql.init.platform`](#application-properties.data-migration.spring.sql.init.platform) | Platform to use in the default schema or data script locations, schema-${platform}.sql and data-${platform}.sql. | `all` | | [`spring.sql.init.schema-locations`](#application-properties.data-migration.spring.sql.init.schema-locations) | Locations of the schema (DDL) scripts to apply to the database. | | | [`spring.sql.init.separator`](#application-properties.data-migration.spring.sql.init.separator) | Statement separator in the schema and data scripts. | `;` | | [`spring.sql.init.username`](#application-properties.data-migration.spring.sql.init.username) | Username of the database to use when applying initialization scripts (if different). | |
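
A minimal `application.properties` sketch for the migration properties above; the script locations are illustrative placeholders, and in practice a given schema would typically be managed by either Flyway migrations or script-based initialization rather than both:

```properties
# Flyway migrations (applied from the default classpath location)
spring.flyway.enabled=true
spring.flyway.locations=classpath:db/migration
spring.flyway.baseline-on-migrate=true
spring.flyway.validate-on-migrate=true

# Script-based SQL initialization (DDL and DML scripts)
spring.sql.init.mode=embedded
spring.sql.init.schema-locations=classpath:schema.sql
spring.sql.init.data-locations=classpath:data.sql
spring.sql.init.separator=;
```
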

8. Integration Properties
--------------------------

| Name | Description | Default Value | | --- | --- | --- | | [`spring.activemq.broker-url`](#application-properties.integration.spring.activemq.broker-url) | URL of the ActiveMQ broker. Auto-generated by default. | | | [`spring.activemq.close-timeout`](#application-properties.integration.spring.activemq.close-timeout) | Time to wait before considering a close complete. | `15s` | | [`spring.activemq.in-memory`](#application-properties.integration.spring.activemq.in-memory) | Whether the default broker URL should be in memory. Ignored if an explicit broker has been specified. | `true` | | [`spring.activemq.non-blocking-redelivery`](#application-properties.integration.spring.activemq.non-blocking-redelivery) | Whether to stop message delivery before re-delivering messages from a rolled back transaction. This implies that message order is not preserved when this is enabled. | `false` | | [`spring.activemq.packages.trust-all`](#application-properties.integration.spring.activemq.packages.trust-all) | Whether to trust all packages. | | | [`spring.activemq.packages.trusted`](#application-properties.integration.spring.activemq.packages.trusted) | Comma-separated list of specific packages to trust (when not trusting all packages). | | | [`spring.activemq.password`](#application-properties.integration.spring.activemq.password) | Login password of the broker. | | | [`spring.activemq.pool.block-if-full`](#application-properties.integration.spring.activemq.pool.block-if-full) | Whether to block when a connection is requested and the pool is full. Set it to false to throw a "JMSException" instead. | `true` | | [`spring.activemq.pool.block-if-full-timeout`](#application-properties.integration.spring.activemq.pool.block-if-full-timeout) | Blocking period before throwing an exception if the pool is still full. | `-1ms` | | [`spring.activemq.pool.enabled`](#application-properties.integration.spring.activemq.pool.enabled) | Whether a JmsPoolConnectionFactory should be created, instead of a regular ConnectionFactory. | `false` | | [`spring.activemq.pool.idle-timeout`](#application-properties.integration.spring.activemq.pool.idle-timeout) | Connection idle timeout. | `30s` | | [`spring.activemq.pool.max-connections`](#application-properties.integration.spring.activemq.pool.max-connections) | Maximum number of pooled connections. | `1` | | [`spring.activemq.pool.max-sessions-per-connection`](#application-properties.integration.spring.activemq.pool.max-sessions-per-connection) | Maximum number of pooled sessions per connection in the pool. | `500` | | [`spring.activemq.pool.time-between-expiration-check`](#application-properties.integration.spring.activemq.pool.time-between-expiration-check) | Time to sleep between runs of the idle connection eviction thread. When negative, no idle connection eviction thread runs. | `-1ms` | | [`spring.activemq.pool.use-anonymous-producers`](#application-properties.integration.spring.activemq.pool.use-anonymous-producers) | Whether to use only one anonymous "MessageProducer" instance. Set it to false to create one "MessageProducer" every time one is required. | `true` | | [`spring.activemq.send-timeout`](#application-properties.integration.spring.activemq.send-timeout) | Time to wait on message sends for a response. Set it to 0 to wait forever. | `0ms` | | [`spring.activemq.user`](#application-properties.integration.spring.activemq.user) | Login user of the broker.
| | | [`spring.artemis.broker-url`](#application-properties.integration.spring.artemis.broker-url) | Artemis broker port. | `tcp://localhost:61616` | | [`spring.artemis.embedded.cluster-password`](#application-properties.integration.spring.artemis.embedded.cluster-password) | Cluster password. Randomly generated on startup by default. | | | [`spring.artemis.embedded.data-directory`](#application-properties.integration.spring.artemis.embedded.data-directory) | Journal file directory. Not necessary if persistence is turned off. | | | [`spring.artemis.embedded.enabled`](#application-properties.integration.spring.artemis.embedded.enabled) | Whether to enable embedded mode if the Artemis server APIs are available. | `true` | | [`spring.artemis.embedded.persistent`](#application-properties.integration.spring.artemis.embedded.persistent) | Whether to enable persistent store. | `false` | | [`spring.artemis.embedded.queues`](#application-properties.integration.spring.artemis.embedded.queues) | Comma-separated list of queues to create on startup. | `[]` | | [`spring.artemis.embedded.server-id`](#application-properties.integration.spring.artemis.embedded.server-id) | Server ID. By default, an auto-incremented counter is used. | `0` | | [`spring.artemis.embedded.topics`](#application-properties.integration.spring.artemis.embedded.topics) | Comma-separated list of topics to create on startup. | `[]` | | [`spring.artemis.mode`](#application-properties.integration.spring.artemis.mode) | Artemis deployment mode, auto-detected by default. | | | [`spring.artemis.password`](#application-properties.integration.spring.artemis.password) | Login password of the broker. | | | [`spring.artemis.pool.block-if-full`](#application-properties.integration.spring.artemis.pool.block-if-full) | Whether to block when a connection is requested and the pool is full. Set it to false to throw a "JMSException" instead. | `true` | | [`spring.artemis.pool.block-if-full-timeout`](#application-properties.integration.spring.artemis.pool.block-if-full-timeout) | Blocking period before throwing an exception if the pool is still full. | `-1ms` | | [`spring.artemis.pool.enabled`](#application-properties.integration.spring.artemis.pool.enabled) | Whether a JmsPoolConnectionFactory should be created, instead of a regular ConnectionFactory. | `false` | | [`spring.artemis.pool.idle-timeout`](#application-properties.integration.spring.artemis.pool.idle-timeout) | Connection idle timeout. | `30s` | | [`spring.artemis.pool.max-connections`](#application-properties.integration.spring.artemis.pool.max-connections) | Maximum number of pooled connections. | `1` | | [`spring.artemis.pool.max-sessions-per-connection`](#application-properties.integration.spring.artemis.pool.max-sessions-per-connection) | Maximum number of pooled sessions per connection in the pool. | `500` | | [`spring.artemis.pool.time-between-expiration-check`](#application-properties.integration.spring.artemis.pool.time-between-expiration-check) | Time to sleep between runs of the idle connection eviction thread. When negative, no idle connection eviction thread runs. | `-1ms` | | [`spring.artemis.pool.use-anonymous-producers`](#application-properties.integration.spring.artemis.pool.use-anonymous-producers) | Whether to use only one anonymous "MessageProducer" instance. Set it to false to create one "MessageProducer" every time one is required. | `true` | | [`spring.artemis.user`](#application-properties.integration.spring.artemis.user) | Login user of the broker. 
| | | [`spring.batch.jdbc.initialize-schema`](#application-properties.integration.spring.batch.jdbc.initialize-schema) | Database schema initialization mode. | `embedded` | | [`spring.batch.jdbc.isolation-level-for-create`](#application-properties.integration.spring.batch.jdbc.isolation-level-for-create) | Transaction isolation level to use when creating job meta-data for new jobs. Auto-detected based on whether JPA is being used or not. | | | [`spring.batch.jdbc.platform`](#application-properties.integration.spring.batch.jdbc.platform) | Platform to use in initialization scripts if the @@platform@@ placeholder is used. Auto-detected by default. | | | [`spring.batch.jdbc.schema`](#application-properties.integration.spring.batch.jdbc.schema) | Path to the SQL file to use to initialize the database schema. | `classpath:org/springframework/batch/core/schema-@@platform@@.sql` | | [`spring.batch.jdbc.table-prefix`](#application-properties.integration.spring.batch.jdbc.table-prefix) | Table prefix for all the batch meta-data tables. | | | [`spring.batch.job.enabled`](#application-properties.integration.spring.batch.job.enabled) | Execute all Spring Batch jobs in the context on startup. | `true` | | [`spring.batch.job.names`](#application-properties.integration.spring.batch.job.names) | Comma-separated list of job names to execute on startup (for instance, 'job1,job2'). By default, all Jobs found in the context are executed. | | | [`spring.hazelcast.config`](#application-properties.integration.spring.hazelcast.config) | The location of the configuration file to use to initialize Hazelcast. | | | [`spring.integration.channel.auto-create`](#application-properties.integration.spring.integration.channel.auto-create) | Whether to create input channels if necessary. | `true` | | [`spring.integration.channel.max-broadcast-subscribers`](#application-properties.integration.spring.integration.channel.max-broadcast-subscribers) | Default number of subscribers allowed on, for example, a 'PublishSubscribeChannel'. | | | [`spring.integration.channel.max-unicast-subscribers`](#application-properties.integration.spring.integration.channel.max-unicast-subscribers) | Default number of subscribers allowed on, for example, a 'DirectChannel'. | | | [`spring.integration.endpoint.no-auto-startup`](#application-properties.integration.spring.integration.endpoint.no-auto-startup) | A comma-separated list of endpoint bean names patterns that should not be started automatically during application startup. | | | [`spring.integration.endpoint.read-only-headers`](#application-properties.integration.spring.integration.endpoint.read-only-headers) | A comma-separated list of message header names that should not be populated into Message instances during a header copying operation. | | | [`spring.integration.endpoint.throw-exception-on-late-reply`](#application-properties.integration.spring.integration.endpoint.throw-exception-on-late-reply) | Whether to throw an exception when a reply is not expected anymore by a gateway. | `false` | | [`spring.integration.error.ignore-failures`](#application-properties.integration.spring.integration.error.ignore-failures) | Whether to ignore failures for one or more of the handlers of the global 'errorChannel'. | `true` | | [`spring.integration.error.require-subscribers`](#application-properties.integration.spring.integration.error.require-subscribers) | Whether to not silently ignore messages on the global 'errorChannel' when they are no subscribers. 
| `true` | | [`spring.integration.jdbc.initialize-schema`](#application-properties.integration.spring.integration.jdbc.initialize-schema) | Database schema initialization mode. | `embedded` | | [`spring.integration.jdbc.platform`](#application-properties.integration.spring.integration.jdbc.platform) | Platform to use in initialization scripts if the @@platform@@ placeholder is used. Auto-detected by default. | | | [`spring.integration.jdbc.schema`](#application-properties.integration.spring.integration.jdbc.schema) | Path to the SQL file to use to initialize the database schema. | `classpath:org/springframework/integration/jdbc/schema-@@platform@@.sql` | | [`spring.integration.management.default-logging-enabled`](#application-properties.integration.spring.integration.management.default-logging-enabled) | Whether Spring Integration components should perform logging in the main message flow. When disabled, such logging will be skipped without checking the logging level. When enabled, such logging is controlled as normal by the logging system's log level configuration. | `true` | | [`spring.integration.poller.cron`](#application-properties.integration.spring.integration.poller.cron) | Cron expression for polling. Mutually exclusive with 'fixedDelay' and 'fixedRate'. | | | [`spring.integration.poller.fixed-delay`](#application-properties.integration.spring.integration.poller.fixed-delay) | Polling delay period. Mutually exclusive with 'cron' and 'fixedRate'. | | | [`spring.integration.poller.fixed-rate`](#application-properties.integration.spring.integration.poller.fixed-rate) | Polling rate period. Mutually exclusive with 'fixedDelay' and 'cron'. | | | [`spring.integration.poller.initial-delay`](#application-properties.integration.spring.integration.poller.initial-delay) | Polling initial delay. Applied for 'fixedDelay' and 'fixedRate'; ignored for 'cron'. | | | [`spring.integration.poller.max-messages-per-poll`](#application-properties.integration.spring.integration.poller.max-messages-per-poll) | Maximum number of messages to poll per polling cycle. | | | [`spring.integration.poller.receive-timeout`](#application-properties.integration.spring.integration.poller.receive-timeout) | How long to wait for messages on poll. | `1s` | | [`spring.integration.rsocket.client.host`](#application-properties.integration.spring.integration.rsocket.client.host) | TCP RSocket server host to connect to. | | | [`spring.integration.rsocket.client.port`](#application-properties.integration.spring.integration.rsocket.client.port) | TCP RSocket server port to connect to. | | | [`spring.integration.rsocket.client.uri`](#application-properties.integration.spring.integration.rsocket.client.uri) | WebSocket RSocket server uri to connect to. | | | [`spring.integration.rsocket.server.message-mapping-enabled`](#application-properties.integration.spring.integration.rsocket.server.message-mapping-enabled) | Whether to handle message mapping for RSocket via Spring Integration. | `false` | | [`spring.jms.cache.consumers`](#application-properties.integration.spring.jms.cache.consumers) | Whether to cache message consumers. | `false` | | [`spring.jms.cache.enabled`](#application-properties.integration.spring.jms.cache.enabled) | Whether to cache sessions. | `true` | | [`spring.jms.cache.producers`](#application-properties.integration.spring.jms.cache.producers) | Whether to cache message producers. 
| `true` | | [`spring.jms.cache.session-cache-size`](#application-properties.integration.spring.jms.cache.session-cache-size) | Size of the session cache (per JMS Session type). | `1` | | [`spring.jms.jndi-name`](#application-properties.integration.spring.jms.jndi-name) | Connection factory JNDI name. When set, takes precedence to others connection factory auto-configurations. | | | [`spring.jms.listener.acknowledge-mode`](#application-properties.integration.spring.jms.listener.acknowledge-mode) | Acknowledge mode of the container. By default, the listener is transacted with automatic acknowledgment. | | | [`spring.jms.listener.auto-startup`](#application-properties.integration.spring.jms.listener.auto-startup) | Start the container automatically on startup. | `true` | | [`spring.jms.listener.concurrency`](#application-properties.integration.spring.jms.listener.concurrency) | Minimum number of concurrent consumers. | | | [`spring.jms.listener.max-concurrency`](#application-properties.integration.spring.jms.listener.max-concurrency) | Maximum number of concurrent consumers. | | | [`spring.jms.listener.receive-timeout`](#application-properties.integration.spring.jms.listener.receive-timeout) | Timeout to use for receive calls. Use -1 for a no-wait receive or 0 for no timeout at all. The latter is only feasible if not running within a transaction manager and is generally discouraged since it prevents clean shutdown. | `1s` | | [`spring.jms.pub-sub-domain`](#application-properties.integration.spring.jms.pub-sub-domain) | Whether the default destination type is topic. | `false` | | [`spring.jms.template.default-destination`](#application-properties.integration.spring.jms.template.default-destination) | Default destination to use on send and receive operations that do not have a destination parameter. | | | [`spring.jms.template.delivery-delay`](#application-properties.integration.spring.jms.template.delivery-delay) | Delivery delay to use for send calls. | | | [`spring.jms.template.delivery-mode`](#application-properties.integration.spring.jms.template.delivery-mode) | Delivery mode. Enables QoS (Quality of Service) when set. | | | [`spring.jms.template.priority`](#application-properties.integration.spring.jms.template.priority) | Priority of a message when sending. Enables QoS (Quality of Service) when set. | | | [`spring.jms.template.qos-enabled`](#application-properties.integration.spring.jms.template.qos-enabled) | Whether to enable explicit QoS (Quality of Service) when sending a message. When enabled, the delivery mode, priority and time-to-live properties will be used when sending a message. QoS is automatically enabled when at least one of those settings is customized. | | | [`spring.jms.template.receive-timeout`](#application-properties.integration.spring.jms.template.receive-timeout) | Timeout to use for receive calls. | | | [`spring.jms.template.time-to-live`](#application-properties.integration.spring.jms.template.time-to-live) | Time-to-live of a message when sending. Enables QoS (Quality of Service) when set. | | | [`spring.kafka.admin.client-id`](#application-properties.integration.spring.kafka.admin.client-id) | ID to pass to the server when making requests. Used for server-side logging. | | | [`spring.kafka.admin.fail-fast`](#application-properties.integration.spring.kafka.admin.fail-fast) | Whether to fail fast if the broker is not available on startup. 
| `false` | | [`spring.kafka.admin.properties.*`](#application-properties.integration.spring.kafka.admin.properties) | Additional admin-specific properties used to configure the client. | | | [`spring.kafka.admin.security.protocol`](#application-properties.integration.spring.kafka.admin.security.protocol) | Security protocol used to communicate with brokers. | | | [`spring.kafka.admin.ssl.key-password`](#application-properties.integration.spring.kafka.admin.ssl.key-password) | Password of the private key in either key store key or key store file. | | | [`spring.kafka.admin.ssl.key-store-certificate-chain`](#application-properties.integration.spring.kafka.admin.ssl.key-store-certificate-chain) | Certificate chain in PEM format with a list of X.509 certificates. | | | [`spring.kafka.admin.ssl.key-store-key`](#application-properties.integration.spring.kafka.admin.ssl.key-store-key) | Private key in PEM format with PKCS#8 keys. | | | [`spring.kafka.admin.ssl.key-store-location`](#application-properties.integration.spring.kafka.admin.ssl.key-store-location) | Location of the key store file. | | | [`spring.kafka.admin.ssl.key-store-password`](#application-properties.integration.spring.kafka.admin.ssl.key-store-password) | Store password for the key store file. | | | [`spring.kafka.admin.ssl.key-store-type`](#application-properties.integration.spring.kafka.admin.ssl.key-store-type) | Type of the key store. | | | [`spring.kafka.admin.ssl.protocol`](#application-properties.integration.spring.kafka.admin.ssl.protocol) | SSL protocol to use. | | | [`spring.kafka.admin.ssl.trust-store-certificates`](#application-properties.integration.spring.kafka.admin.ssl.trust-store-certificates) | Trusted certificates in PEM format with X.509 certificates. | | | [`spring.kafka.admin.ssl.trust-store-location`](#application-properties.integration.spring.kafka.admin.ssl.trust-store-location) | Location of the trust store file. | | | [`spring.kafka.admin.ssl.trust-store-password`](#application-properties.integration.spring.kafka.admin.ssl.trust-store-password) | Store password for the trust store file. | | | [`spring.kafka.admin.ssl.trust-store-type`](#application-properties.integration.spring.kafka.admin.ssl.trust-store-type) | Type of the trust store. | | | [`spring.kafka.bootstrap-servers`](#application-properties.integration.spring.kafka.bootstrap-servers) | Comma-delimited list of host:port pairs to use for establishing the initial connections to the Kafka cluster. Applies to all components unless overridden. | | | [`spring.kafka.client-id`](#application-properties.integration.spring.kafka.client-id) | ID to pass to the server when making requests. Used for server-side logging. | | | [`spring.kafka.consumer.auto-commit-interval`](#application-properties.integration.spring.kafka.consumer.auto-commit-interval) | Frequency with which the consumer offsets are auto-committed to Kafka if 'enable.auto.commit' is set to true. | | | [`spring.kafka.consumer.auto-offset-reset`](#application-properties.integration.spring.kafka.consumer.auto-offset-reset) | What to do when there is no initial offset in Kafka or if the current offset no longer exists on the server. | | | [`spring.kafka.consumer.bootstrap-servers`](#application-properties.integration.spring.kafka.consumer.bootstrap-servers) | Comma-delimited list of host:port pairs to use for establishing the initial connections to the Kafka cluster. Overrides the global property, for consumers. 
| | | [`spring.kafka.consumer.client-id`](#application-properties.integration.spring.kafka.consumer.client-id) | ID to pass to the server when making requests. Used for server-side logging. | | | [`spring.kafka.consumer.enable-auto-commit`](#application-properties.integration.spring.kafka.consumer.enable-auto-commit) | Whether the consumer's offset is periodically committed in the background. | | | [`spring.kafka.consumer.fetch-max-wait`](#application-properties.integration.spring.kafka.consumer.fetch-max-wait) | Maximum amount of time the server blocks before answering the fetch request if there isn't sufficient data to immediately satisfy the requirement given by "fetch-min-size". | | | [`spring.kafka.consumer.fetch-min-size`](#application-properties.integration.spring.kafka.consumer.fetch-min-size) | Minimum amount of data the server should return for a fetch request. | | | [`spring.kafka.consumer.group-id`](#application-properties.integration.spring.kafka.consumer.group-id) | Unique string that identifies the consumer group to which this consumer belongs. | | | [`spring.kafka.consumer.heartbeat-interval`](#application-properties.integration.spring.kafka.consumer.heartbeat-interval) | Expected time between heartbeats to the consumer coordinator. | | | [`spring.kafka.consumer.isolation-level`](#application-properties.integration.spring.kafka.consumer.isolation-level) | Isolation level for reading messages that have been written transactionally. | `read-uncommitted` | | [`spring.kafka.consumer.key-deserializer`](#application-properties.integration.spring.kafka.consumer.key-deserializer) | Deserializer class for keys. | | | [`spring.kafka.consumer.max-poll-records`](#application-properties.integration.spring.kafka.consumer.max-poll-records) | Maximum number of records returned in a single call to poll(). | | | [`spring.kafka.consumer.properties.*`](#application-properties.integration.spring.kafka.consumer.properties) | Additional consumer-specific properties used to configure the client. | | | [`spring.kafka.consumer.security.protocol`](#application-properties.integration.spring.kafka.consumer.security.protocol) | Security protocol used to communicate with brokers. | | | [`spring.kafka.consumer.ssl.key-password`](#application-properties.integration.spring.kafka.consumer.ssl.key-password) | Password of the private key in either key store key or key store file. | | | [`spring.kafka.consumer.ssl.key-store-certificate-chain`](#application-properties.integration.spring.kafka.consumer.ssl.key-store-certificate-chain) | Certificate chain in PEM format with a list of X.509 certificates. | | | [`spring.kafka.consumer.ssl.key-store-key`](#application-properties.integration.spring.kafka.consumer.ssl.key-store-key) | Private key in PEM format with PKCS#8 keys. | | | [`spring.kafka.consumer.ssl.key-store-location`](#application-properties.integration.spring.kafka.consumer.ssl.key-store-location) | Location of the key store file. | | | [`spring.kafka.consumer.ssl.key-store-password`](#application-properties.integration.spring.kafka.consumer.ssl.key-store-password) | Store password for the key store file. | | | [`spring.kafka.consumer.ssl.key-store-type`](#application-properties.integration.spring.kafka.consumer.ssl.key-store-type) | Type of the key store. | | | [`spring.kafka.consumer.ssl.protocol`](#application-properties.integration.spring.kafka.consumer.ssl.protocol) | SSL protocol to use. 
| | | [`spring.kafka.consumer.ssl.trust-store-certificates`](#application-properties.integration.spring.kafka.consumer.ssl.trust-store-certificates) | Trusted certificates in PEM format with X.509 certificates. | | | [`spring.kafka.consumer.ssl.trust-store-location`](#application-properties.integration.spring.kafka.consumer.ssl.trust-store-location) | Location of the trust store file. | | | [`spring.kafka.consumer.ssl.trust-store-password`](#application-properties.integration.spring.kafka.consumer.ssl.trust-store-password) | Store password for the trust store file. | | | [`spring.kafka.consumer.ssl.trust-store-type`](#application-properties.integration.spring.kafka.consumer.ssl.trust-store-type) | Type of the trust store. | | | [`spring.kafka.consumer.value-deserializer`](#application-properties.integration.spring.kafka.consumer.value-deserializer) | Deserializer class for values. | | | [`spring.kafka.jaas.control-flag`](#application-properties.integration.spring.kafka.jaas.control-flag) | Control flag for login configuration. | `required` | | [`spring.kafka.jaas.enabled`](#application-properties.integration.spring.kafka.jaas.enabled) | Whether to enable JAAS configuration. | `false` | | [`spring.kafka.jaas.login-module`](#application-properties.integration.spring.kafka.jaas.login-module) | Login module. | `com.sun.security.auth.module.Krb5LoginModule` | | [`spring.kafka.jaas.options.*`](#application-properties.integration.spring.kafka.jaas.options) | Additional JAAS options. | | | [`spring.kafka.listener.ack-count`](#application-properties.integration.spring.kafka.listener.ack-count) | Number of records between offset commits when ackMode is "COUNT" or "COUNT\_TIME". | | | [`spring.kafka.listener.ack-mode`](#application-properties.integration.spring.kafka.listener.ack-mode) | Listener AckMode. See the spring-kafka documentation. | | | [`spring.kafka.listener.ack-time`](#application-properties.integration.spring.kafka.listener.ack-time) | Time between offset commits when ackMode is "TIME" or "COUNT\_TIME". | | | [`spring.kafka.listener.client-id`](#application-properties.integration.spring.kafka.listener.client-id) | Prefix for the listener's consumer client.id property. | | | [`spring.kafka.listener.concurrency`](#application-properties.integration.spring.kafka.listener.concurrency) | Number of threads to run in the listener containers. | | | [`spring.kafka.listener.idle-between-polls`](#application-properties.integration.spring.kafka.listener.idle-between-polls) | Sleep interval between Consumer.poll(Duration) calls. | `0` | | [`spring.kafka.listener.idle-event-interval`](#application-properties.integration.spring.kafka.listener.idle-event-interval) | Time between publishing idle consumer events (no data received). | | | [`spring.kafka.listener.idle-partition-event-interval`](#application-properties.integration.spring.kafka.listener.idle-partition-event-interval) | Time between publishing idle partition consumer events (no data received for partition). | | | [`spring.kafka.listener.immediate-stop`](#application-properties.integration.spring.kafka.listener.immediate-stop) | Whether the container stops after the current record is processed or after all the records from the previous poll are processed. | `false` | | [`spring.kafka.listener.log-container-config`](#application-properties.integration.spring.kafka.listener.log-container-config) | Whether to log the container configuration during initialization (INFO level). 
| | | [`spring.kafka.listener.missing-topics-fatal`](#application-properties.integration.spring.kafka.listener.missing-topics-fatal) | Whether the container should fail to start if at least one of the configured topics are not present on the broker. | `false` | | [`spring.kafka.listener.monitor-interval`](#application-properties.integration.spring.kafka.listener.monitor-interval) | Time between checks for non-responsive consumers. If a duration suffix is not specified, seconds will be used. | | | [`spring.kafka.listener.no-poll-threshold`](#application-properties.integration.spring.kafka.listener.no-poll-threshold) | Multiplier applied to "pollTimeout" to determine if a consumer is non-responsive. | | | [`spring.kafka.listener.poll-timeout`](#application-properties.integration.spring.kafka.listener.poll-timeout) | Timeout to use when polling the consumer. | | | [`spring.kafka.listener.type`](#application-properties.integration.spring.kafka.listener.type) | Listener type. | `single` | | [`spring.kafka.producer.acks`](#application-properties.integration.spring.kafka.producer.acks) | Number of acknowledgments the producer requires the leader to have received before considering a request complete. | | | [`spring.kafka.producer.batch-size`](#application-properties.integration.spring.kafka.producer.batch-size) | Default batch size. A small batch size will make batching less common and may reduce throughput (a batch size of zero disables batching entirely). | | | [`spring.kafka.producer.bootstrap-servers`](#application-properties.integration.spring.kafka.producer.bootstrap-servers) | Comma-delimited list of host:port pairs to use for establishing the initial connections to the Kafka cluster. Overrides the global property, for producers. | | | [`spring.kafka.producer.buffer-memory`](#application-properties.integration.spring.kafka.producer.buffer-memory) | Total memory size the producer can use to buffer records waiting to be sent to the server. | | | [`spring.kafka.producer.client-id`](#application-properties.integration.spring.kafka.producer.client-id) | ID to pass to the server when making requests. Used for server-side logging. | | | [`spring.kafka.producer.compression-type`](#application-properties.integration.spring.kafka.producer.compression-type) | Compression type for all data generated by the producer. | | | [`spring.kafka.producer.key-serializer`](#application-properties.integration.spring.kafka.producer.key-serializer) | Serializer class for keys. | | | [`spring.kafka.producer.properties.*`](#application-properties.integration.spring.kafka.producer.properties) | Additional producer-specific properties used to configure the client. | | | [`spring.kafka.producer.retries`](#application-properties.integration.spring.kafka.producer.retries) | When greater than zero, enables retrying of failed sends. | | | [`spring.kafka.producer.security.protocol`](#application-properties.integration.spring.kafka.producer.security.protocol) | Security protocol used to communicate with brokers. | | | [`spring.kafka.producer.ssl.key-password`](#application-properties.integration.spring.kafka.producer.ssl.key-password) | Password of the private key in either key store key or key store file. | | | [`spring.kafka.producer.ssl.key-store-certificate-chain`](#application-properties.integration.spring.kafka.producer.ssl.key-store-certificate-chain) | Certificate chain in PEM format with a list of X.509 certificates. 
| |
| [`spring.kafka.producer.ssl.key-store-key`](#application-properties.integration.spring.kafka.producer.ssl.key-store-key) | Private key in PEM format with PKCS#8 keys. | |
| [`spring.kafka.producer.ssl.key-store-location`](#application-properties.integration.spring.kafka.producer.ssl.key-store-location) | Location of the key store file. | |
| [`spring.kafka.producer.ssl.key-store-password`](#application-properties.integration.spring.kafka.producer.ssl.key-store-password) | Store password for the key store file. | |
| [`spring.kafka.producer.ssl.key-store-type`](#application-properties.integration.spring.kafka.producer.ssl.key-store-type) | Type of the key store. | |
| [`spring.kafka.producer.ssl.protocol`](#application-properties.integration.spring.kafka.producer.ssl.protocol) | SSL protocol to use. | |
| [`spring.kafka.producer.ssl.trust-store-certificates`](#application-properties.integration.spring.kafka.producer.ssl.trust-store-certificates) | Trusted certificates in PEM format with X.509 certificates. | |
| [`spring.kafka.producer.ssl.trust-store-location`](#application-properties.integration.spring.kafka.producer.ssl.trust-store-location) | Location of the trust store file. | |
| [`spring.kafka.producer.ssl.trust-store-password`](#application-properties.integration.spring.kafka.producer.ssl.trust-store-password) | Store password for the trust store file. | |
| [`spring.kafka.producer.ssl.trust-store-type`](#application-properties.integration.spring.kafka.producer.ssl.trust-store-type) | Type of the trust store. | |
| [`spring.kafka.producer.transaction-id-prefix`](#application-properties.integration.spring.kafka.producer.transaction-id-prefix) | When non-empty, enables transaction support for the producer. | |
| [`spring.kafka.producer.value-serializer`](#application-properties.integration.spring.kafka.producer.value-serializer) | Serializer class for values. | |
| [`spring.kafka.properties.*`](#application-properties.integration.spring.kafka.properties) | Additional properties, common to producers and consumers, used to configure the client. | |
| [`spring.kafka.retry.topic.attempts`](#application-properties.integration.spring.kafka.retry.topic.attempts) | Total number of processing attempts made before sending the message to the DLT. | `3` |
| [`spring.kafka.retry.topic.delay`](#application-properties.integration.spring.kafka.retry.topic.delay) | Canonical backoff period. Used as an initial value in the exponential case, and as a minimum value in the uniform case. | `1s` |
| [`spring.kafka.retry.topic.enabled`](#application-properties.integration.spring.kafka.retry.topic.enabled) | Whether to enable topic-based non-blocking retries. | `false` |
| [`spring.kafka.retry.topic.max-delay`](#application-properties.integration.spring.kafka.retry.topic.max-delay) | Maximum wait between retries. If less than the delay, the default of 30 seconds is applied. | `0` |
| [`spring.kafka.retry.topic.multiplier`](#application-properties.integration.spring.kafka.retry.topic.multiplier) | Multiplier to use for generating the next backoff delay. | `0` |
| [`spring.kafka.retry.topic.random-back-off`](#application-properties.integration.spring.kafka.retry.topic.random-back-off) | Whether to randomize the backoff delays. | `false` |
| [`spring.kafka.security.protocol`](#application-properties.integration.spring.kafka.security.protocol) | Security protocol used to communicate with brokers.
| | | [`spring.kafka.ssl.key-password`](#application-properties.integration.spring.kafka.ssl.key-password) | Password of the private key in either key store key or key store file. | | | [`spring.kafka.ssl.key-store-certificate-chain`](#application-properties.integration.spring.kafka.ssl.key-store-certificate-chain) | Certificate chain in PEM format with a list of X.509 certificates. | | | [`spring.kafka.ssl.key-store-key`](#application-properties.integration.spring.kafka.ssl.key-store-key) | Private key in PEM format with PKCS#8 keys. | | | [`spring.kafka.ssl.key-store-location`](#application-properties.integration.spring.kafka.ssl.key-store-location) | Location of the key store file. | | | [`spring.kafka.ssl.key-store-password`](#application-properties.integration.spring.kafka.ssl.key-store-password) | Store password for the key store file. | | | [`spring.kafka.ssl.key-store-type`](#application-properties.integration.spring.kafka.ssl.key-store-type) | Type of the key store. | | | [`spring.kafka.ssl.protocol`](#application-properties.integration.spring.kafka.ssl.protocol) | SSL protocol to use. | | | [`spring.kafka.ssl.trust-store-certificates`](#application-properties.integration.spring.kafka.ssl.trust-store-certificates) | Trusted certificates in PEM format with X.509 certificates. | | | [`spring.kafka.ssl.trust-store-location`](#application-properties.integration.spring.kafka.ssl.trust-store-location) | Location of the trust store file. | | | [`spring.kafka.ssl.trust-store-password`](#application-properties.integration.spring.kafka.ssl.trust-store-password) | Store password for the trust store file. | | | [`spring.kafka.ssl.trust-store-type`](#application-properties.integration.spring.kafka.ssl.trust-store-type) | Type of the trust store. | | | [`spring.kafka.streams.application-id`](#application-properties.integration.spring.kafka.streams.application-id) | Kafka streams application.id property; default spring.application.name. | | | [`spring.kafka.streams.auto-startup`](#application-properties.integration.spring.kafka.streams.auto-startup) | Whether or not to auto-start the streams factory bean. | `true` | | [`spring.kafka.streams.bootstrap-servers`](#application-properties.integration.spring.kafka.streams.bootstrap-servers) | Comma-delimited list of host:port pairs to use for establishing the initial connections to the Kafka cluster. Overrides the global property, for streams. | | | [`spring.kafka.streams.cache-max-size-buffering`](#application-properties.integration.spring.kafka.streams.cache-max-size-buffering) | Maximum memory size to be used for buffering across all threads. | | | [`spring.kafka.streams.cleanup.on-shutdown`](#application-properties.integration.spring.kafka.streams.cleanup.on-shutdown) | Cleanup the application’s local state directory on shutdown. | `false` | | [`spring.kafka.streams.cleanup.on-startup`](#application-properties.integration.spring.kafka.streams.cleanup.on-startup) | Cleanup the application’s local state directory on startup. | `false` | | [`spring.kafka.streams.client-id`](#application-properties.integration.spring.kafka.streams.client-id) | ID to pass to the server when making requests. Used for server-side logging. | | | [`spring.kafka.streams.properties.*`](#application-properties.integration.spring.kafka.streams.properties) | Additional Kafka properties used to configure the streams. 
| | | [`spring.kafka.streams.replication-factor`](#application-properties.integration.spring.kafka.streams.replication-factor) | The replication factor for change log topics and repartition topics created by the stream processing application. | | | [`spring.kafka.streams.security.protocol`](#application-properties.integration.spring.kafka.streams.security.protocol) | Security protocol used to communicate with brokers. | | | [`spring.kafka.streams.ssl.key-password`](#application-properties.integration.spring.kafka.streams.ssl.key-password) | Password of the private key in either key store key or key store file. | | | [`spring.kafka.streams.ssl.key-store-certificate-chain`](#application-properties.integration.spring.kafka.streams.ssl.key-store-certificate-chain) | Certificate chain in PEM format with a list of X.509 certificates. | | | [`spring.kafka.streams.ssl.key-store-key`](#application-properties.integration.spring.kafka.streams.ssl.key-store-key) | Private key in PEM format with PKCS#8 keys. | | | [`spring.kafka.streams.ssl.key-store-location`](#application-properties.integration.spring.kafka.streams.ssl.key-store-location) | Location of the key store file. | | | [`spring.kafka.streams.ssl.key-store-password`](#application-properties.integration.spring.kafka.streams.ssl.key-store-password) | Store password for the key store file. | | | [`spring.kafka.streams.ssl.key-store-type`](#application-properties.integration.spring.kafka.streams.ssl.key-store-type) | Type of the key store. | | | [`spring.kafka.streams.ssl.protocol`](#application-properties.integration.spring.kafka.streams.ssl.protocol) | SSL protocol to use. | | | [`spring.kafka.streams.ssl.trust-store-certificates`](#application-properties.integration.spring.kafka.streams.ssl.trust-store-certificates) | Trusted certificates in PEM format with X.509 certificates. | | | [`spring.kafka.streams.ssl.trust-store-location`](#application-properties.integration.spring.kafka.streams.ssl.trust-store-location) | Location of the trust store file. | | | [`spring.kafka.streams.ssl.trust-store-password`](#application-properties.integration.spring.kafka.streams.ssl.trust-store-password) | Store password for the trust store file. | | | [`spring.kafka.streams.ssl.trust-store-type`](#application-properties.integration.spring.kafka.streams.ssl.trust-store-type) | Type of the trust store. | | | [`spring.kafka.streams.state-dir`](#application-properties.integration.spring.kafka.streams.state-dir) | Directory location for the state store. | | | [`spring.kafka.template.default-topic`](#application-properties.integration.spring.kafka.template.default-topic) | Default topic to which messages are sent. | | | [`spring.kafka.template.transaction-id-prefix`](#application-properties.integration.spring.kafka.template.transaction-id-prefix) | Transaction id prefix, override the transaction id prefix in the producer factory. | | | [`spring.rabbitmq.address-shuffle-mode`](#application-properties.integration.spring.rabbitmq.address-shuffle-mode) | Mode used to shuffle configured addresses. | `none` | | [`spring.rabbitmq.addresses`](#application-properties.integration.spring.rabbitmq.addresses) | Comma-separated list of addresses to which the client should connect. When set, the host and port are ignored. | | | [`spring.rabbitmq.cache.channel.checkout-timeout`](#application-properties.integration.spring.rabbitmq.cache.channel.checkout-timeout) | Duration to wait to obtain a channel if the cache size has been reached. If 0, always create a new channel. 
| |
| [`spring.rabbitmq.cache.channel.size`](#application-properties.integration.spring.rabbitmq.cache.channel.size) | Number of channels to retain in the cache. When "checkout-timeout" > 0, maximum channels per connection. | |
| [`spring.rabbitmq.cache.connection.mode`](#application-properties.integration.spring.rabbitmq.cache.connection.mode) | Connection factory cache mode. | `channel` |
| [`spring.rabbitmq.cache.connection.size`](#application-properties.integration.spring.rabbitmq.cache.connection.size) | Number of connections to cache. Only applies when mode is CONNECTION. | |
| [`spring.rabbitmq.channel-rpc-timeout`](#application-properties.integration.spring.rabbitmq.channel-rpc-timeout) | Continuation timeout for RPC calls in channels. Set it to zero to wait forever. | `10m` |
| [`spring.rabbitmq.connection-timeout`](#application-properties.integration.spring.rabbitmq.connection-timeout) | Connection timeout. Set it to zero to wait forever. | |
| [`spring.rabbitmq.dynamic`](#application-properties.integration.spring.rabbitmq.dynamic) | Whether to create an AmqpAdmin bean. | `true` |
| [`spring.rabbitmq.host`](#application-properties.integration.spring.rabbitmq.host) | RabbitMQ host. Ignored if an address is set. | `localhost` |
| [`spring.rabbitmq.listener.direct.acknowledge-mode`](#application-properties.integration.spring.rabbitmq.listener.direct.acknowledge-mode) | Acknowledge mode of container. | |
| [`spring.rabbitmq.listener.direct.auto-startup`](#application-properties.integration.spring.rabbitmq.listener.direct.auto-startup) | Whether to start the container automatically on startup. | `true` |
| [`spring.rabbitmq.listener.direct.consumers-per-queue`](#application-properties.integration.spring.rabbitmq.listener.direct.consumers-per-queue) | Number of consumers per queue. | |
| [`spring.rabbitmq.listener.direct.de-batching-enabled`](#application-properties.integration.spring.rabbitmq.listener.direct.de-batching-enabled) | Whether the container should present batched messages as discrete messages or call the listener with the batch. | `true` |
| [`spring.rabbitmq.listener.direct.default-requeue-rejected`](#application-properties.integration.spring.rabbitmq.listener.direct.default-requeue-rejected) | Whether rejected deliveries are re-queued by default. | |
| [`spring.rabbitmq.listener.direct.idle-event-interval`](#application-properties.integration.spring.rabbitmq.listener.direct.idle-event-interval) | How often idle container events should be published. | |
| [`spring.rabbitmq.listener.direct.missing-queues-fatal`](#application-properties.integration.spring.rabbitmq.listener.direct.missing-queues-fatal) | Whether to fail if the queues declared by the container are not available on the broker. | `false` |
| [`spring.rabbitmq.listener.direct.prefetch`](#application-properties.integration.spring.rabbitmq.listener.direct.prefetch) | Maximum number of unacknowledged messages that can be outstanding at each consumer. | |
| [`spring.rabbitmq.listener.direct.retry.enabled`](#application-properties.integration.spring.rabbitmq.listener.direct.retry.enabled) | Whether publishing retries are enabled. | `false` |
| [`spring.rabbitmq.listener.direct.retry.initial-interval`](#application-properties.integration.spring.rabbitmq.listener.direct.retry.initial-interval) | Duration between the first and second attempt to deliver a message.
| `1000ms` | | [`spring.rabbitmq.listener.direct.retry.max-attempts`](#application-properties.integration.spring.rabbitmq.listener.direct.retry.max-attempts) | Maximum number of attempts to deliver a message. | `3` | | [`spring.rabbitmq.listener.direct.retry.max-interval`](#application-properties.integration.spring.rabbitmq.listener.direct.retry.max-interval) | Maximum duration between attempts. | `10000ms` | | [`spring.rabbitmq.listener.direct.retry.multiplier`](#application-properties.integration.spring.rabbitmq.listener.direct.retry.multiplier) | Multiplier to apply to the previous retry interval. | `1` | | [`spring.rabbitmq.listener.direct.retry.stateless`](#application-properties.integration.spring.rabbitmq.listener.direct.retry.stateless) | Whether retries are stateless or stateful. | `true` | | [`spring.rabbitmq.listener.simple.acknowledge-mode`](#application-properties.integration.spring.rabbitmq.listener.simple.acknowledge-mode) | Acknowledge mode of container. | | | [`spring.rabbitmq.listener.simple.auto-startup`](#application-properties.integration.spring.rabbitmq.listener.simple.auto-startup) | Whether to start the container automatically on startup. | `true` | | [`spring.rabbitmq.listener.simple.batch-size`](#application-properties.integration.spring.rabbitmq.listener.simple.batch-size) | Batch size, expressed as the number of physical messages, to be used by the container. | | | [`spring.rabbitmq.listener.simple.concurrency`](#application-properties.integration.spring.rabbitmq.listener.simple.concurrency) | Minimum number of listener invoker threads. | | | [`spring.rabbitmq.listener.simple.consumer-batch-enabled`](#application-properties.integration.spring.rabbitmq.listener.simple.consumer-batch-enabled) | Whether the container creates a batch of messages based on the 'receive-timeout' and 'batch-size'. Coerces 'de-batching-enabled' to true to include the contents of a producer created batch in the batch as discrete records. | `false` | | [`spring.rabbitmq.listener.simple.de-batching-enabled`](#application-properties.integration.spring.rabbitmq.listener.simple.de-batching-enabled) | Whether the container should present batched messages as discrete messages or call the listener with the batch. | `true` | | [`spring.rabbitmq.listener.simple.default-requeue-rejected`](#application-properties.integration.spring.rabbitmq.listener.simple.default-requeue-rejected) | Whether rejected deliveries are re-queued by default. | | | [`spring.rabbitmq.listener.simple.idle-event-interval`](#application-properties.integration.spring.rabbitmq.listener.simple.idle-event-interval) | How often idle container events should be published. | | | [`spring.rabbitmq.listener.simple.max-concurrency`](#application-properties.integration.spring.rabbitmq.listener.simple.max-concurrency) | Maximum number of listener invoker threads. | | | [`spring.rabbitmq.listener.simple.missing-queues-fatal`](#application-properties.integration.spring.rabbitmq.listener.simple.missing-queues-fatal) | Whether to fail if the queues declared by the container are not available on the broker and/or whether to stop the container if one or more queues are deleted at runtime. | `true` | | [`spring.rabbitmq.listener.simple.prefetch`](#application-properties.integration.spring.rabbitmq.listener.simple.prefetch) | Maximum number of unacknowledged messages that can be outstanding at each consumer. 
| | | [`spring.rabbitmq.listener.simple.retry.enabled`](#application-properties.integration.spring.rabbitmq.listener.simple.retry.enabled) | Whether publishing retries are enabled. | `false` | | [`spring.rabbitmq.listener.simple.retry.initial-interval`](#application-properties.integration.spring.rabbitmq.listener.simple.retry.initial-interval) | Duration between the first and second attempt to deliver a message. | `1000ms` | | [`spring.rabbitmq.listener.simple.retry.max-attempts`](#application-properties.integration.spring.rabbitmq.listener.simple.retry.max-attempts) | Maximum number of attempts to deliver a message. | `3` | | [`spring.rabbitmq.listener.simple.retry.max-interval`](#application-properties.integration.spring.rabbitmq.listener.simple.retry.max-interval) | Maximum duration between attempts. | `10000ms` | | [`spring.rabbitmq.listener.simple.retry.multiplier`](#application-properties.integration.spring.rabbitmq.listener.simple.retry.multiplier) | Multiplier to apply to the previous retry interval. | `1` | | [`spring.rabbitmq.listener.simple.retry.stateless`](#application-properties.integration.spring.rabbitmq.listener.simple.retry.stateless) | Whether retries are stateless or stateful. | `true` | | [`spring.rabbitmq.listener.stream.auto-startup`](#application-properties.integration.spring.rabbitmq.listener.stream.auto-startup) | Whether to start the container automatically on startup. | `true` | | [`spring.rabbitmq.listener.stream.native-listener`](#application-properties.integration.spring.rabbitmq.listener.stream.native-listener) | Whether the container will support listeners that consume native stream messages instead of Spring AMQP messages. | `false` | | [`spring.rabbitmq.listener.type`](#application-properties.integration.spring.rabbitmq.listener.type) | Listener container type. | `simple` | | [`spring.rabbitmq.password`](#application-properties.integration.spring.rabbitmq.password) | Login to authenticate against the broker. | `guest` | | [`spring.rabbitmq.port`](#application-properties.integration.spring.rabbitmq.port) | RabbitMQ port. Ignored if an address is set. Default to 5672, or 5671 if SSL is enabled. | | | [`spring.rabbitmq.publisher-confirm-type`](#application-properties.integration.spring.rabbitmq.publisher-confirm-type) | Type of publisher confirms to use. | | | [`spring.rabbitmq.publisher-returns`](#application-properties.integration.spring.rabbitmq.publisher-returns) | Whether to enable publisher returns. | `false` | | [`spring.rabbitmq.requested-channel-max`](#application-properties.integration.spring.rabbitmq.requested-channel-max) | Number of channels per connection requested by the client. Use 0 for unlimited. | `2047` | | [`spring.rabbitmq.requested-heartbeat`](#application-properties.integration.spring.rabbitmq.requested-heartbeat) | Requested heartbeat timeout; zero for none. If a duration suffix is not specified, seconds will be used. | | | [`spring.rabbitmq.ssl.algorithm`](#application-properties.integration.spring.rabbitmq.ssl.algorithm) | SSL algorithm to use. By default, configured by the Rabbit client library. | | | [`spring.rabbitmq.ssl.enabled`](#application-properties.integration.spring.rabbitmq.ssl.enabled) | Whether to enable SSL support. Determined automatically if an address is provided with the protocol (amqp:// vs. amqps://). | | | [`spring.rabbitmq.ssl.key-store`](#application-properties.integration.spring.rabbitmq.ssl.key-store) | Path to the key store that holds the SSL certificate. 
| | | [`spring.rabbitmq.ssl.key-store-algorithm`](#application-properties.integration.spring.rabbitmq.ssl.key-store-algorithm) | Key store algorithm. | `SunX509` | | [`spring.rabbitmq.ssl.key-store-password`](#application-properties.integration.spring.rabbitmq.ssl.key-store-password) | Password used to access the key store. | | | [`spring.rabbitmq.ssl.key-store-type`](#application-properties.integration.spring.rabbitmq.ssl.key-store-type) | Key store type. | `PKCS12` | | [`spring.rabbitmq.ssl.trust-store`](#application-properties.integration.spring.rabbitmq.ssl.trust-store) | Trust store that holds SSL certificates. | | | [`spring.rabbitmq.ssl.trust-store-algorithm`](#application-properties.integration.spring.rabbitmq.ssl.trust-store-algorithm) | Trust store algorithm. | `SunX509` | | [`spring.rabbitmq.ssl.trust-store-password`](#application-properties.integration.spring.rabbitmq.ssl.trust-store-password) | Password used to access the trust store. | | | [`spring.rabbitmq.ssl.trust-store-type`](#application-properties.integration.spring.rabbitmq.ssl.trust-store-type) | Trust store type. | `JKS` | | [`spring.rabbitmq.ssl.validate-server-certificate`](#application-properties.integration.spring.rabbitmq.ssl.validate-server-certificate) | Whether to enable server side certificate validation. | `true` | | [`spring.rabbitmq.ssl.verify-hostname`](#application-properties.integration.spring.rabbitmq.ssl.verify-hostname) | Whether to enable hostname verification. | `true` | | [`spring.rabbitmq.stream.host`](#application-properties.integration.spring.rabbitmq.stream.host) | Host of a RabbitMQ instance with the Stream plugin enabled. | `localhost` | | [`spring.rabbitmq.stream.name`](#application-properties.integration.spring.rabbitmq.stream.name) | Name of the stream. | | | [`spring.rabbitmq.stream.password`](#application-properties.integration.spring.rabbitmq.stream.password) | Login password to authenticate to the broker. When not set spring.rabbitmq.password is used. | | | [`spring.rabbitmq.stream.port`](#application-properties.integration.spring.rabbitmq.stream.port) | Stream port of a RabbitMQ instance with the Stream plugin enabled. | | | [`spring.rabbitmq.stream.username`](#application-properties.integration.spring.rabbitmq.stream.username) | Login user to authenticate to the broker. When not set, spring.rabbitmq.username is used. | | | [`spring.rabbitmq.template.default-receive-queue`](#application-properties.integration.spring.rabbitmq.template.default-receive-queue) | Name of the default queue to receive messages from when none is specified explicitly. | | | [`spring.rabbitmq.template.exchange`](#application-properties.integration.spring.rabbitmq.template.exchange) | Name of the default exchange to use for send operations. | | | [`spring.rabbitmq.template.mandatory`](#application-properties.integration.spring.rabbitmq.template.mandatory) | Whether to enable mandatory messages. | | | [`spring.rabbitmq.template.receive-timeout`](#application-properties.integration.spring.rabbitmq.template.receive-timeout) | Timeout for receive() operations. | | | [`spring.rabbitmq.template.reply-timeout`](#application-properties.integration.spring.rabbitmq.template.reply-timeout) | Timeout for sendAndReceive() operations. | | | [`spring.rabbitmq.template.retry.enabled`](#application-properties.integration.spring.rabbitmq.template.retry.enabled) | Whether publishing retries are enabled. 
| `false` |
| [`spring.rabbitmq.template.retry.initial-interval`](#application-properties.integration.spring.rabbitmq.template.retry.initial-interval) | Duration between the first and second attempt to deliver a message. | `1000ms` |
| [`spring.rabbitmq.template.retry.max-attempts`](#application-properties.integration.spring.rabbitmq.template.retry.max-attempts) | Maximum number of attempts to deliver a message. | `3` |
| [`spring.rabbitmq.template.retry.max-interval`](#application-properties.integration.spring.rabbitmq.template.retry.max-interval) | Maximum duration between attempts. | `10000ms` |
| [`spring.rabbitmq.template.retry.multiplier`](#application-properties.integration.spring.rabbitmq.template.retry.multiplier) | Multiplier to apply to the previous retry interval. | `1` |
| [`spring.rabbitmq.template.routing-key`](#application-properties.integration.spring.rabbitmq.template.routing-key) | Value of a default routing key to use for send operations. | |
| [`spring.rabbitmq.username`](#application-properties.integration.spring.rabbitmq.username) | Login user to authenticate to the broker. | `guest` |
| [`spring.rabbitmq.virtual-host`](#application-properties.integration.spring.rabbitmq.virtual-host) | Virtual host to use when connecting to the broker. | |
| [`spring.webservices.path`](#application-properties.integration.spring.webservices.path) | Path that serves as the base URI for the services. | `/services` |
| [`spring.webservices.servlet.init.*`](#application-properties.integration.spring.webservices.servlet.init) | Servlet init parameters to pass to Spring Web Services. | |
| [`spring.webservices.servlet.load-on-startup`](#application-properties.integration.spring.webservices.servlet.load-on-startup) | Load on startup priority of the Spring Web Services servlet. | `-1` |
| [`spring.webservices.wsdl-locations`](#application-properties.integration.spring.webservices.wsdl-locations) | Comma-separated list of locations of WSDLs and accompanying XSDs to be exposed as beans. | |
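The integration properties above are ordinary externalized configuration and can be set in `application.properties` or `application.yaml` like any other Spring Boot property. The snippet below is a minimal, illustrative sketch only: the group id, broker host, and retry settings are made-up example values rather than defaults, and only keys listed in the table above are used.

```properties
# Illustrative values only -- adjust the broker host, group id, and retry settings for your environment.

# Kafka consumer and listener
spring.kafka.consumer.group-id=order-processing
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.listener.concurrency=3

# Kafka producer
spring.kafka.producer.acks=all
spring.kafka.producer.retries=3

# RabbitMQ connection and listener retries
spring.rabbitmq.host=rabbit.example.com
spring.rabbitmq.port=5672
spring.rabbitmq.listener.simple.retry.enabled=true
spring.rabbitmq.listener.simple.retry.max-attempts=5
```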

9. Web Properties
------------------

| Name | Description | Default Value |
| --- | --- | --- |
| [`spring.graphql.cors.allow-credentials`](#application-properties.web.spring.graphql.cors.allow-credentials) | Whether credentials are supported. When not set, credentials are not supported. | |
| [`spring.graphql.cors.allowed-headers`](#application-properties.web.spring.graphql.cors.allowed-headers) | Comma-separated list of HTTP headers to allow in a request. '\*' allows all headers. | |
| [`spring.graphql.cors.allowed-methods`](#application-properties.web.spring.graphql.cors.allowed-methods) | Comma-separated list of HTTP methods to allow. '\*' allows all methods. When not set, defaults to GET. | |
| [`spring.graphql.cors.allowed-origin-patterns`](#application-properties.web.spring.graphql.cors.allowed-origin-patterns) | Comma-separated list of origin patterns to allow. Unlike allowed origins which only support '\*', origin patterns are more flexible, e.g. 'https://\*.example.com', and can be used with allow-credentials. When neither allowed origins nor allowed origin patterns are set, cross-origin requests are effectively disabled. | |
| [`spring.graphql.cors.allowed-origins`](#application-properties.web.spring.graphql.cors.allowed-origins) | Comma-separated list of origins to allow with '\*' allowing all origins. When allow-credentials is enabled, '\*' cannot be used, and setting origin patterns should be considered instead. When neither allowed origins nor allowed origin patterns are set, cross-origin requests are effectively disabled. | |
| [`spring.graphql.cors.exposed-headers`](#application-properties.web.spring.graphql.cors.exposed-headers) | Comma-separated list of headers to include in a response. | |
| [`spring.graphql.cors.max-age`](#application-properties.web.spring.graphql.cors.max-age) | How long the response from a pre-flight request can be cached by clients. If a duration suffix is not specified, seconds will be used. | `1800s` |
| [`spring.graphql.graphiql.enabled`](#application-properties.web.spring.graphql.graphiql.enabled) | Whether the default GraphiQL UI is enabled. | `false` |
| [`spring.graphql.graphiql.path`](#application-properties.web.spring.graphql.graphiql.path) | Path to the GraphiQL UI endpoint. | `/graphiql` |
| [`spring.graphql.path`](#application-properties.web.spring.graphql.path) | Path at which to expose a GraphQL request HTTP endpoint. | `/graphql` |
| [`spring.graphql.rsocket.mapping`](#application-properties.web.spring.graphql.rsocket.mapping) | Mapping of the RSocket message handler. | |
| [`spring.graphql.schema.file-extensions`](#application-properties.web.spring.graphql.schema.file-extensions) | File extensions for GraphQL schema files. | `.graphqls,.gqls` |
| [`spring.graphql.schema.introspection.enabled`](#application-properties.web.spring.graphql.schema.introspection.enabled) | Whether field introspection should be enabled at the schema level. | `true` |
| [`spring.graphql.schema.locations`](#application-properties.web.spring.graphql.schema.locations) | Locations of GraphQL schema files. | `classpath:graphql/**/` |
| [`spring.graphql.schema.printer.enabled`](#application-properties.web.spring.graphql.schema.printer.enabled) | Whether the endpoint that prints the schema is enabled. Schema is available under spring.graphql.path + "/schema". | `false` |
| [`spring.graphql.websocket.connection-init-timeout`](#application-properties.web.spring.graphql.websocket.connection-init-timeout) | Time within which the initial CONNECTION\_INIT type message must be received. | `60s` |
| [`spring.graphql.websocket.path`](#application-properties.web.spring.graphql.websocket.path) | Path of the GraphQL WebSocket subscription endpoint. | |
| [`spring.hateoas.use-hal-as-default-json-media-type`](#application-properties.web.spring.hateoas.use-hal-as-default-json-media-type) | Whether application/hal+json responses should be sent to requests that accept application/json. | `true` |
| [`spring.jersey.application-path`](#application-properties.web.spring.jersey.application-path) | Path that serves as the base URI for the application. If specified, overrides the value of "@ApplicationPath". | |
| [`spring.jersey.filter.order`](#application-properties.web.spring.jersey.filter.order) | Jersey filter chain order. | `0` |
| [`spring.jersey.init.*`](#application-properties.web.spring.jersey.init) | Init parameters to pass to Jersey through the servlet or filter. | |
| [`spring.jersey.servlet.load-on-startup`](#application-properties.web.spring.jersey.servlet.load-on-startup) | Load on startup priority of the Jersey servlet. | `-1` |
| [`spring.jersey.type`](#application-properties.web.spring.jersey.type) | Jersey integration type. | `servlet` |
| [`spring.mvc.async.request-timeout`](#application-properties.web.spring.mvc.async.request-timeout) | Amount of time before asynchronous request handling times out.
If this value is not set, the default timeout of the underlying implementation is used. | | | [`spring.mvc.contentnegotiation.favor-parameter`](#application-properties.web.spring.mvc.contentnegotiation.favor-parameter) | Whether a request parameter ("format" by default) should be used to determine the requested media type. | `false` | | [`spring.mvc.contentnegotiation.media-types.*`](#application-properties.web.spring.mvc.contentnegotiation.media-types) | Map file extensions to media types for content negotiation. For instance, yml to text/yaml. | | | [`spring.mvc.contentnegotiation.parameter-name`](#application-properties.web.spring.mvc.contentnegotiation.parameter-name) | Query parameter name to use when "favor-parameter" is enabled. | | | [`spring.mvc.converters.preferred-json-mapper`](#application-properties.web.spring.mvc.converters.preferred-json-mapper) | Preferred JSON mapper to use for HTTP message conversion. By default, auto-detected according to the environment. | | | [`spring.mvc.dispatch-options-request`](#application-properties.web.spring.mvc.dispatch-options-request) | Whether to dispatch OPTIONS requests to the FrameworkServlet doService method. | `true` | | [`spring.mvc.dispatch-trace-request`](#application-properties.web.spring.mvc.dispatch-trace-request) | Whether to dispatch TRACE requests to the FrameworkServlet doService method. | `false` | | [`spring.mvc.format.date`](#application-properties.web.spring.mvc.format.date) | Date format to use, for example 'dd/MM/yyyy'. | | | [`spring.mvc.format.date-time`](#application-properties.web.spring.mvc.format.date-time) | Date-time format to use, for example 'yyyy-MM-dd HH:mm:ss'. | | | [`spring.mvc.format.time`](#application-properties.web.spring.mvc.format.time) | Time format to use, for example 'HH:mm:ss'. | | | [`spring.mvc.formcontent.filter.enabled`](#application-properties.web.spring.mvc.formcontent.filter.enabled) | Whether to enable Spring's FormContentFilter. | `true` | | [`spring.mvc.hiddenmethod.filter.enabled`](#application-properties.web.spring.mvc.hiddenmethod.filter.enabled) | Whether to enable Spring's HiddenHttpMethodFilter. | `false` | | [`spring.mvc.ignore-default-model-on-redirect`](#application-properties.web.spring.mvc.ignore-default-model-on-redirect) | Whether the content of the "default" model should be ignored during redirect scenarios. | `true` | | [`spring.mvc.log-request-details`](#application-properties.web.spring.mvc.log-request-details) | Whether logging of (potentially sensitive) request details at DEBUG and TRACE level is allowed. | `false` | | [`spring.mvc.log-resolved-exception`](#application-properties.web.spring.mvc.log-resolved-exception) | Whether to enable warn logging of exceptions resolved by a "HandlerExceptionResolver", except for "DefaultHandlerExceptionResolver". | `false` | | [`spring.mvc.message-codes-resolver-format`](#application-properties.web.spring.mvc.message-codes-resolver-format) | Formatting strategy for message codes. For instance, 'PREFIX\_ERROR\_CODE'. | | | [`spring.mvc.pathmatch.matching-strategy`](#application-properties.web.spring.mvc.pathmatch.matching-strategy) | Choice of strategy for matching request paths against registered mappings. | `path-pattern-parser` | | [`spring.mvc.publish-request-handled-events`](#application-properties.web.spring.mvc.publish-request-handled-events) | Whether to publish a ServletRequestHandledEvent at the end of each request. 
| `true` | | [`spring.mvc.servlet.load-on-startup`](#application-properties.web.spring.mvc.servlet.load-on-startup) | Load on startup priority of the dispatcher servlet. | `-1` | | [`spring.mvc.servlet.path`](#application-properties.web.spring.mvc.servlet.path) | Path of the dispatcher servlet. Setting a custom value for this property is not compatible with the PathPatternParser matching strategy. | `/` | | [`spring.mvc.static-path-pattern`](#application-properties.web.spring.mvc.static-path-pattern) | Path pattern used for static resources. | `/**` | | [`spring.mvc.throw-exception-if-no-handler-found`](#application-properties.web.spring.mvc.throw-exception-if-no-handler-found) | Whether a "NoHandlerFoundException" should be thrown if no Handler was found to process a request. | `false` | | [`spring.mvc.view.prefix`](#application-properties.web.spring.mvc.view.prefix) | Spring MVC view prefix. | | | [`spring.mvc.view.suffix`](#application-properties.web.spring.mvc.view.suffix) | Spring MVC view suffix. | | | [`spring.netty.leak-detection`](#application-properties.web.spring.netty.leak-detection) | Level of leak detection for reference-counted buffers. | `simple` | | [`spring.servlet.multipart.enabled`](#application-properties.web.spring.servlet.multipart.enabled) | Whether to enable support of multipart uploads. | `true` | | [`spring.servlet.multipart.file-size-threshold`](#application-properties.web.spring.servlet.multipart.file-size-threshold) | Threshold after which files are written to disk. | `0B` | | [`spring.servlet.multipart.location`](#application-properties.web.spring.servlet.multipart.location) | Intermediate location of uploaded files. | | | [`spring.servlet.multipart.max-file-size`](#application-properties.web.spring.servlet.multipart.max-file-size) | Max file size. | `1MB` | | [`spring.servlet.multipart.max-request-size`](#application-properties.web.spring.servlet.multipart.max-request-size) | Max request size. | `10MB` | | [`spring.servlet.multipart.resolve-lazily`](#application-properties.web.spring.servlet.multipart.resolve-lazily) | Whether to resolve the multipart request lazily at the time of file or parameter access. | `false` | | [`spring.session.hazelcast.flush-mode`](#application-properties.web.spring.session.hazelcast.flush-mode) | Sessions flush mode. Determines when session changes are written to the session store. | `on-save` | | [`spring.session.hazelcast.map-name`](#application-properties.web.spring.session.hazelcast.map-name) | Name of the map used to store sessions. | `spring:session:sessions` | | [`spring.session.hazelcast.save-mode`](#application-properties.web.spring.session.hazelcast.save-mode) | Sessions save mode. Determines how session changes are tracked and saved to the session store. | `on-set-attribute` | | [`spring.session.jdbc.cleanup-cron`](#application-properties.web.spring.session.jdbc.cleanup-cron) | Cron expression for expired session cleanup job. | `0 * * * * *` | | [`spring.session.jdbc.flush-mode`](#application-properties.web.spring.session.jdbc.flush-mode) | Sessions flush mode. Determines when session changes are written to the session store. | `on-save` | | [`spring.session.jdbc.initialize-schema`](#application-properties.web.spring.session.jdbc.initialize-schema) | Database schema initialization mode. | `embedded` | | [`spring.session.jdbc.platform`](#application-properties.web.spring.session.jdbc.platform) | Platform to use in initialization scripts if the @@platform@@ placeholder is used. Auto-detected by default. 
| | | [`spring.session.jdbc.save-mode`](#application-properties.web.spring.session.jdbc.save-mode) | Sessions save mode. Determines how session changes are tracked and saved to the session store. | `on-set-attribute` | | [`spring.session.jdbc.schema`](#application-properties.web.spring.session.jdbc.schema) | Path to the SQL file to use to initialize the database schema. | `classpath:org/springframework/session/jdbc/schema-@@platform@@.sql` | | [`spring.session.jdbc.table-name`](#application-properties.web.spring.session.jdbc.table-name) | Name of the database table used to store sessions. | `SPRING_SESSION` | | [`spring.session.mongodb.collection-name`](#application-properties.web.spring.session.mongodb.collection-name) | Collection name used to store sessions. | `sessions` | | [`spring.session.redis.cleanup-cron`](#application-properties.web.spring.session.redis.cleanup-cron) | Cron expression for expired session cleanup job. | `0 * * * * *` | | [`spring.session.redis.configure-action`](#application-properties.web.spring.session.redis.configure-action) | The configure action to apply when no user defined ConfigureRedisAction bean is present. | `notify-keyspace-events` | | [`spring.session.redis.flush-mode`](#application-properties.web.spring.session.redis.flush-mode) | Sessions flush mode. Determines when session changes are written to the session store. | `on-save` | | [`spring.session.redis.namespace`](#application-properties.web.spring.session.redis.namespace) | Namespace for keys used to store sessions. | `spring:session` | | [`spring.session.redis.save-mode`](#application-properties.web.spring.session.redis.save-mode) | Sessions save mode. Determines how session changes are tracked and saved to the session store. | `on-set-attribute` | | [`spring.session.servlet.filter-dispatcher-types`](#application-properties.web.spring.session.servlet.filter-dispatcher-types) | Session repository filter dispatcher types. | `[async, error, request]` | | [`spring.session.servlet.filter-order`](#application-properties.web.spring.session.servlet.filter-order) | Session repository filter order. | | | [`spring.session.store-type`](#application-properties.web.spring.session.store-type) | Session store type. | | | [`spring.session.timeout`](#application-properties.web.spring.session.timeout) | Session timeout. If a duration suffix is not specified, seconds will be used. | | | [`spring.web.locale`](#application-properties.web.spring.web.locale) | Locale to use. By default, this locale is overridden by the "Accept-Language" header. | | | [`spring.web.locale-resolver`](#application-properties.web.spring.web.locale-resolver) | Define how the locale should be resolved. | `accept-header` | | [`spring.web.resources.add-mappings`](#application-properties.web.spring.web.resources.add-mappings) | Whether to enable default resource handling. | `true` | | [`spring.web.resources.cache.cachecontrol.cache-private`](#application-properties.web.spring.web.resources.cache.cachecontrol.cache-private) | Indicate that the response message is intended for a single user and must not be stored by a shared cache. | | | [`spring.web.resources.cache.cachecontrol.cache-public`](#application-properties.web.spring.web.resources.cache.cachecontrol.cache-public) | Indicate that any cache may store the response. 
| |
| [`spring.web.resources.cache.cachecontrol.max-age`](#application-properties.web.spring.web.resources.cache.cachecontrol.max-age) | Maximum time the response should be cached, in seconds if no duration suffix is specified. | |
| [`spring.web.resources.cache.cachecontrol.must-revalidate`](#application-properties.web.spring.web.resources.cache.cachecontrol.must-revalidate) | Indicate that once it has become stale, a cache must not use the response without re-validating it with the server. | |
| [`spring.web.resources.cache.cachecontrol.no-cache`](#application-properties.web.spring.web.resources.cache.cachecontrol.no-cache) | Indicate that the cached response can be reused only if re-validated with the server. | |
| [`spring.web.resources.cache.cachecontrol.no-store`](#application-properties.web.spring.web.resources.cache.cachecontrol.no-store) | Indicate to not cache the response in any case. | |
| [`spring.web.resources.cache.cachecontrol.no-transform`](#application-properties.web.spring.web.resources.cache.cachecontrol.no-transform) | Indicate to intermediaries (caches and others) that they should not transform the response content. | |
| [`spring.web.resources.cache.cachecontrol.proxy-revalidate`](#application-properties.web.spring.web.resources.cache.cachecontrol.proxy-revalidate) | Same meaning as the "must-revalidate" directive, except that it does not apply to private caches. | |
| [`spring.web.resources.cache.cachecontrol.s-max-age`](#application-properties.web.spring.web.resources.cache.cachecontrol.s-max-age) | Maximum time the response should be cached by shared caches, in seconds if no duration suffix is specified. | |
| [`spring.web.resources.cache.cachecontrol.stale-if-error`](#application-properties.web.spring.web.resources.cache.cachecontrol.stale-if-error) | Maximum time the response may be used when errors are encountered, in seconds if no duration suffix is specified. | |
| [`spring.web.resources.cache.cachecontrol.stale-while-revalidate`](#application-properties.web.spring.web.resources.cache.cachecontrol.stale-while-revalidate) | Maximum time the response can be served after it becomes stale, in seconds if no duration suffix is specified. | |
| [`spring.web.resources.cache.period`](#application-properties.web.spring.web.resources.cache.period) | Cache period for the resources served by the resource handler. If a duration suffix is not specified, seconds will be used. Can be overridden by the 'spring.web.resources.cache.cachecontrol' properties. | |
| [`spring.web.resources.cache.use-last-modified`](#application-properties.web.spring.web.resources.cache.use-last-modified) | Whether we should use the "lastModified" metadata of the files in HTTP caching headers. | `true` |
| [`spring.web.resources.chain.cache`](#application-properties.web.spring.web.resources.chain.cache) | Whether to enable caching in the Resource chain. | `true` |
| [`spring.web.resources.chain.compressed`](#application-properties.web.spring.web.resources.chain.compressed) | Whether to enable resolution of already compressed resources (gzip, brotli). Checks for a resource name with the '.gz' or '.br' file extensions. | `false` |
| [`spring.web.resources.chain.enabled`](#application-properties.web.spring.web.resources.chain.enabled) | Whether to enable the Spring Resource Handling chain. By default, disabled unless at least one strategy has been enabled.
| | | [`spring.web.resources.chain.strategy.content.enabled`](#application-properties.web.spring.web.resources.chain.strategy.content.enabled) | Whether to enable the content Version Strategy. | `false` | | [`spring.web.resources.chain.strategy.content.paths`](#application-properties.web.spring.web.resources.chain.strategy.content.paths) | Comma-separated list of patterns to apply to the content Version Strategy. | `[/**]` | | [`spring.web.resources.chain.strategy.fixed.enabled`](#application-properties.web.spring.web.resources.chain.strategy.fixed.enabled) | Whether to enable the fixed Version Strategy. | `false` | | [`spring.web.resources.chain.strategy.fixed.paths`](#application-properties.web.spring.web.resources.chain.strategy.fixed.paths) | Comma-separated list of patterns to apply to the fixed Version Strategy. | `[/**]` | | [`spring.web.resources.chain.strategy.fixed.version`](#application-properties.web.spring.web.resources.chain.strategy.fixed.version) | Version string to use for the fixed Version Strategy. | | | [`spring.web.resources.static-locations`](#application-properties.web.spring.web.resources.static-locations) | Locations of static resources. Defaults to classpath:[/META-INF/resources/, /resources/, /static/, /public/]. | `[classpath:/META-INF/resources/, classpath:/resources/, classpath:/static/, classpath:/public/]` | | [`spring.webflux.base-path`](#application-properties.web.spring.webflux.base-path) | Base path for all web handlers. | | | [`spring.webflux.format.date`](#application-properties.web.spring.webflux.format.date) | Date format to use, for example 'dd/MM/yyyy'. | | | [`spring.webflux.format.date-time`](#application-properties.web.spring.webflux.format.date-time) | Date-time format to use, for example 'yyyy-MM-dd HH:mm:ss'. | | | [`spring.webflux.format.time`](#application-properties.web.spring.webflux.format.time) | Time format to use, for example 'HH:mm:ss'. | | | [`spring.webflux.hiddenmethod.filter.enabled`](#application-properties.web.spring.webflux.hiddenmethod.filter.enabled) | Whether to enable Spring's HiddenHttpMethodFilter. | `false` | | [`spring.webflux.multipart.file-storage-directory`](#application-properties.web.spring.webflux.multipart.file-storage-directory) | Directory used to store file parts larger than 'maxInMemorySize'. Default is a directory named 'spring-multipart' created under the system temporary directory. Ignored when streaming is enabled. | | | [`spring.webflux.multipart.headers-charset`](#application-properties.web.spring.webflux.multipart.headers-charset) | Character set used to decode headers. | `UTF-8` | | [`spring.webflux.multipart.max-disk-usage-per-part`](#application-properties.web.spring.webflux.multipart.max-disk-usage-per-part) | Maximum amount of disk space allowed per part. Default is -1 which enforces no limits. Ignored when streaming is enabled. | `-1B` | | [`spring.webflux.multipart.max-headers-size`](#application-properties.web.spring.webflux.multipart.max-headers-size) | Maximum amount of memory allowed per headers section of each part. Set to -1 to enforce no limits. | `10KB` | | [`spring.webflux.multipart.max-in-memory-size`](#application-properties.web.spring.webflux.multipart.max-in-memory-size) | Maximum amount of memory allowed per part before it's written to disk. Set to -1 to store all contents in memory. Ignored when streaming is enabled. 
| `256KB` |
| [`spring.webflux.multipart.max-parts`](#application-properties.web.spring.webflux.multipart.max-parts) | Maximum number of parts allowed in a given multipart request. Default is -1 which enforces no limits. | `-1` |
| [`spring.webflux.multipart.streaming`](#application-properties.web.spring.webflux.multipart.streaming) | Whether to stream directly from the parsed input buffer stream without storing it in memory or in a file. Default is non-streaming. | `false` |
| [`spring.webflux.session.timeout`](#application-properties.web.spring.webflux.session.timeout) | Session timeout. | `30m` |
| [`spring.webflux.static-path-pattern`](#application-properties.web.spring.webflux.static-path-pattern) | Path pattern used for static resources. | `/**` |
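As with the integration properties, the web properties above are set through ordinary externalized configuration. The `application.properties` sketch below is illustrative only: the sizes, store type, timeout, and cache period are example values rather than defaults, and only keys from the table above are used.

```properties
# Illustrative values only -- not defaults.

# Multipart upload limits and the GraphiQL UI
spring.servlet.multipart.max-file-size=10MB
spring.servlet.multipart.max-request-size=20MB
spring.graphql.graphiql.enabled=true

# Store sessions in a JDBC-backed repository with a one-hour timeout
spring.session.store-type=jdbc
spring.session.timeout=1h

# Cache static resources for one day
spring.web.resources.cache.period=1d
```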

10. Templating Properties
--------------------------

| Name | Description | Default Value |
| --- | --- | --- |
| [`spring.freemarker.allow-request-override`](#application-properties.templating.spring.freemarker.allow-request-override) | Whether HttpServletRequest attributes are allowed to override (hide) controller generated model attributes of the same name. | `false` |
| [`spring.freemarker.allow-session-override`](#application-properties.templating.spring.freemarker.allow-session-override) | Whether HttpSession attributes are allowed to override (hide) controller generated model attributes of the same name. | `false` |
| [`spring.freemarker.cache`](#application-properties.templating.spring.freemarker.cache) | Whether to enable template caching. | `false` |
| [`spring.freemarker.charset`](#application-properties.templating.spring.freemarker.charset) | Template encoding. | `UTF-8` |
| [`spring.freemarker.check-template-location`](#application-properties.templating.spring.freemarker.check-template-location) | Whether to check that the templates location exists. | `true` |
| [`spring.freemarker.content-type`](#application-properties.templating.spring.freemarker.content-type) | Content-Type value. | `text/html` |
| [`spring.freemarker.enabled`](#application-properties.templating.spring.freemarker.enabled) | Whether to enable MVC view resolution for this technology. | `true` |
| [`spring.freemarker.expose-request-attributes`](#application-properties.templating.spring.freemarker.expose-request-attributes) | Whether all request attributes should be added to the model prior to merging with the template. | `false` |
| [`spring.freemarker.expose-session-attributes`](#application-properties.templating.spring.freemarker.expose-session-attributes) | Whether all HttpSession attributes should be added to the model prior to merging with the template. | `false` |
| [`spring.freemarker.expose-spring-macro-helpers`](#application-properties.templating.spring.freemarker.expose-spring-macro-helpers) | Whether to expose a RequestContext for use by Spring's macro library, under the name "springMacroRequestContext". | `true` |
| [`spring.freemarker.prefer-file-system-access`](#application-properties.templating.spring.freemarker.prefer-file-system-access) | Whether to prefer file system access for template loading to enable hot detection of template changes. When a template path is detected as a directory, templates are loaded from the directory only and other matching classpath locations will not be considered. | `false` |
| [`spring.freemarker.prefix`](#application-properties.templating.spring.freemarker.prefix) | Prefix that gets prepended to view names when building a URL. | |
| [`spring.freemarker.request-context-attribute`](#application-properties.templating.spring.freemarker.request-context-attribute) | Name of the RequestContext attribute for all views. | |
| [`spring.freemarker.settings.*`](#application-properties.templating.spring.freemarker.settings) | Well-known FreeMarker keys which are passed to FreeMarker's Configuration. | |
| [`spring.freemarker.suffix`](#application-properties.templating.spring.freemarker.suffix) | Suffix that gets appended to view names when building a URL. | `.ftlh` |
| [`spring.freemarker.template-loader-path`](#application-properties.templating.spring.freemarker.template-loader-path) | Comma-separated list of template paths. | `[classpath:/templates/]` |
| [`spring.freemarker.view-names`](#application-properties.templating.spring.freemarker.view-names) | View names that can be resolved. | |
| [`spring.groovy.template.allow-request-override`](#application-properties.templating.spring.groovy.template.allow-request-override) | Whether HttpServletRequest attributes are allowed to override (hide) controller generated model attributes of the same name. | `false` |
| [`spring.groovy.template.allow-session-override`](#application-properties.templating.spring.groovy.template.allow-session-override) | Whether HttpSession attributes are allowed to override (hide) controller generated model attributes of the same name. | `false` |
| [`spring.groovy.template.cache`](#application-properties.templating.spring.groovy.template.cache) | Whether to enable template caching. | `false` |
| [`spring.groovy.template.charset`](#application-properties.templating.spring.groovy.template.charset) | Template encoding. | `UTF-8` |
| [`spring.groovy.template.check-template-location`](#application-properties.templating.spring.groovy.template.check-template-location) | Whether to check that the templates location exists. | `true` |
| [`spring.groovy.template.configuration.auto-escape` `spring.groovy.template.configuration.auto-indent` `spring.groovy.template.configuration.auto-indent-string` `spring.groovy.template.configuration.auto-new-line` `spring.groovy.template.configuration.base-template-class` `spring.groovy.template.configuration.cache-templates` `spring.groovy.template.configuration.declaration-encoding` `spring.groovy.template.configuration.expand-empty-elements` `spring.groovy.template.configuration.locale` `spring.groovy.template.configuration.new-line-string` `spring.groovy.template.configuration.resource-loader-path` `spring.groovy.template.configuration.use-double-quotes`](#application-properties.templating.spring.groovy.template.configuration) | See GroovyMarkupConfigurer | |
| [`spring.groovy.template.content-type`](#application-properties.templating.spring.groovy.template.content-type) | Content-Type value. | `text/html` |
| [`spring.groovy.template.enabled`](#application-properties.templating.spring.groovy.template.enabled) | Whether to enable MVC view resolution for this technology. | `true` |
| [`spring.groovy.template.expose-request-attributes`](#application-properties.templating.spring.groovy.template.expose-request-attributes) | Whether all request attributes should be added to the model prior to merging with the template. | `false` |
| [`spring.groovy.template.expose-session-attributes`](#application-properties.templating.spring.groovy.template.expose-session-attributes) | Whether all HttpSession attributes should be added to the model prior to merging with the template.
| `false` | | [`spring.groovy.template.expose-spring-macro-helpers`](#application-properties.templating.spring.groovy.template.expose-spring-macro-helpers) | Whether to expose a RequestContext for use by Spring's macro library, under the name "springMacroRequestContext". | `true` | | [`spring.groovy.template.prefix`](#application-properties.templating.spring.groovy.template.prefix) | Prefix that gets prepended to view names when building a URL. | | | [`spring.groovy.template.request-context-attribute`](#application-properties.templating.spring.groovy.template.request-context-attribute) | Name of the RequestContext attribute for all views. | | | [`spring.groovy.template.resource-loader-path`](#application-properties.templating.spring.groovy.template.resource-loader-path) | Template path. | `classpath:/templates/` | | [`spring.groovy.template.suffix`](#application-properties.templating.spring.groovy.template.suffix) | Suffix that gets appended to view names when building a URL. | `.tpl` | | [`spring.groovy.template.view-names`](#application-properties.templating.spring.groovy.template.view-names) | View names that can be resolved. | | | [`spring.mustache.charset`](#application-properties.templating.spring.mustache.charset) | Template encoding. | `UTF-8` | | [`spring.mustache.check-template-location`](#application-properties.templating.spring.mustache.check-template-location) | Whether to check that the templates location exists. | `true` | | [`spring.mustache.enabled`](#application-properties.templating.spring.mustache.enabled) | Whether to enable MVC view resolution for Mustache. | `true` | | [`spring.mustache.prefix`](#application-properties.templating.spring.mustache.prefix) | Prefix to apply to template names. | `classpath:/templates/` | | [`spring.mustache.reactive.media-types`](#application-properties.templating.spring.mustache.reactive.media-types) | Media types supported by Mustache views. | `text/html;charset=UTF-8` | | [`spring.mustache.request-context-attribute`](#application-properties.templating.spring.mustache.request-context-attribute) | Name of the RequestContext attribute for all views. | | | [`spring.mustache.servlet.allow-request-override`](#application-properties.templating.spring.mustache.servlet.allow-request-override) | Whether HttpServletRequest attributes are allowed to override (hide) controller generated model attributes of the same name. | `false` | | [`spring.mustache.servlet.allow-session-override`](#application-properties.templating.spring.mustache.servlet.allow-session-override) | Whether HttpSession attributes are allowed to override (hide) controller generated model attributes of the same name. | `false` | | [`spring.mustache.servlet.cache`](#application-properties.templating.spring.mustache.servlet.cache) | Whether to enable template caching. | `false` | | [`spring.mustache.servlet.content-type`](#application-properties.templating.spring.mustache.servlet.content-type) | Content-Type value. | | | [`spring.mustache.servlet.expose-request-attributes`](#application-properties.templating.spring.mustache.servlet.expose-request-attributes) | Whether all request attributes should be added to the model prior to merging with the template. | `false` | | [`spring.mustache.servlet.expose-session-attributes`](#application-properties.templating.spring.mustache.servlet.expose-session-attributes) | Whether all HttpSession attributes should be added to the model prior to merging with the template. 
| `false` | | [`spring.mustache.servlet.expose-spring-macro-helpers`](#application-properties.templating.spring.mustache.servlet.expose-spring-macro-helpers) | Whether to expose a RequestContext for use by Spring's macro library, under the name "springMacroRequestContext". | `true` | | [`spring.mustache.suffix`](#application-properties.templating.spring.mustache.suffix) | Suffix to apply to template names. | `.mustache` | | [`spring.mustache.view-names`](#application-properties.templating.spring.mustache.view-names) | View names that can be resolved. | | | [`spring.thymeleaf.cache`](#application-properties.templating.spring.thymeleaf.cache) | Whether to enable template caching. | `true` | | [`spring.thymeleaf.check-template`](#application-properties.templating.spring.thymeleaf.check-template) | Whether to check that the template exists before rendering it. | `true` | | [`spring.thymeleaf.check-template-location`](#application-properties.templating.spring.thymeleaf.check-template-location) | Whether to check that the templates location exists. | `true` | | [`spring.thymeleaf.enable-spring-el-compiler`](#application-properties.templating.spring.thymeleaf.enable-spring-el-compiler) | Enable the SpringEL compiler in SpringEL expressions. | `false` | | [`spring.thymeleaf.enabled`](#application-properties.templating.spring.thymeleaf.enabled) | Whether to enable Thymeleaf view resolution for Web frameworks. | `true` | | [`spring.thymeleaf.encoding`](#application-properties.templating.spring.thymeleaf.encoding) | Template files encoding. | `UTF-8` | | [`spring.thymeleaf.excluded-view-names`](#application-properties.templating.spring.thymeleaf.excluded-view-names) | Comma-separated list of view names (patterns allowed) that should be excluded from resolution. | | | [`spring.thymeleaf.mode`](#application-properties.templating.spring.thymeleaf.mode) | Template mode to be applied to templates. See also Thymeleaf's TemplateMode enum. | `HTML` | | [`spring.thymeleaf.prefix`](#application-properties.templating.spring.thymeleaf.prefix) | Prefix that gets prepended to view names when building a URL. | `classpath:/templates/` | | [`spring.thymeleaf.reactive.chunked-mode-view-names`](#application-properties.templating.spring.thymeleaf.reactive.chunked-mode-view-names) | Comma-separated list of view names (patterns allowed) that should be the only ones executed in CHUNKED mode when a max chunk size is set. | | | [`spring.thymeleaf.reactive.full-mode-view-names`](#application-properties.templating.spring.thymeleaf.reactive.full-mode-view-names) | Comma-separated list of view names (patterns allowed) that should be executed in FULL mode even if a max chunk size is set. | | | [`spring.thymeleaf.reactive.max-chunk-size`](#application-properties.templating.spring.thymeleaf.reactive.max-chunk-size) | Maximum size of data buffers used for writing to the response. Templates will execute in CHUNKED mode by default if this is set. | `0B` | | [`spring.thymeleaf.reactive.media-types`](#application-properties.templating.spring.thymeleaf.reactive.media-types) | Media types supported by the view technology. 
| `[text/html, application/xhtml+xml, application/xml, text/xml, application/rss+xml, application/atom+xml, application/javascript, application/ecmascript, text/javascript, text/ecmascript, application/json, text/css, text/plain, text/event-stream]` | | [`spring.thymeleaf.render-hidden-markers-before-checkboxes`](#application-properties.templating.spring.thymeleaf.render-hidden-markers-before-checkboxes) | Whether hidden form inputs acting as markers for checkboxes should be rendered before the checkbox element itself. | `false` | | [`spring.thymeleaf.servlet.content-type`](#application-properties.templating.spring.thymeleaf.servlet.content-type) | Content-Type value written to HTTP responses. | `text/html` | | [`spring.thymeleaf.servlet.produce-partial-output-while-processing`](#application-properties.templating.spring.thymeleaf.servlet.produce-partial-output-while-processing) | Whether Thymeleaf should start writing partial output as soon as possible or buffer until template processing is finished. | `true` | | [`spring.thymeleaf.suffix`](#application-properties.templating.spring.thymeleaf.suffix) | Suffix that gets appended to view names when building a URL. | `.html` | | [`spring.thymeleaf.template-resolver-order`](#application-properties.templating.spring.thymeleaf.template-resolver-order) | Order of the template resolver in the chain. By default, the template resolver is first in the chain. Order starts at 1 and should only be set if you have defined additional "TemplateResolver" beans. | | | [`spring.thymeleaf.view-names`](#application-properties.templating.spring.thymeleaf.view-names) | Comma-separated list of view names (patterns allowed) that can be resolved. | |

11. Server Properties
----------------------

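The table below lists every supported `server.*` key. As a quick orientation, here is a minimal `application.properties` sketch that combines a few of the most commonly used ones; the port, keystore location and password are illustrative placeholders rather than defaults.

```properties
# Illustrative values only - adjust to your environment.
server.port=8443
server.compression.enabled=true
server.compression.min-response-size=2KB
server.http2.enabled=true
server.shutdown=graceful
# Hypothetical keystore; point these keys at your own PKCS#12 or JKS store.
server.ssl.key-store=classpath:keystore.p12
server.ssl.key-store-type=PKCS12
server.ssl.key-store-password=changeit
```
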
| Name | Description | Default Value | | --- | --- | --- | | [`server.address`](#application-properties.server.server.address) | Network address to which the server should bind. | | | [`server.compression.enabled`](#application-properties.server.server.compression.enabled) | Whether response compression is enabled. | `false` | | [`server.compression.excluded-user-agents`](#application-properties.server.server.compression.excluded-user-agents) | Comma-separated list of user agents for which responses should not be compressed. | | | [`server.compression.mime-types`](#application-properties.server.server.compression.mime-types) | Comma-separated list of MIME types that should be compressed. | `[text/html, text/xml, text/plain, text/css, text/javascript, application/javascript, application/json, application/xml]` | | [`server.compression.min-response-size`](#application-properties.server.server.compression.min-response-size) | Minimum "Content-Length" value that is required for compression to be performed. | `2KB` | | [`server.error.include-binding-errors`](#application-properties.server.server.error.include-binding-errors) | When to include "errors" attribute. | `never` | | [`server.error.include-exception`](#application-properties.server.server.error.include-exception) | Include the "exception" attribute. | `false` | | [`server.error.include-message`](#application-properties.server.server.error.include-message) | When to include "message" attribute. | `never` | | [`server.error.include-stacktrace`](#application-properties.server.server.error.include-stacktrace) | When to include the "trace" attribute. | `never` | | [`server.error.path`](#application-properties.server.server.error.path) | Path of the error controller.
| `/error` | | [`server.error.whitelabel.enabled`](#application-properties.server.server.error.whitelabel.enabled) | Whether to enable the default error page displayed in browsers in case of a server error. | `true` | | [`server.forward-headers-strategy`](#application-properties.server.server.forward-headers-strategy) | Strategy for handling X-Forwarded-\* headers. | | | [`server.http2.enabled`](#application-properties.server.server.http2.enabled) | Whether to enable HTTP/2 support, if the current environment supports it. | `false` | | [`server.jetty.accesslog.append`](#application-properties.server.server.jetty.accesslog.append) | Append to log. | `false` | | [`server.jetty.accesslog.custom-format`](#application-properties.server.server.jetty.accesslog.custom-format) | Custom log format, see org.eclipse.jetty.server.CustomRequestLog. If defined, overrides the "format" configuration key. | | | [`server.jetty.accesslog.enabled`](#application-properties.server.server.jetty.accesslog.enabled) | Enable access log. | `false` | | [`server.jetty.accesslog.file-date-format`](#application-properties.server.server.jetty.accesslog.file-date-format) | Date format to place in log file name. | | | [`server.jetty.accesslog.filename`](#application-properties.server.server.jetty.accesslog.filename) | Log filename. If not specified, logs redirect to "System.err". | | | [`server.jetty.accesslog.format`](#application-properties.server.server.jetty.accesslog.format) | Log format. | `ncsa` | | [`server.jetty.accesslog.ignore-paths`](#application-properties.server.server.jetty.accesslog.ignore-paths) | Request paths that should not be logged. | | | [`server.jetty.accesslog.retention-period`](#application-properties.server.server.jetty.accesslog.retention-period) | Number of days before rotated log files are deleted. | `31` | | [`server.jetty.connection-idle-timeout`](#application-properties.server.server.jetty.connection-idle-timeout) | Time that the connection can be idle before it is closed. | | | [`server.jetty.max-http-form-post-size`](#application-properties.server.server.jetty.max-http-form-post-size) | Maximum size of the form content in any HTTP post request. | `200000B` | | [`server.jetty.threads.acceptors`](#application-properties.server.server.jetty.threads.acceptors) | Number of acceptor threads to use. When the value is -1, the default, the number of acceptors is derived from the operating environment. | `-1` | | [`server.jetty.threads.idle-timeout`](#application-properties.server.server.jetty.threads.idle-timeout) | Maximum thread idle time. | `60000ms` | | [`server.jetty.threads.max`](#application-properties.server.server.jetty.threads.max) | Maximum number of threads. | `200` | | [`server.jetty.threads.max-queue-capacity`](#application-properties.server.server.jetty.threads.max-queue-capacity) | Maximum capacity of the thread pool's backing queue. A default is computed based on the threading configuration. | | | [`server.jetty.threads.min`](#application-properties.server.server.jetty.threads.min) | Minimum number of threads. | `8` | | [`server.jetty.threads.selectors`](#application-properties.server.server.jetty.threads.selectors) | Number of selector threads to use. When the value is -1, the default, the number of selectors is derived from the operating environment. | `-1` | | [`server.max-http-header-size`](#application-properties.server.server.max-http-header-size) | Maximum size of the HTTP message header.
| `8KB` | | [`server.netty.connection-timeout`](#application-properties.server.server.netty.connection-timeout) | Connection timeout of the Netty channel. | | | [`server.netty.h2c-max-content-length`](#application-properties.server.server.netty.h2c-max-content-length) | Maximum content length of an H2C upgrade request. | `0B` | | [`server.netty.idle-timeout`](#application-properties.server.server.netty.idle-timeout) | Idle timeout of the Netty channel. When not specified, an infinite timeout is used. | | | [`server.netty.initial-buffer-size`](#application-properties.server.server.netty.initial-buffer-size) | Initial buffer size for HTTP request decoding. | `128B` | | [`server.netty.max-chunk-size`](#application-properties.server.server.netty.max-chunk-size) | Maximum chunk size that can be decoded for an HTTP request. | `8KB` | | [`server.netty.max-initial-line-length`](#application-properties.server.server.netty.max-initial-line-length) | Maximum length that can be decoded for an HTTP request's initial line. | `4KB` | | [`server.netty.max-keep-alive-requests`](#application-properties.server.server.netty.max-keep-alive-requests) | Maximum number of requests that can be made per connection. By default, a connection serves an unlimited number of requests. | | | [`server.netty.validate-headers`](#application-properties.server.server.netty.validate-headers) | Whether to validate headers when decoding requests. | `true` | | [`server.port`](#application-properties.server.server.port) | Server HTTP port. | `8080` | | [`server.reactive.session.timeout`](#application-properties.server.server.reactive.session.timeout) | Session timeout. If a duration suffix is not specified, seconds will be used. | `30m` | | [`server.server-header`](#application-properties.server.server.server-header) | Value to use for the Server response header (if empty, no header is sent). | | | [`server.servlet.application-display-name`](#application-properties.server.server.servlet.application-display-name) | Display name of the application. | `application` | | [`server.servlet.context-parameters.*`](#application-properties.server.server.servlet.context-parameters) | Servlet context init parameters. | | | [`server.servlet.context-path`](#application-properties.server.server.servlet.context-path) | Context path of the application. | | | [`server.servlet.encoding.charset`](#application-properties.server.server.servlet.encoding.charset) | | | | [`server.servlet.encoding.enabled`](#application-properties.server.server.servlet.encoding.enabled) | Whether to enable HTTP encoding support. | `true` | | [`server.servlet.encoding.force`](#application-properties.server.server.servlet.encoding.force) | | | | [`server.servlet.encoding.force-request`](#application-properties.server.server.servlet.encoding.force-request) | | | | [`server.servlet.encoding.force-response`](#application-properties.server.server.servlet.encoding.force-response) | | | | [`server.servlet.encoding.mapping.*`](#application-properties.server.server.servlet.encoding.mapping) | | | | [`server.servlet.jsp.class-name`](#application-properties.server.server.servlet.jsp.class-name) | Class name of the servlet to use for JSPs. If registered is true and this class is on the classpath then it will be registered. | `org.apache.jasper.servlet.JspServlet` | | [`server.servlet.jsp.init-parameters.*`](#application-properties.server.server.servlet.jsp.init-parameters) | Init parameters used to configure the JSP servlet.
| | | [`server.servlet.jsp.registered`](#application-properties.server.server.servlet.jsp.registered) | Whether the JSP servlet is registered. | `true` | | [`server.servlet.register-default-servlet`](#application-properties.server.server.servlet.register-default-servlet) | Whether to register the default Servlet with the container. | `false` | | [`server.servlet.session.cookie.comment`](#application-properties.server.server.servlet.session.cookie.comment) | | | | [`server.servlet.session.cookie.domain`](#application-properties.server.server.servlet.session.cookie.domain) | | | | [`server.servlet.session.cookie.http-only`](#application-properties.server.server.servlet.session.cookie.http-only) | | | | [`server.servlet.session.cookie.max-age`](#application-properties.server.server.servlet.session.cookie.max-age) | | | | [`server.servlet.session.cookie.name`](#application-properties.server.server.servlet.session.cookie.name) | | | | [`server.servlet.session.cookie.path`](#application-properties.server.server.servlet.session.cookie.path) | | | | [`server.servlet.session.cookie.same-site`](#application-properties.server.server.servlet.session.cookie.same-site) | | | | [`server.servlet.session.cookie.secure`](#application-properties.server.server.servlet.session.cookie.secure) | | | | [`server.servlet.session.persistent`](#application-properties.server.server.servlet.session.persistent) | Whether to persist session data between restarts. | `false` | | [`server.servlet.session.store-dir`](#application-properties.server.server.servlet.session.store-dir) | Directory used to store session data. | | | [`server.servlet.session.timeout`](#application-properties.server.server.servlet.session.timeout) | Session timeout. If a duration suffix is not specified, seconds will be used. | `30m` | | [`server.servlet.session.tracking-modes`](#application-properties.server.server.servlet.session.tracking-modes) | Session tracking modes. | | | [`server.shutdown`](#application-properties.server.server.shutdown) | Type of shutdown that the server will support. | `immediate` | | [`server.ssl.certificate`](#application-properties.server.server.ssl.certificate) | Path to a PEM-encoded SSL certificate file. | | | [`server.ssl.certificate-private-key`](#application-properties.server.server.ssl.certificate-private-key) | Path to a PEM-encoded private key file for the SSL certificate. | | | [`server.ssl.ciphers`](#application-properties.server.server.ssl.ciphers) | Supported SSL ciphers. | | | [`server.ssl.client-auth`](#application-properties.server.server.ssl.client-auth) | Client authentication mode. Requires a trust store. | | | [`server.ssl.enabled`](#application-properties.server.server.ssl.enabled) | Whether to enable SSL support. | `true` | | [`server.ssl.enabled-protocols`](#application-properties.server.server.ssl.enabled-protocols) | Enabled SSL protocols. | | | [`server.ssl.key-alias`](#application-properties.server.server.ssl.key-alias) | Alias that identifies the key in the key store. | | | [`server.ssl.key-password`](#application-properties.server.server.ssl.key-password) | Password used to access the key in the key store. | | | [`server.ssl.key-store`](#application-properties.server.server.ssl.key-store) | Path to the key store that holds the SSL certificate (typically a jks file). | | | [`server.ssl.key-store-password`](#application-properties.server.server.ssl.key-store-password) | Password used to access the key store. 
| | | [`server.ssl.key-store-provider`](#application-properties.server.server.ssl.key-store-provider) | Provider for the key store. | | | [`server.ssl.key-store-type`](#application-properties.server.server.ssl.key-store-type) | Type of the key store. | | | [`server.ssl.protocol`](#application-properties.server.server.ssl.protocol) | SSL protocol to use. | `TLS` | | [`server.ssl.trust-certificate`](#application-properties.server.server.ssl.trust-certificate) | Path to a PEM-encoded SSL certificate authority file. | | | [`server.ssl.trust-certificate-private-key`](#application-properties.server.server.ssl.trust-certificate-private-key) | Path to a PEM-encoded private key file for the SSL certificate authority. | | | [`server.ssl.trust-store`](#application-properties.server.server.ssl.trust-store) | Trust store that holds SSL certificates. | | | [`server.ssl.trust-store-password`](#application-properties.server.server.ssl.trust-store-password) | Password used to access the trust store. | | | [`server.ssl.trust-store-provider`](#application-properties.server.server.ssl.trust-store-provider) | Provider for the trust store. | | | [`server.ssl.trust-store-type`](#application-properties.server.server.ssl.trust-store-type) | Type of the trust store. | | | [`server.tomcat.accept-count`](#application-properties.server.server.tomcat.accept-count) | Maximum queue length for incoming connection requests when all possible request processing threads are in use. | `100` | | [`server.tomcat.accesslog.buffered`](#application-properties.server.server.tomcat.accesslog.buffered) | Whether to buffer output such that it is flushed only periodically. | `true` | | [`server.tomcat.accesslog.check-exists`](#application-properties.server.server.tomcat.accesslog.check-exists) | Whether to check for log file existence so it can be recreated if an external process has renamed it. | `false` | | [`server.tomcat.accesslog.condition-if`](#application-properties.server.server.tomcat.accesslog.condition-if) | Whether logging of the request will only be enabled if "ServletRequest.getAttribute(conditionIf)" does not yield null. | | | [`server.tomcat.accesslog.condition-unless`](#application-properties.server.server.tomcat.accesslog.condition-unless) | Whether logging of the request will only be enabled if "ServletRequest.getAttribute(conditionUnless)" yields null. | | | [`server.tomcat.accesslog.directory`](#application-properties.server.server.tomcat.accesslog.directory) | Directory in which log files are created. Can be absolute or relative to the Tomcat base dir. | `logs` | | [`server.tomcat.accesslog.enabled`](#application-properties.server.server.tomcat.accesslog.enabled) | Enable access log. | `false` | | [`server.tomcat.accesslog.encoding`](#application-properties.server.server.tomcat.accesslog.encoding) | Character set used by the log file. Defaults to the system default character set. | | | [`server.tomcat.accesslog.file-date-format`](#application-properties.server.server.tomcat.accesslog.file-date-format) | Date format to place in the log file name. | `.yyyy-MM-dd` | | [`server.tomcat.accesslog.ipv6-canonical`](#application-properties.server.server.tomcat.accesslog.ipv6-canonical) | Whether to use IPv6 canonical representation format as defined by RFC 5952. | `false` | | [`server.tomcat.accesslog.locale`](#application-properties.server.server.tomcat.accesslog.locale) | Locale used to format timestamps in log entries and in log file name suffix. Defaults to the default locale of the Java process.
| | | [`server.tomcat.accesslog.max-days`](#application-properties.server.server.tomcat.accesslog.max-days) | Number of days to retain the access log files before they are removed. | `-1` | | [`server.tomcat.accesslog.pattern`](#application-properties.server.server.tomcat.accesslog.pattern) | Format pattern for access logs. | `common` | | [`server.tomcat.accesslog.prefix`](#application-properties.server.server.tomcat.accesslog.prefix) | Log file name prefix. | `access_log` | | [`server.tomcat.accesslog.rename-on-rotate`](#application-properties.server.server.tomcat.accesslog.rename-on-rotate) | Whether to defer inclusion of the date stamp in the file name until rotate time. | `false` | | [`server.tomcat.accesslog.request-attributes-enabled`](#application-properties.server.server.tomcat.accesslog.request-attributes-enabled) | Set request attributes for the IP address, Hostname, protocol, and port used for the request. | `false` | | [`server.tomcat.accesslog.rotate`](#application-properties.server.server.tomcat.accesslog.rotate) | Whether to enable access log rotation. | `true` | | [`server.tomcat.accesslog.suffix`](#application-properties.server.server.tomcat.accesslog.suffix) | Log file name suffix. | `.log` | | [`server.tomcat.additional-tld-skip-patterns`](#application-properties.server.server.tomcat.additional-tld-skip-patterns) | Comma-separated list of additional patterns that match jars to ignore for TLD scanning. The special '?' and '\*' characters can be used in the pattern to match one and only one character and zero or more characters respectively. | | | [`server.tomcat.background-processor-delay`](#application-properties.server.server.tomcat.background-processor-delay) | Delay between the invocation of backgroundProcess methods. If a duration suffix is not specified, seconds will be used. | `10s` | | [`server.tomcat.basedir`](#application-properties.server.server.tomcat.basedir) | Tomcat base directory. If not specified, a temporary directory is used. | | | [`server.tomcat.connection-timeout`](#application-properties.server.server.tomcat.connection-timeout) | Amount of time the connector will wait, after accepting a connection, for the request URI line to be presented. | | | [`server.tomcat.keep-alive-timeout`](#application-properties.server.server.tomcat.keep-alive-timeout) | Time to wait for another HTTP request before the connection is closed. When not set the connectionTimeout is used. When set to -1 there will be no timeout. | | | [`server.tomcat.max-connections`](#application-properties.server.server.tomcat.max-connections) | Maximum number of connections that the server accepts and processes at any given time. Once the limit has been reached, the operating system may still accept connections based on the "acceptCount" property. | `8192` | | [`server.tomcat.max-http-form-post-size`](#application-properties.server.server.tomcat.max-http-form-post-size) | Maximum size of the form content in any HTTP post request. | `2MB` | | [`server.tomcat.max-keep-alive-requests`](#application-properties.server.server.tomcat.max-keep-alive-requests) | Maximum number of HTTP requests that can be pipelined before the connection is closed. When set to 0 or 1, keep-alive and pipelining are disabled. When set to -1, an unlimited number of pipelined or keep-alive requests are allowed. | `100` | | [`server.tomcat.max-swallow-size`](#application-properties.server.server.tomcat.max-swallow-size) | Maximum amount of request body to swallow. 
| `2MB` | | [`server.tomcat.mbeanregistry.enabled`](#application-properties.server.server.tomcat.mbeanregistry.enabled) | Whether Tomcat's MBean Registry should be enabled. | `false` | | [`server.tomcat.processor-cache`](#application-properties.server.server.tomcat.processor-cache) | Maximum number of idle processors that will be retained in the cache and reused with a subsequent request. When set to -1 the cache will be unlimited with a theoretical maximum size equal to the maximum number of connections. | `200` | | [`server.tomcat.redirect-context-root`](#application-properties.server.server.tomcat.redirect-context-root) | Whether requests to the context root should be redirected by appending a / to the path. When using SSL terminated at a proxy, this property should be set to false. | `true` | | [`server.tomcat.reject-illegal-header`](#application-properties.server.server.tomcat.reject-illegal-header) | Whether to reject requests with illegal header names or values. | `true` | | [`server.tomcat.relaxed-path-chars`](#application-properties.server.server.tomcat.relaxed-path-chars) | Comma-separated list of additional unencoded characters that should be allowed in URI paths. Only "< > [ \ ] ^ ` { | }" are allowed. | | | [`server.tomcat.relaxed-query-chars`](#application-properties.server.server.tomcat.relaxed-query-chars) | Comma-separated list of additional unencoded characters that should be allowed in URI query strings. Only "< > [ \ ] ^ ` { | }" are allowed. | | | [`server.tomcat.remoteip.host-header`](#application-properties.server.server.tomcat.remoteip.host-header) | Name of the HTTP header from which the remote host is extracted. | `X-Forwarded-Host` | | [`server.tomcat.remoteip.internal-proxies`](#application-properties.server.server.tomcat.remoteip.internal-proxies) | Regular expression that matches proxies that are to be trusted. | `10\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}|192\\.168\\.\\d{1,3}\\.\\d{1,3}|169\\.254\\.\\d{1,3}\\.\\d{1,3}|127\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}|172\\.1[6-9]{1}\\.\\d{1,3}\\.\\d{1,3}|172\\.2[0-9]{1}\\.\\d{1,3}\\.\\d{1,3}|172\\.3[0-1]{1}\\.\\d{1,3}\\.\\d{1,3}|0:0:0:0:0:0:0:1|::1` | | [`server.tomcat.remoteip.port-header`](#application-properties.server.server.tomcat.remoteip.port-header) | Name of the HTTP header used to override the original port value. | `X-Forwarded-Port` | | [`server.tomcat.remoteip.protocol-header`](#application-properties.server.server.tomcat.remoteip.protocol-header) | Header that holds the incoming protocol, usually named "X-Forwarded-Proto". | | | [`server.tomcat.remoteip.protocol-header-https-value`](#application-properties.server.server.tomcat.remoteip.protocol-header-https-value) | Value of the protocol header indicating whether the incoming request uses SSL. | `https` | | [`server.tomcat.remoteip.remote-ip-header`](#application-properties.server.server.tomcat.remoteip.remote-ip-header) | Name of the HTTP header from which the remote IP is extracted. For instance, 'X-FORWARDED-FOR'. | | | [`server.tomcat.resource.allow-caching`](#application-properties.server.server.tomcat.resource.allow-caching) | Whether static resource caching is permitted for this web application. | `true` | | [`server.tomcat.resource.cache-ttl`](#application-properties.server.server.tomcat.resource.cache-ttl) | Time-to-live of the static resource cache. | | | [`server.tomcat.threads.max`](#application-properties.server.server.tomcat.threads.max) | Maximum amount of worker threads. 
| `200` | | [`server.tomcat.threads.min-spare`](#application-properties.server.server.tomcat.threads.min-spare) | Minimum amount of worker threads. | `10` | | [`server.tomcat.uri-encoding`](#application-properties.server.server.tomcat.uri-encoding) | Character encoding to use to decode the URI. | `UTF-8` | | [`server.tomcat.use-relative-redirects`](#application-properties.server.server.tomcat.use-relative-redirects) | Whether HTTP 1.1 and later location headers generated by a call to sendRedirect will use relative or absolute redirects. | `false` | | [`server.undertow.accesslog.dir`](#application-properties.server.server.undertow.accesslog.dir) | Undertow access log directory. | | | [`server.undertow.accesslog.enabled`](#application-properties.server.server.undertow.accesslog.enabled) | Whether to enable the access log. | `false` | | [`server.undertow.accesslog.pattern`](#application-properties.server.server.undertow.accesslog.pattern) | Format pattern for access logs. | `common` | | [`server.undertow.accesslog.prefix`](#application-properties.server.server.undertow.accesslog.prefix) | Log file name prefix. | `access_log.` | | [`server.undertow.accesslog.rotate`](#application-properties.server.server.undertow.accesslog.rotate) | Whether to enable access log rotation. | `true` | | [`server.undertow.accesslog.suffix`](#application-properties.server.server.undertow.accesslog.suffix) | Log file name suffix. | `log` | | [`server.undertow.allow-encoded-slash`](#application-properties.server.server.undertow.allow-encoded-slash) | Whether the server should decode percent encoded slash characters. Enabling encoded slashes can have security implications due to different servers interpreting the slash differently. Only enable this if you have a legacy application that requires it. | `false` | | [`server.undertow.always-set-keep-alive`](#application-properties.server.server.undertow.always-set-keep-alive) | Whether the 'Connection: keep-alive' header should be added to all responses, even if not required by the HTTP specification. | `true` | | [`server.undertow.buffer-size`](#application-properties.server.server.undertow.buffer-size) | Size of each buffer. The default is derived from the maximum amount of memory that is available to the JVM. | | | [`server.undertow.decode-url`](#application-properties.server.server.undertow.decode-url) | Whether the URL should be decoded. When disabled, percent-encoded characters in the URL will be left as-is. | `true` | | [`server.undertow.direct-buffers`](#application-properties.server.server.undertow.direct-buffers) | Whether to allocate buffers outside the Java heap. The default is derived from the maximum amount of memory that is available to the JVM. | | | [`server.undertow.eager-filter-init`](#application-properties.server.server.undertow.eager-filter-init) | Whether servlet filters should be initialized on startup. | `true` | | [`server.undertow.max-cookies`](#application-properties.server.server.undertow.max-cookies) | Maximum number of cookies that are allowed. This limit exists to prevent hash collision based DOS attacks. | `200` | | [`server.undertow.max-headers`](#application-properties.server.server.undertow.max-headers) | Maximum number of headers that are allowed. This limit exists to prevent hash collision based DOS attacks. | | | [`server.undertow.max-http-post-size`](#application-properties.server.server.undertow.max-http-post-size) | Maximum size of the HTTP post content. When the value is -1, the default, the size is unlimited. 
| `-1B` | | [`server.undertow.max-parameters`](#application-properties.server.server.undertow.max-parameters) | Maximum number of query or path parameters that are allowed. This limit exists to prevent hash collision based DOS attacks. | | | [`server.undertow.no-request-timeout`](#application-properties.server.server.undertow.no-request-timeout) | Amount of time a connection can sit idle without processing a request, before it is closed by the server. | | | [`server.undertow.options.server.*`](#application-properties.server.server.undertow.options.server) | | | | [`server.undertow.options.socket.*`](#application-properties.server.server.undertow.options.socket) | | | | [`server.undertow.preserve-path-on-forward`](#application-properties.server.server.undertow.preserve-path-on-forward) | Whether to preserve the path of a request when it is forwarded. | `false` | | [`server.undertow.threads.io`](#application-properties.server.server.undertow.threads.io) | Number of I/O threads to create for the worker. The default is derived from the number of available processors. | | | [`server.undertow.threads.worker`](#application-properties.server.server.undertow.threads.worker) | Number of worker threads. The default is 8 times the number of I/O threads. | | | [`server.undertow.url-charset`](#application-properties.server.server.undertow.url-charset) | Charset used to decode URLs. | `UTF-8` |

12. Security Properties
------------------------

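The full list of `spring.security.*` keys follows. As a hedged illustration of a common JWT resource-server setup, the snippet below uses a hypothetical issuer, audience and fallback in-memory user; none of these values are defaults.

```properties
# Hypothetical identity provider - replace with your own issuer.
spring.security.oauth2.resourceserver.jwt.issuer-uri=https://idp.example.com/realms/demo
spring.security.oauth2.resourceserver.jwt.audiences=my-api
# Optional fallback user for local development (illustrative credentials).
spring.security.user.name=admin
spring.security.user.password=change-me
spring.security.user.roles=ADMIN
```
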
| Name | Description | Default Value | | --- | --- | --- | | [`spring.security.filter.dispatcher-types`](#application-properties.security.spring.security.filter.dispatcher-types) | Security filter chain dispatcher types. | `[async, error, request]` | | [`spring.security.filter.order`](#application-properties.security.spring.security.filter.order) | Security filter chain order. | `-100` | | [`spring.security.oauth2.client.provider.*`](#application-properties.security.spring.security.oauth2.client.provider) | OAuth provider details. | | | [`spring.security.oauth2.client.registration.*`](#application-properties.security.spring.security.oauth2.client.registration) | OAuth client registrations. | | | [`spring.security.oauth2.resourceserver.jwt.audiences`](#application-properties.security.spring.security.oauth2.resourceserver.jwt.audiences) | Identifies the recipients that the JWT is intended for. | | | [`spring.security.oauth2.resourceserver.jwt.issuer-uri`](#application-properties.security.spring.security.oauth2.resourceserver.jwt.issuer-uri) | URI that can either be an OpenID Connect discovery endpoint or an OAuth 2.0 Authorization Server Metadata endpoint defined by RFC 8414. | | | [`spring.security.oauth2.resourceserver.jwt.jwk-set-uri`](#application-properties.security.spring.security.oauth2.resourceserver.jwt.jwk-set-uri) | JSON Web Key URI to use to verify the JWT token. | | | [`spring.security.oauth2.resourceserver.jwt.jws-algorithm`](#application-properties.security.spring.security.oauth2.resourceserver.jwt.jws-algorithm) | JSON Web Algorithm used for verifying the digital signatures. | `RS256` | | [`spring.security.oauth2.resourceserver.jwt.public-key-location`](#application-properties.security.spring.security.oauth2.resourceserver.jwt.public-key-location) | Location of the file containing the public key used to verify a JWT. | | | [`spring.security.oauth2.resourceserver.opaquetoken.client-id`](#application-properties.security.spring.security.oauth2.resourceserver.opaquetoken.client-id) | Client id used to authenticate with the token introspection endpoint.
| | | [`spring.security.oauth2.resourceserver.opaquetoken.client-secret`](#application-properties.security.spring.security.oauth2.resourceserver.opaquetoken.client-secret) | Client secret used to authenticate with the token introspection endpoint. | | | [`spring.security.oauth2.resourceserver.opaquetoken.introspection-uri`](#application-properties.security.spring.security.oauth2.resourceserver.opaquetoken.introspection-uri) | OAuth 2.0 endpoint through which token introspection is accomplished. | | | [`spring.security.saml2.relyingparty.registration.*`](#application-properties.security.spring.security.saml2.relyingparty.registration) | SAML2 relying party registrations. | | | [`spring.security.user.name`](#application-properties.security.spring.security.user.name) | Default user name. | `user` | | [`spring.security.user.password`](#application-properties.security.spring.security.user.password) | Password for the default user name. | | | [`spring.security.user.roles`](#application-properties.security.spring.security.user.roles) | Granted roles for the default user name. | |

13. RSocket Properties
-----------------------

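Before the key listing, a minimal sketch of exposing RSocket over the WebSocket transport; the mapping path is an illustrative assumption and, as noted in the table below, only applies when the transport is `websocket`.

```properties
# Illustrative values only; mapping-path is honoured by the websocket transport.
spring.rsocket.server.transport=websocket
spring.rsocket.server.mapping-path=/rsocket
```
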
| Name | Description | Default Value | | --- | --- | --- | | [`spring.rsocket.server.address`](#application-properties.rsocket.spring.rsocket.server.address) | Network address to which the server should bind. | | | [`spring.rsocket.server.fragment-size`](#application-properties.rsocket.spring.rsocket.server.fragment-size) | Maximum transmission unit. Frames larger than the specified value are fragmented. | | | [`spring.rsocket.server.mapping-path`](#application-properties.rsocket.spring.rsocket.server.mapping-path) | Path under which RSocket handles requests (only works with websocket transport). | | | [`spring.rsocket.server.port`](#application-properties.rsocket.spring.rsocket.server.port) | Server port.
| | | [`spring.rsocket.server.ssl.certificate`](#application-properties.rsocket.spring.rsocket.server.ssl.certificate) | | | | [`spring.rsocket.server.ssl.certificate-private-key`](#application-properties.rsocket.spring.rsocket.server.ssl.certificate-private-key) | | | | [`spring.rsocket.server.ssl.ciphers`](#application-properties.rsocket.spring.rsocket.server.ssl.ciphers) | | | | [`spring.rsocket.server.ssl.client-auth`](#application-properties.rsocket.spring.rsocket.server.ssl.client-auth) | | | | [`spring.rsocket.server.ssl.enabled`](#application-properties.rsocket.spring.rsocket.server.ssl.enabled) | | | | [`spring.rsocket.server.ssl.enabled-protocols`](#application-properties.rsocket.spring.rsocket.server.ssl.enabled-protocols) | | | | [`spring.rsocket.server.ssl.key-alias`](#application-properties.rsocket.spring.rsocket.server.ssl.key-alias) | | | | [`spring.rsocket.server.ssl.key-password`](#application-properties.rsocket.spring.rsocket.server.ssl.key-password) | | | | [`spring.rsocket.server.ssl.key-store`](#application-properties.rsocket.spring.rsocket.server.ssl.key-store) | | | | [`spring.rsocket.server.ssl.key-store-password`](#application-properties.rsocket.spring.rsocket.server.ssl.key-store-password) | | | | [`spring.rsocket.server.ssl.key-store-provider`](#application-properties.rsocket.spring.rsocket.server.ssl.key-store-provider) | | | | [`spring.rsocket.server.ssl.key-store-type`](#application-properties.rsocket.spring.rsocket.server.ssl.key-store-type) | | | | [`spring.rsocket.server.ssl.protocol`](#application-properties.rsocket.spring.rsocket.server.ssl.protocol) | | | | [`spring.rsocket.server.ssl.trust-certificate`](#application-properties.rsocket.spring.rsocket.server.ssl.trust-certificate) | | | | [`spring.rsocket.server.ssl.trust-certificate-private-key`](#application-properties.rsocket.spring.rsocket.server.ssl.trust-certificate-private-key) | | | | [`spring.rsocket.server.ssl.trust-store`](#application-properties.rsocket.spring.rsocket.server.ssl.trust-store) | | | | [`spring.rsocket.server.ssl.trust-store-password`](#application-properties.rsocket.spring.rsocket.server.ssl.trust-store-password) | | | | [`spring.rsocket.server.ssl.trust-store-provider`](#application-properties.rsocket.spring.rsocket.server.ssl.trust-store-provider) | | | | [`spring.rsocket.server.ssl.trust-store-type`](#application-properties.rsocket.spring.rsocket.server.ssl.trust-store-type) | | | | [`spring.rsocket.server.transport`](#application-properties.rsocket.spring.rsocket.server.transport) | RSocket transport protocol. | `tcp` |

14. Actuator Properties
------------------------

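All actuator keys are listed below. As a hedged starting point, this sketch widens web exposure beyond the `health`-only default and switches on the liveness/readiness probes; the chosen endpoint list is an example, not a recommendation.

```properties
# Illustrative exposure - only the health endpoint is exposed over the web by default.
management.endpoints.web.exposure.include=health,info,metrics,prometheus
management.endpoint.health.show-details=always
management.endpoint.health.probes.enabled=true
management.endpoints.web.base-path=/actuator
```
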
| Name | Description | Default Value | | --- | --- | --- | | [`management.auditevents.enabled`](#application-properties.actuator.management.auditevents.enabled) | Whether to enable storage of audit events. | `true` | | [`management.cloudfoundry.enabled`](#application-properties.actuator.management.cloudfoundry.enabled) | Whether to enable extended Cloud Foundry actuator endpoints. | `true` | | [`management.cloudfoundry.skip-ssl-validation`](#application-properties.actuator.management.cloudfoundry.skip-ssl-validation) | Whether to skip SSL verification for Cloud Foundry actuator endpoint security calls. | `false` | | [`management.endpoint.auditevents.cache.time-to-live`](#application-properties.actuator.management.endpoint.auditevents.cache.time-to-live) | Maximum time that a response can be cached.
| `0ms` | | [`management.endpoint.auditevents.enabled`](#application-properties.actuator.management.endpoint.auditevents.enabled) | Whether to enable the auditevents endpoint. | `true` | | [`management.endpoint.beans.cache.time-to-live`](#application-properties.actuator.management.endpoint.beans.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.beans.enabled`](#application-properties.actuator.management.endpoint.beans.enabled) | Whether to enable the beans endpoint. | `true` | | [`management.endpoint.caches.cache.time-to-live`](#application-properties.actuator.management.endpoint.caches.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.caches.enabled`](#application-properties.actuator.management.endpoint.caches.enabled) | Whether to enable the caches endpoint. | `true` | | [`management.endpoint.conditions.cache.time-to-live`](#application-properties.actuator.management.endpoint.conditions.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.conditions.enabled`](#application-properties.actuator.management.endpoint.conditions.enabled) | Whether to enable the conditions endpoint. | `true` | | [`management.endpoint.configprops.additional-keys-to-sanitize`](#application-properties.actuator.management.endpoint.configprops.additional-keys-to-sanitize) | Keys that should be sanitized in addition to those already configured. Keys can be simple strings that the property ends with or regular expressions. | | | [`management.endpoint.configprops.cache.time-to-live`](#application-properties.actuator.management.endpoint.configprops.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.configprops.enabled`](#application-properties.actuator.management.endpoint.configprops.enabled) | Whether to enable the configprops endpoint. | `true` | | [`management.endpoint.configprops.keys-to-sanitize`](#application-properties.actuator.management.endpoint.configprops.keys-to-sanitize) | Keys that should be sanitized. Keys can be simple strings that the property ends with or regular expressions. | `[password, secret, key, token, .*credentials.*, vcap_services, sun.java.command]` | | [`management.endpoint.env.additional-keys-to-sanitize`](#application-properties.actuator.management.endpoint.env.additional-keys-to-sanitize) | Keys that should be sanitized in addition to those already configured. Keys can be simple strings that the property ends with or regular expressions. | | | [`management.endpoint.env.cache.time-to-live`](#application-properties.actuator.management.endpoint.env.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.env.enabled`](#application-properties.actuator.management.endpoint.env.enabled) | Whether to enable the env endpoint. | `true` | | [`management.endpoint.env.keys-to-sanitize`](#application-properties.actuator.management.endpoint.env.keys-to-sanitize) | Keys that should be sanitized. Keys can be simple strings that the property ends with or regular expressions. | `[password, secret, key, token, .*credentials.*, vcap_services, sun.java.command]` | | [`management.endpoint.flyway.cache.time-to-live`](#application-properties.actuator.management.endpoint.flyway.cache.time-to-live) | Maximum time that a response can be cached.
| `0ms` | | [`management.endpoint.flyway.enabled`](#application-properties.actuator.management.endpoint.flyway.enabled) | Whether to enable the flyway endpoint. | `true` | | [`management.endpoint.health.cache.time-to-live`](#application-properties.actuator.management.endpoint.health.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.health.enabled`](#application-properties.actuator.management.endpoint.health.enabled) | Whether to enable the health endpoint. | `true` | | [`management.endpoint.health.group.*`](#application-properties.actuator.management.endpoint.health.group) | Health endpoint groups. | | | [`management.endpoint.health.probes.add-additional-paths`](#application-properties.actuator.management.endpoint.health.probes.add-additional-paths) | Whether to make the liveness and readiness health groups available on the main server port. | `false` | | [`management.endpoint.health.probes.enabled`](#application-properties.actuator.management.endpoint.health.probes.enabled) | Whether to enable liveness and readiness probes. | `false` | | [`management.endpoint.health.roles`](#application-properties.actuator.management.endpoint.health.roles) | Roles used to determine whether or not a user is authorized to be shown details. When empty, all authenticated users are authorized. | | | [`management.endpoint.health.show-components`](#application-properties.actuator.management.endpoint.health.show-components) | When to show components. If not specified the 'show-details' setting will be used. | | | [`management.endpoint.health.show-details`](#application-properties.actuator.management.endpoint.health.show-details) | When to show full health details. | `never` | | [`management.endpoint.health.status.http-mapping.*`](#application-properties.actuator.management.endpoint.health.status.http-mapping) | Mapping of health statuses to HTTP status codes. By default, registered health statuses map to sensible defaults (for example, UP maps to 200). | | | [`management.endpoint.health.status.order`](#application-properties.actuator.management.endpoint.health.status.order) | Comma-separated list of health statuses in order of severity. | `[DOWN, OUT_OF_SERVICE, UP, UNKNOWN]` | | [`management.endpoint.heapdump.cache.time-to-live`](#application-properties.actuator.management.endpoint.heapdump.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.heapdump.enabled`](#application-properties.actuator.management.endpoint.heapdump.enabled) | Whether to enable the heapdump endpoint. | `true` | | [`management.endpoint.httptrace.cache.time-to-live`](#application-properties.actuator.management.endpoint.httptrace.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.httptrace.enabled`](#application-properties.actuator.management.endpoint.httptrace.enabled) | Whether to enable the httptrace endpoint. | `true` | | [`management.endpoint.info.cache.time-to-live`](#application-properties.actuator.management.endpoint.info.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.info.enabled`](#application-properties.actuator.management.endpoint.info.enabled) | Whether to enable the info endpoint. | `true` | | [`management.endpoint.integrationgraph.cache.time-to-live`](#application-properties.actuator.management.endpoint.integrationgraph.cache.time-to-live) | Maximum time that a response can be cached. 
| `0ms` | | [`management.endpoint.integrationgraph.enabled`](#application-properties.actuator.management.endpoint.integrationgraph.enabled) | Whether to enable the integrationgraph endpoint. | `true` | | [`management.endpoint.jolokia.config.*`](#application-properties.actuator.management.endpoint.jolokia.config) | Jolokia settings. Refer to the documentation of Jolokia for more details. | | | [`management.endpoint.jolokia.enabled`](#application-properties.actuator.management.endpoint.jolokia.enabled) | Whether to enable the jolokia endpoint. | `true` | | [`management.endpoint.liquibase.cache.time-to-live`](#application-properties.actuator.management.endpoint.liquibase.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.liquibase.enabled`](#application-properties.actuator.management.endpoint.liquibase.enabled) | Whether to enable the liquibase endpoint. | `true` | | [`management.endpoint.logfile.cache.time-to-live`](#application-properties.actuator.management.endpoint.logfile.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.logfile.enabled`](#application-properties.actuator.management.endpoint.logfile.enabled) | Whether to enable the logfile endpoint. | `true` | | [`management.endpoint.logfile.external-file`](#application-properties.actuator.management.endpoint.logfile.external-file) | External Logfile to be accessed. Can be used if the logfile is written by output redirect and not by the logging system itself. | | | [`management.endpoint.loggers.cache.time-to-live`](#application-properties.actuator.management.endpoint.loggers.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.loggers.enabled`](#application-properties.actuator.management.endpoint.loggers.enabled) | Whether to enable the loggers endpoint. | `true` | | [`management.endpoint.mappings.cache.time-to-live`](#application-properties.actuator.management.endpoint.mappings.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.mappings.enabled`](#application-properties.actuator.management.endpoint.mappings.enabled) | Whether to enable the mappings endpoint. | `true` | | [`management.endpoint.metrics.cache.time-to-live`](#application-properties.actuator.management.endpoint.metrics.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.metrics.enabled`](#application-properties.actuator.management.endpoint.metrics.enabled) | Whether to enable the metrics endpoint. | `true` | | [`management.endpoint.prometheus.enabled`](#application-properties.actuator.management.endpoint.prometheus.enabled) | Whether to enable the prometheus endpoint. | `true` | | [`management.endpoint.quartz.cache.time-to-live`](#application-properties.actuator.management.endpoint.quartz.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.quartz.enabled`](#application-properties.actuator.management.endpoint.quartz.enabled) | Whether to enable the quartz endpoint. | `true` | | [`management.endpoint.scheduledtasks.cache.time-to-live`](#application-properties.actuator.management.endpoint.scheduledtasks.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.scheduledtasks.enabled`](#application-properties.actuator.management.endpoint.scheduledtasks.enabled) | Whether to enable the scheduledtasks endpoint. 
| `true` | | [`management.endpoint.sessions.enabled`](#application-properties.actuator.management.endpoint.sessions.enabled) | Whether to enable the sessions endpoint. | `true` | | [`management.endpoint.shutdown.enabled`](#application-properties.actuator.management.endpoint.shutdown.enabled) | Whether to enable the shutdown endpoint. | `false` | | [`management.endpoint.startup.cache.time-to-live`](#application-properties.actuator.management.endpoint.startup.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.startup.enabled`](#application-properties.actuator.management.endpoint.startup.enabled) | Whether to enable the startup endpoint. | `true` | | [`management.endpoint.threaddump.cache.time-to-live`](#application-properties.actuator.management.endpoint.threaddump.cache.time-to-live) | Maximum time that a response can be cached. | `0ms` | | [`management.endpoint.threaddump.enabled`](#application-properties.actuator.management.endpoint.threaddump.enabled) | Whether to enable the threaddump endpoint. | `true` | | [`management.endpoints.enabled-by-default`](#application-properties.actuator.management.endpoints.enabled-by-default) | Whether to enable or disable all endpoints by default. | | | [`management.endpoints.jmx.domain`](#application-properties.actuator.management.endpoints.jmx.domain) | Endpoints JMX domain name. Fallback to 'spring.jmx.default-domain' if set. | `org.springframework.boot` | | [`management.endpoints.jmx.exposure.exclude`](#application-properties.actuator.management.endpoints.jmx.exposure.exclude) | Endpoint IDs that should be excluded or '\*' for all. | | | [`management.endpoints.jmx.exposure.include`](#application-properties.actuator.management.endpoints.jmx.exposure.include) | Endpoint IDs that should be included or '\*' for all. | `*` | | [`management.endpoints.jmx.static-names`](#application-properties.actuator.management.endpoints.jmx.static-names) | Additional static properties to append to all ObjectNames of MBeans representing Endpoints. | | | [`management.endpoints.migrate-legacy-ids`](#application-properties.actuator.management.endpoints.migrate-legacy-ids) | Whether to transparently migrate legacy endpoint IDs. | `false` | | [`management.endpoints.web.base-path`](#application-properties.actuator.management.endpoints.web.base-path) | Base path for Web endpoints. Relative to the servlet context path (server.servlet.context-path) or WebFlux base path (spring.webflux.base-path) when the management server is sharing the main server port. Relative to the management server base path (management.server.base-path) when a separate management server port (management.server.port) is configured. | `/actuator` | | [`management.endpoints.web.cors.allow-credentials`](#application-properties.actuator.management.endpoints.web.cors.allow-credentials) | Whether credentials are supported. When not set, credentials are not supported. | | | [`management.endpoints.web.cors.allowed-headers`](#application-properties.actuator.management.endpoints.web.cors.allowed-headers) | Comma-separated list of headers to allow in a request. '\*' allows all headers. | | | [`management.endpoints.web.cors.allowed-methods`](#application-properties.actuator.management.endpoints.web.cors.allowed-methods) | Comma-separated list of methods to allow. '\*' allows all methods. When not set, defaults to GET. 
| | | [`management.endpoints.web.cors.allowed-origin-patterns`](#application-properties.actuator.management.endpoints.web.cors.allowed-origin-patterns) | Comma-separated list of origin patterns to allow. Unlike allowed origins which only supports '\*', origin patterns are more flexible (for example 'https://\*.example.com') and can be used when credentials are allowed. When no allowed origin patterns or allowed origins are set, CORS support is disabled. | | | [`management.endpoints.web.cors.allowed-origins`](#application-properties.actuator.management.endpoints.web.cors.allowed-origins) | Comma-separated list of origins to allow. '\*' allows all origins. When credentials are allowed, '\*' cannot be used and origin patterns should be configured instead. When no allowed origins or allowed origin patterns are set, CORS support is disabled. | | | [`management.endpoints.web.cors.exposed-headers`](#application-properties.actuator.management.endpoints.web.cors.exposed-headers) | Comma-separated list of headers to include in a response. | | | [`management.endpoints.web.cors.max-age`](#application-properties.actuator.management.endpoints.web.cors.max-age) | How long the response from a pre-flight request can be cached by clients. If a duration suffix is not specified, seconds will be used. | `1800s` | | [`management.endpoints.web.discovery.enabled`](#application-properties.actuator.management.endpoints.web.discovery.enabled) | Whether the discovery page is enabled. | `true` | | [`management.endpoints.web.exposure.exclude`](#application-properties.actuator.management.endpoints.web.exposure.exclude) | Endpoint IDs that should be excluded or '\*' for all. | | | [`management.endpoints.web.exposure.include`](#application-properties.actuator.management.endpoints.web.exposure.include) | Endpoint IDs that should be included or '\*' for all. | `[health]` | | [`management.endpoints.web.path-mapping.*`](#application-properties.actuator.management.endpoints.web.path-mapping) | Mapping between endpoint IDs and the path that should expose them. | | | [`management.health.cassandra.enabled`](#application-properties.actuator.management.health.cassandra.enabled) | Whether to enable Cassandra health check. | `true` | | [`management.health.couchbase.enabled`](#application-properties.actuator.management.health.couchbase.enabled) | Whether to enable Couchbase health check. | `true` | | [`management.health.db.enabled`](#application-properties.actuator.management.health.db.enabled) | Whether to enable database health check. | `true` | | [`management.health.db.ignore-routing-data-sources`](#application-properties.actuator.management.health.db.ignore-routing-data-sources) | Whether to ignore AbstractRoutingDataSources when creating database health indicators. | `false` | | [`management.health.defaults.enabled`](#application-properties.actuator.management.health.defaults.enabled) | Whether to enable default health indicators. | `true` | | [`management.health.diskspace.enabled`](#application-properties.actuator.management.health.diskspace.enabled) | Whether to enable disk space health check. | `true` | | [`management.health.diskspace.path`](#application-properties.actuator.management.health.diskspace.path) | Path used to compute the available disk space. | | | [`management.health.diskspace.threshold`](#application-properties.actuator.management.health.diskspace.threshold) | Minimum disk space that should be available. 
| `10MB` | | [`management.health.elasticsearch.enabled`](#application-properties.actuator.management.health.elasticsearch.enabled) | Whether to enable Elasticsearch health check. | `true` | | [`management.health.influxdb.enabled`](#application-properties.actuator.management.health.influxdb.enabled) | Whether to enable InfluxDB health check. | `true` | | [`management.health.jms.enabled`](#application-properties.actuator.management.health.jms.enabled) | Whether to enable JMS health check. | `true` | | [`management.health.ldap.enabled`](#application-properties.actuator.management.health.ldap.enabled) | Whether to enable LDAP health check. | `true` | | [`management.health.livenessstate.enabled`](#application-properties.actuator.management.health.livenessstate.enabled) | Whether to enable liveness state health check. | `false` | | [`management.health.mail.enabled`](#application-properties.actuator.management.health.mail.enabled) | Whether to enable Mail health check. | `true` | | [`management.health.mongo.enabled`](#application-properties.actuator.management.health.mongo.enabled) | Whether to enable MongoDB health check. | `true` | | [`management.health.neo4j.enabled`](#application-properties.actuator.management.health.neo4j.enabled) | Whether to enable Neo4j health check. | `true` | | [`management.health.ping.enabled`](#application-properties.actuator.management.health.ping.enabled) | Whether to enable ping health check. | `true` | | [`management.health.rabbit.enabled`](#application-properties.actuator.management.health.rabbit.enabled) | Whether to enable RabbitMQ health check. | `true` | | [`management.health.readinessstate.enabled`](#application-properties.actuator.management.health.readinessstate.enabled) | Whether to enable readiness state health check. | `false` | | [`management.health.redis.enabled`](#application-properties.actuator.management.health.redis.enabled) | Whether to enable Redis health check. | `true` | | [`management.health.solr.enabled`](#application-properties.actuator.management.health.solr.enabled) | Whether to enable Solr health check. | `true` | | [`management.health.status.order`](#application-properties.actuator.management.health.status.order) | | `[DOWN, OUT_OF_SERVICE, UP, UNKNOWN]` | | [`management.info.build.enabled`](#application-properties.actuator.management.info.build.enabled) | Whether to enable build info. | `true` | | [`management.info.defaults.enabled`](#application-properties.actuator.management.info.defaults.enabled) | Whether to enable default info contributors. | `true` | | [`management.info.env.enabled`](#application-properties.actuator.management.info.env.enabled) | Whether to enable environment info. | `false` | | [`management.info.git.enabled`](#application-properties.actuator.management.info.git.enabled) | Whether to enable git info. | `true` | | [`management.info.git.mode`](#application-properties.actuator.management.info.git.mode) | Mode to use to expose git information. | `simple` | | [`management.info.java.enabled`](#application-properties.actuator.management.info.java.enabled) | Whether to enable Java info. | `false` | | [`management.info.os.enabled`](#application-properties.actuator.management.info.os.enabled) | Whether to enable Operating System info. 
| `false` | | [`management.metrics.data.repository.autotime.enabled`](#application-properties.actuator.management.metrics.data.repository.autotime.enabled) | | `true` | | [`management.metrics.data.repository.autotime.percentiles`](#application-properties.actuator.management.metrics.data.repository.autotime.percentiles) | | | | [`management.metrics.data.repository.autotime.percentiles-histogram`](#application-properties.actuator.management.metrics.data.repository.autotime.percentiles-histogram) | | `false` | | [`management.metrics.data.repository.metric-name`](#application-properties.actuator.management.metrics.data.repository.metric-name) | Name of the metric for sent requests. | `spring.data.repository.invocations` | | [`management.metrics.distribution.buffer-length.*`](#application-properties.actuator.management.metrics.distribution.buffer-length) | Number of histograms for meter IDs starting with the specified name to keep in the ring buffer. The longest match wins, the key `all` can also be used to configure all meters. | | | [`management.metrics.distribution.expiry.*`](#application-properties.actuator.management.metrics.distribution.expiry) | Maximum amount of time that samples for meter IDs starting with the specified name are accumulated to decaying distribution statistics before they are reset and rotated. The longest match wins, the key `all` can also be used to configure all meters. | | | [`management.metrics.distribution.maximum-expected-value.*`](#application-properties.actuator.management.metrics.distribution.maximum-expected-value) | Maximum value that meter IDs starting with the specified name are expected to observe. The longest match wins. Values can be specified as a double or as a Duration value (for timer meters, defaulting to ms if no unit specified). | | | [`management.metrics.distribution.minimum-expected-value.*`](#application-properties.actuator.management.metrics.distribution.minimum-expected-value) | Minimum value that meter IDs starting with the specified name are expected to observe. The longest match wins. Values can be specified as a double or as a Duration value (for timer meters, defaulting to ms if no unit specified). | | | [`management.metrics.distribution.percentiles.*`](#application-properties.actuator.management.metrics.distribution.percentiles) | Specific computed non-aggregable percentiles to ship to the backend for meter IDs starting-with the specified name. The longest match wins, the key 'all' can also be used to configure all meters. | | | [`management.metrics.distribution.percentiles-histogram.*`](#application-properties.actuator.management.metrics.distribution.percentiles-histogram) | Whether meter IDs starting with the specified name should publish percentile histograms. For monitoring systems that support aggregable percentile calculation based on a histogram, this can be set to true. For other systems, this has no effect. The longest match wins, the key 'all' can also be used to configure all meters. | | | [`management.metrics.distribution.slo.*`](#application-properties.actuator.management.metrics.distribution.slo) | Specific service-level objective boundaries for meter IDs starting with the specified name. The longest match wins. Counters will be published for each specified boundary. Values can be specified as a double or as a Duration value (for timer meters, defaulting to ms if no unit specified). 
| | | [`management.metrics.enable.*`](#application-properties.actuator.management.metrics.enable) | Whether meter IDs starting with the specified name should be enabled. The longest match wins, the key 'all' can also be used to configure all meters. | | | [`management.metrics.export.appoptics.api-token`](#application-properties.actuator.management.metrics.export.appoptics.api-token) | AppOptics API token. | | | [`management.metrics.export.appoptics.batch-size`](#application-properties.actuator.management.metrics.export.appoptics.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `500` | | [`management.metrics.export.appoptics.connect-timeout`](#application-properties.actuator.management.metrics.export.appoptics.connect-timeout) | Connection timeout for requests to this backend. | `5s` | | [`management.metrics.export.appoptics.enabled`](#application-properties.actuator.management.metrics.export.appoptics.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.appoptics.floor-times`](#application-properties.actuator.management.metrics.export.appoptics.floor-times) | Whether to ship a floored time, useful when sending measurements from multiple hosts to align them on a given time boundary. | `false` | | [`management.metrics.export.appoptics.host-tag`](#application-properties.actuator.management.metrics.export.appoptics.host-tag) | Tag that will be mapped to "@host" when shipping metrics to AppOptics. | `instance` | | [`management.metrics.export.appoptics.read-timeout`](#application-properties.actuator.management.metrics.export.appoptics.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.appoptics.step`](#application-properties.actuator.management.metrics.export.appoptics.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.appoptics.uri`](#application-properties.actuator.management.metrics.export.appoptics.uri) | URI to ship metrics to. | `https://api.appoptics.com/v1/measurements` | | [`management.metrics.export.atlas.batch-size`](#application-properties.actuator.management.metrics.export.atlas.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.atlas.config-refresh-frequency`](#application-properties.actuator.management.metrics.export.atlas.config-refresh-frequency) | Frequency for refreshing config settings from the LWC service. | `10s` | | [`management.metrics.export.atlas.config-time-to-live`](#application-properties.actuator.management.metrics.export.atlas.config-time-to-live) | Time to live for subscriptions from the LWC service. | `150s` | | [`management.metrics.export.atlas.config-uri`](#application-properties.actuator.management.metrics.export.atlas.config-uri) | URI for the Atlas LWC endpoint to retrieve current subscriptions. | `http://localhost:7101/lwc/api/v1/expressions/local-dev` | | [`management.metrics.export.atlas.connect-timeout`](#application-properties.actuator.management.metrics.export.atlas.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.atlas.enabled`](#application-properties.actuator.management.metrics.export.atlas.enabled) | Whether exporting of metrics to this backend is enabled. 
| `true` | | [`management.metrics.export.atlas.eval-uri`](#application-properties.actuator.management.metrics.export.atlas.eval-uri) | URI for the Atlas LWC endpoint to evaluate the data for a subscription. | `http://localhost:7101/lwc/api/v1/evaluate` | | [`management.metrics.export.atlas.lwc-enabled`](#application-properties.actuator.management.metrics.export.atlas.lwc-enabled) | Whether to enable streaming to Atlas LWC. | `false` | | [`management.metrics.export.atlas.meter-time-to-live`](#application-properties.actuator.management.metrics.export.atlas.meter-time-to-live) | Time to live for meters that do not have any activity. After this period the meter will be considered expired and will not get reported. | `15m` | | [`management.metrics.export.atlas.num-threads`](#application-properties.actuator.management.metrics.export.atlas.num-threads) | Number of threads to use with the metrics publishing scheduler. | `4` | | [`management.metrics.export.atlas.read-timeout`](#application-properties.actuator.management.metrics.export.atlas.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.atlas.step`](#application-properties.actuator.management.metrics.export.atlas.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.atlas.uri`](#application-properties.actuator.management.metrics.export.atlas.uri) | URI of the Atlas server. | `http://localhost:7101/api/v1/publish` | | [`management.metrics.export.datadog.api-key`](#application-properties.actuator.management.metrics.export.datadog.api-key) | Datadog API key. | | | [`management.metrics.export.datadog.application-key`](#application-properties.actuator.management.metrics.export.datadog.application-key) | Datadog application key. Not strictly required, but improves the Datadog experience by sending meter descriptions, types, and base units to Datadog. | | | [`management.metrics.export.datadog.batch-size`](#application-properties.actuator.management.metrics.export.datadog.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.datadog.connect-timeout`](#application-properties.actuator.management.metrics.export.datadog.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.datadog.descriptions`](#application-properties.actuator.management.metrics.export.datadog.descriptions) | Whether to publish descriptions metadata to Datadog. Turn this off to minimize the amount of metadata sent. | `true` | | [`management.metrics.export.datadog.enabled`](#application-properties.actuator.management.metrics.export.datadog.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.datadog.host-tag`](#application-properties.actuator.management.metrics.export.datadog.host-tag) | Tag that will be mapped to "host" when shipping metrics to Datadog. | `instance` | | [`management.metrics.export.datadog.read-timeout`](#application-properties.actuator.management.metrics.export.datadog.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.datadog.step`](#application-properties.actuator.management.metrics.export.datadog.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.datadog.uri`](#application-properties.actuator.management.metrics.export.datadog.uri) | URI to ship metrics to. 
Set this if you need to publish metrics to a Datadog site other than US, or to an internal proxy en-route to Datadog. | `https://api.datadoghq.com` | | [`management.metrics.export.defaults.enabled`](#application-properties.actuator.management.metrics.export.defaults.enabled) | Whether to enable default metrics exporters. | `true` | | [`management.metrics.export.dynatrace.api-token`](#application-properties.actuator.management.metrics.export.dynatrace.api-token) | Dynatrace authentication token. | | | [`management.metrics.export.dynatrace.batch-size`](#application-properties.actuator.management.metrics.export.dynatrace.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.dynatrace.connect-timeout`](#application-properties.actuator.management.metrics.export.dynatrace.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.dynatrace.enabled`](#application-properties.actuator.management.metrics.export.dynatrace.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.dynatrace.read-timeout`](#application-properties.actuator.management.metrics.export.dynatrace.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.dynatrace.step`](#application-properties.actuator.management.metrics.export.dynatrace.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.dynatrace.uri`](#application-properties.actuator.management.metrics.export.dynatrace.uri) | URI to ship metrics to. Should be used for SaaS, self-managed instances or to en-route through an internal proxy. | | | [`management.metrics.export.dynatrace.v1.device-id`](#application-properties.actuator.management.metrics.export.dynatrace.v1.device-id) | ID of the custom device that is exporting metrics to Dynatrace. | | | [`management.metrics.export.dynatrace.v1.group`](#application-properties.actuator.management.metrics.export.dynatrace.v1.group) | Group for exported metrics. Used to specify custom device group name in the Dynatrace UI. | | | [`management.metrics.export.dynatrace.v1.technology-type`](#application-properties.actuator.management.metrics.export.dynatrace.v1.technology-type) | Technology type for exported metrics. Used to group metrics under a logical technology name in the Dynatrace UI. | `java` | | [`management.metrics.export.dynatrace.v2.default-dimensions.*`](#application-properties.actuator.management.metrics.export.dynatrace.v2.default-dimensions) | Default dimensions that are added to all metrics in the form of key-value pairs. These are overwritten by Micrometer tags if they use the same key. | | | [`management.metrics.export.dynatrace.v2.enrich-with-dynatrace-metadata`](#application-properties.actuator.management.metrics.export.dynatrace.v2.enrich-with-dynatrace-metadata) | Whether to enable Dynatrace metadata export. | `true` | | [`management.metrics.export.dynatrace.v2.metric-key-prefix`](#application-properties.actuator.management.metrics.export.dynatrace.v2.metric-key-prefix) | Prefix string that is added to all exported metrics. | | | [`management.metrics.export.dynatrace.v2.use-dynatrace-summary-instruments`](#application-properties.actuator.management.metrics.export.dynatrace.v2.use-dynatrace-summary-instruments) | Whether to fall back to the built-in micrometer instruments for Timer and DistributionSummary. 
| `true` | | [`management.metrics.export.elastic.api-key-credentials`](#application-properties.actuator.management.metrics.export.elastic.api-key-credentials) | Base64-encoded credentials string. Mutually exclusive with user-name and password. | | | [`management.metrics.export.elastic.auto-create-index`](#application-properties.actuator.management.metrics.export.elastic.auto-create-index) | Whether to create the index automatically if it does not exist. | `true` | | [`management.metrics.export.elastic.batch-size`](#application-properties.actuator.management.metrics.export.elastic.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.elastic.connect-timeout`](#application-properties.actuator.management.metrics.export.elastic.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.elastic.enabled`](#application-properties.actuator.management.metrics.export.elastic.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.elastic.host`](#application-properties.actuator.management.metrics.export.elastic.host) | Host to export metrics to. | `http://localhost:9200` | | [`management.metrics.export.elastic.index`](#application-properties.actuator.management.metrics.export.elastic.index) | Index to export metrics to. | `micrometer-metrics` | | [`management.metrics.export.elastic.index-date-format`](#application-properties.actuator.management.metrics.export.elastic.index-date-format) | Index date format used for rolling indices. Appended to the index name. | `yyyy-MM` | | [`management.metrics.export.elastic.index-date-separator`](#application-properties.actuator.management.metrics.export.elastic.index-date-separator) | Prefix to separate the index name from the date format used for rolling indices. | `-` | | [`management.metrics.export.elastic.password`](#application-properties.actuator.management.metrics.export.elastic.password) | Login password of the Elastic server. Mutually exclusive with api-key-credentials. | | | [`management.metrics.export.elastic.pipeline`](#application-properties.actuator.management.metrics.export.elastic.pipeline) | Ingest pipeline name. By default, events are not pre-processed. | | | [`management.metrics.export.elastic.read-timeout`](#application-properties.actuator.management.metrics.export.elastic.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.elastic.step`](#application-properties.actuator.management.metrics.export.elastic.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.elastic.timestamp-field-name`](#application-properties.actuator.management.metrics.export.elastic.timestamp-field-name) | Name of the timestamp field. | `@timestamp` | | [`management.metrics.export.elastic.user-name`](#application-properties.actuator.management.metrics.export.elastic.user-name) | Login user of the Elastic server. Mutually exclusive with api-key-credentials. | | | [`management.metrics.export.ganglia.addressing-mode`](#application-properties.actuator.management.metrics.export.ganglia.addressing-mode) | UDP addressing mode, either unicast or multicast. | `multicast` | | [`management.metrics.export.ganglia.duration-units`](#application-properties.actuator.management.metrics.export.ganglia.duration-units) | Base time unit used to report durations. 
| `milliseconds` | | [`management.metrics.export.ganglia.enabled`](#application-properties.actuator.management.metrics.export.ganglia.enabled) | Whether exporting of metrics to Ganglia is enabled. | `true` | | [`management.metrics.export.ganglia.host`](#application-properties.actuator.management.metrics.export.ganglia.host) | Host of the Ganglia server to receive exported metrics. | `localhost` | | [`management.metrics.export.ganglia.port`](#application-properties.actuator.management.metrics.export.ganglia.port) | Port of the Ganglia server to receive exported metrics. | `8649` | | [`management.metrics.export.ganglia.step`](#application-properties.actuator.management.metrics.export.ganglia.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.ganglia.time-to-live`](#application-properties.actuator.management.metrics.export.ganglia.time-to-live) | Time to live for metrics on Ganglia. Set the multi-cast Time-To-Live to be one greater than the number of hops (routers) between the hosts. | `1` | | [`management.metrics.export.graphite.duration-units`](#application-properties.actuator.management.metrics.export.graphite.duration-units) | Base time unit used to report durations. | `milliseconds` | | [`management.metrics.export.graphite.enabled`](#application-properties.actuator.management.metrics.export.graphite.enabled) | Whether exporting of metrics to Graphite is enabled. | `true` | | [`management.metrics.export.graphite.graphite-tags-enabled`](#application-properties.actuator.management.metrics.export.graphite.graphite-tags-enabled) | Whether Graphite tags should be used, as opposed to a hierarchical naming convention. Enabled by default unless "tagsAsPrefix" is set. | | | [`management.metrics.export.graphite.host`](#application-properties.actuator.management.metrics.export.graphite.host) | Host of the Graphite server to receive exported metrics. | `localhost` | | [`management.metrics.export.graphite.port`](#application-properties.actuator.management.metrics.export.graphite.port) | Port of the Graphite server to receive exported metrics. | `2004` | | [`management.metrics.export.graphite.protocol`](#application-properties.actuator.management.metrics.export.graphite.protocol) | Protocol to use while shipping data to Graphite. | `pickled` | | [`management.metrics.export.graphite.rate-units`](#application-properties.actuator.management.metrics.export.graphite.rate-units) | Base time unit used to report rates. | `seconds` | | [`management.metrics.export.graphite.step`](#application-properties.actuator.management.metrics.export.graphite.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.graphite.tags-as-prefix`](#application-properties.actuator.management.metrics.export.graphite.tags-as-prefix) | For the hierarchical naming convention, turn the specified tag keys into part of the metric prefix. Ignored if "graphiteTagsEnabled" is true. | `[]` | | [`management.metrics.export.humio.api-token`](#application-properties.actuator.management.metrics.export.humio.api-token) | Humio API token. | | | [`management.metrics.export.humio.batch-size`](#application-properties.actuator.management.metrics.export.humio.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. 
| `10000` | | [`management.metrics.export.humio.connect-timeout`](#application-properties.actuator.management.metrics.export.humio.connect-timeout) | Connection timeout for requests to this backend. | `5s` | | [`management.metrics.export.humio.enabled`](#application-properties.actuator.management.metrics.export.humio.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.humio.read-timeout`](#application-properties.actuator.management.metrics.export.humio.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.humio.step`](#application-properties.actuator.management.metrics.export.humio.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.humio.tags.*`](#application-properties.actuator.management.metrics.export.humio.tags) | Humio tags describing the data source in which metrics will be stored. Humio tags are a distinct concept from Micrometer's tags. Micrometer's tags are used to divide metrics along dimensional boundaries. | | | [`management.metrics.export.humio.uri`](#application-properties.actuator.management.metrics.export.humio.uri) | URI to ship metrics to. If you need to publish metrics to an internal proxy en-route to Humio, you can define the location of the proxy with this. | `https://cloud.humio.com` | | [`management.metrics.export.influx.api-version`](#application-properties.actuator.management.metrics.export.influx.api-version) | API version of InfluxDB to use. Defaults to 'v1' unless an org is configured. If an org is configured, defaults to 'v2'. | | | [`management.metrics.export.influx.auto-create-db`](#application-properties.actuator.management.metrics.export.influx.auto-create-db) | Whether to create the Influx database if it does not exist before attempting to publish metrics to it. InfluxDB v1 only. | `true` | | [`management.metrics.export.influx.batch-size`](#application-properties.actuator.management.metrics.export.influx.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.influx.bucket`](#application-properties.actuator.management.metrics.export.influx.bucket) | Bucket for metrics. Use either the bucket name or ID. Defaults to the value of the db property if not set. InfluxDB v2 only. | | | [`management.metrics.export.influx.compressed`](#application-properties.actuator.management.metrics.export.influx.compressed) | Whether to enable GZIP compression of metrics batches published to Influx. | `true` | | [`management.metrics.export.influx.connect-timeout`](#application-properties.actuator.management.metrics.export.influx.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.influx.consistency`](#application-properties.actuator.management.metrics.export.influx.consistency) | Write consistency for each point. | `one` | | [`management.metrics.export.influx.db`](#application-properties.actuator.management.metrics.export.influx.db) | Database to send metrics to. InfluxDB v1 only. | `mydb` | | [`management.metrics.export.influx.enabled`](#application-properties.actuator.management.metrics.export.influx.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.influx.org`](#application-properties.actuator.management.metrics.export.influx.org) | Org to write metrics to. InfluxDB v2 only. 
| | | [`management.metrics.export.influx.password`](#application-properties.actuator.management.metrics.export.influx.password) | Login password of the Influx server. InfluxDB v1 only. | | | [`management.metrics.export.influx.read-timeout`](#application-properties.actuator.management.metrics.export.influx.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.influx.retention-duration`](#application-properties.actuator.management.metrics.export.influx.retention-duration) | Time period for which Influx should retain data in the current database. For instance 7d, check the influx documentation for more details on the duration format. InfluxDB v1 only. | | | [`management.metrics.export.influx.retention-policy`](#application-properties.actuator.management.metrics.export.influx.retention-policy) | Retention policy to use (Influx writes to the DEFAULT retention policy if one is not specified). InfluxDB v1 only. | | | [`management.metrics.export.influx.retention-replication-factor`](#application-properties.actuator.management.metrics.export.influx.retention-replication-factor) | How many copies of the data are stored in the cluster. Must be 1 for a single node instance. InfluxDB v1 only. | | | [`management.metrics.export.influx.retention-shard-duration`](#application-properties.actuator.management.metrics.export.influx.retention-shard-duration) | Time range covered by a shard group. For instance 2w, check the influx documentation for more details on the duration format. InfluxDB v1 only. | | | [`management.metrics.export.influx.step`](#application-properties.actuator.management.metrics.export.influx.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.influx.token`](#application-properties.actuator.management.metrics.export.influx.token) | Authentication token to use with calls to the InfluxDB backend. For InfluxDB v1, the Bearer scheme is used. For v2, the Token scheme is used. | | | [`management.metrics.export.influx.uri`](#application-properties.actuator.management.metrics.export.influx.uri) | URI of the Influx server. | `http://localhost:8086` | | [`management.metrics.export.influx.user-name`](#application-properties.actuator.management.metrics.export.influx.user-name) | Login user of the Influx server. InfluxDB v1 only. | | | [`management.metrics.export.jmx.domain`](#application-properties.actuator.management.metrics.export.jmx.domain) | Metrics JMX domain name. | `metrics` | | [`management.metrics.export.jmx.enabled`](#application-properties.actuator.management.metrics.export.jmx.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.jmx.step`](#application-properties.actuator.management.metrics.export.jmx.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.kairos.batch-size`](#application-properties.actuator.management.metrics.export.kairos.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.kairos.connect-timeout`](#application-properties.actuator.management.metrics.export.kairos.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.kairos.enabled`](#application-properties.actuator.management.metrics.export.kairos.enabled) | Whether exporting of metrics to this backend is enabled. 
| `true` | | [`management.metrics.export.kairos.password`](#application-properties.actuator.management.metrics.export.kairos.password) | Login password of the KairosDB server. | | | [`management.metrics.export.kairos.read-timeout`](#application-properties.actuator.management.metrics.export.kairos.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.kairos.step`](#application-properties.actuator.management.metrics.export.kairos.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.kairos.uri`](#application-properties.actuator.management.metrics.export.kairos.uri) | URI of the KairosDB server. | `http://localhost:8080/api/v1/datapoints` | | [`management.metrics.export.kairos.user-name`](#application-properties.actuator.management.metrics.export.kairos.user-name) | Login user of the KairosDB server. | | | [`management.metrics.export.newrelic.account-id`](#application-properties.actuator.management.metrics.export.newrelic.account-id) | New Relic account ID. | | | [`management.metrics.export.newrelic.api-key`](#application-properties.actuator.management.metrics.export.newrelic.api-key) | New Relic API key. | | | [`management.metrics.export.newrelic.batch-size`](#application-properties.actuator.management.metrics.export.newrelic.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.newrelic.client-provider-type`](#application-properties.actuator.management.metrics.export.newrelic.client-provider-type) | Client provider type to use. | | | [`management.metrics.export.newrelic.connect-timeout`](#application-properties.actuator.management.metrics.export.newrelic.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.newrelic.enabled`](#application-properties.actuator.management.metrics.export.newrelic.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.newrelic.event-type`](#application-properties.actuator.management.metrics.export.newrelic.event-type) | The event type that should be published. This property will be ignored if 'meter-name-event-type-enabled' is set to 'true'. | `SpringBootSample` | | [`management.metrics.export.newrelic.meter-name-event-type-enabled`](#application-properties.actuator.management.metrics.export.newrelic.meter-name-event-type-enabled) | Whether to send the meter name as the event type instead of using the 'event-type' configuration property value. Can be set to 'true' if New Relic guidelines are not being followed or event types consistent with previous Spring Boot releases are required. | `false` | | [`management.metrics.export.newrelic.read-timeout`](#application-properties.actuator.management.metrics.export.newrelic.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.newrelic.step`](#application-properties.actuator.management.metrics.export.newrelic.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.newrelic.uri`](#application-properties.actuator.management.metrics.export.newrelic.uri) | URI to ship metrics to. 
| `https://insights-collector.newrelic.com` | | [`management.metrics.export.prometheus.descriptions`](#application-properties.actuator.management.metrics.export.prometheus.descriptions) | Whether to enable publishing descriptions as part of the scrape payload to Prometheus. Turn this off to minimize the amount of data sent on each scrape. | `true` | | [`management.metrics.export.prometheus.enabled`](#application-properties.actuator.management.metrics.export.prometheus.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.prometheus.histogram-flavor`](#application-properties.actuator.management.metrics.export.prometheus.histogram-flavor) | Histogram type for backing DistributionSummary and Timer. | `prometheus` | | [`management.metrics.export.prometheus.pushgateway.base-url`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.base-url) | Base URL for the Pushgateway. | `http://localhost:9091` | | [`management.metrics.export.prometheus.pushgateway.enabled`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.enabled) | Enable publishing via a Prometheus Pushgateway. | `false` | | [`management.metrics.export.prometheus.pushgateway.grouping-key.*`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.grouping-key) | Grouping key for the pushed metrics. | | | [`management.metrics.export.prometheus.pushgateway.job`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.job) | Job identifier for this application instance. | | | [`management.metrics.export.prometheus.pushgateway.password`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.password) | Login password of the Prometheus Pushgateway. | | | [`management.metrics.export.prometheus.pushgateway.push-rate`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.push-rate) | Frequency with which to push metrics. | `1m` | | [`management.metrics.export.prometheus.pushgateway.shutdown-operation`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.shutdown-operation) | Operation that should be performed on shutdown. | `none` | | [`management.metrics.export.prometheus.pushgateway.username`](#application-properties.actuator.management.metrics.export.prometheus.pushgateway.username) | Login user of the Prometheus Pushgateway. | | | [`management.metrics.export.prometheus.step`](#application-properties.actuator.management.metrics.export.prometheus.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.signalfx.access-token`](#application-properties.actuator.management.metrics.export.signalfx.access-token) | SignalFX access token. | | | [`management.metrics.export.signalfx.batch-size`](#application-properties.actuator.management.metrics.export.signalfx.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.signalfx.connect-timeout`](#application-properties.actuator.management.metrics.export.signalfx.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.signalfx.enabled`](#application-properties.actuator.management.metrics.export.signalfx.enabled) | Whether exporting of metrics to this backend is enabled. 
| `true` | | [`management.metrics.export.signalfx.read-timeout`](#application-properties.actuator.management.metrics.export.signalfx.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.signalfx.source`](#application-properties.actuator.management.metrics.export.signalfx.source) | Uniquely identifies the app instance that is publishing metrics to SignalFx. Defaults to the local host name. | | | [`management.metrics.export.signalfx.step`](#application-properties.actuator.management.metrics.export.signalfx.step) | Step size (i.e. reporting frequency) to use. | `10s` | | [`management.metrics.export.signalfx.uri`](#application-properties.actuator.management.metrics.export.signalfx.uri) | URI to ship metrics to. | `https://ingest.signalfx.com` | | [`management.metrics.export.simple.enabled`](#application-properties.actuator.management.metrics.export.simple.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.simple.mode`](#application-properties.actuator.management.metrics.export.simple.mode) | Counting mode. | `cumulative` | | [`management.metrics.export.simple.step`](#application-properties.actuator.management.metrics.export.simple.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.stackdriver.batch-size`](#application-properties.actuator.management.metrics.export.stackdriver.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.stackdriver.connect-timeout`](#application-properties.actuator.management.metrics.export.stackdriver.connect-timeout) | Connection timeout for requests to this backend. | `1s` | | [`management.metrics.export.stackdriver.enabled`](#application-properties.actuator.management.metrics.export.stackdriver.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.stackdriver.project-id`](#application-properties.actuator.management.metrics.export.stackdriver.project-id) | Identifier of the Google Cloud project to monitor. | | | [`management.metrics.export.stackdriver.read-timeout`](#application-properties.actuator.management.metrics.export.stackdriver.read-timeout) | Read timeout for requests to this backend. | `10s` | | [`management.metrics.export.stackdriver.resource-labels.*`](#application-properties.actuator.management.metrics.export.stackdriver.resource-labels) | Monitored resource's labels. | | | [`management.metrics.export.stackdriver.resource-type`](#application-properties.actuator.management.metrics.export.stackdriver.resource-type) | Monitored resource type. | `global` | | [`management.metrics.export.stackdriver.step`](#application-properties.actuator.management.metrics.export.stackdriver.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.stackdriver.use-semantic-metric-types`](#application-properties.actuator.management.metrics.export.stackdriver.use-semantic-metric-types) | Whether to use semantically correct metric types. When false, counter metrics are published as the GAUGE MetricKind. When true, counter metrics are published as the CUMULATIVE MetricKind. | `false` | | [`management.metrics.export.statsd.buffered`](#application-properties.actuator.management.metrics.export.statsd.buffered) | Whether measurements should be buffered before sending to the StatsD server. 
| `true` | | [`management.metrics.export.statsd.enabled`](#application-properties.actuator.management.metrics.export.statsd.enabled) | Whether exporting of metrics to StatsD is enabled. | `true` | | [`management.metrics.export.statsd.flavor`](#application-properties.actuator.management.metrics.export.statsd.flavor) | StatsD line protocol to use. | `datadog` | | [`management.metrics.export.statsd.host`](#application-properties.actuator.management.metrics.export.statsd.host) | Host of the StatsD server to receive exported metrics. | `localhost` | | [`management.metrics.export.statsd.max-packet-length`](#application-properties.actuator.management.metrics.export.statsd.max-packet-length) | Total length of a single payload should be kept within your network's MTU. | `1400` | | [`management.metrics.export.statsd.polling-frequency`](#application-properties.actuator.management.metrics.export.statsd.polling-frequency) | How often gauges will be polled. When a gauge is polled, its value is recalculated and if the value has changed (or publishUnchangedMeters is true), it is sent to the StatsD server. | `10s` | | [`management.metrics.export.statsd.port`](#application-properties.actuator.management.metrics.export.statsd.port) | Port of the StatsD server to receive exported metrics. | `8125` | | [`management.metrics.export.statsd.protocol`](#application-properties.actuator.management.metrics.export.statsd.protocol) | Protocol of the StatsD server to receive exported metrics. | `udp` | | [`management.metrics.export.statsd.publish-unchanged-meters`](#application-properties.actuator.management.metrics.export.statsd.publish-unchanged-meters) | Whether to send unchanged meters to the StatsD server. | `true` | | [`management.metrics.export.statsd.step`](#application-properties.actuator.management.metrics.export.statsd.step) | Step size to use in computing windowed statistics like max. To get the most out of these statistics, align the step interval to be close to your scrape interval. | `1m` | | [`management.metrics.export.wavefront.api-token`](#application-properties.actuator.management.metrics.export.wavefront.api-token) | API token used when publishing metrics directly to the Wavefront API host. | | | [`management.metrics.export.wavefront.batch-size`](#application-properties.actuator.management.metrics.export.wavefront.batch-size) | Number of measurements per request to use for this backend. If more measurements are found, then multiple requests will be made. | `10000` | | [`management.metrics.export.wavefront.enabled`](#application-properties.actuator.management.metrics.export.wavefront.enabled) | Whether exporting of metrics to this backend is enabled. | `true` | | [`management.metrics.export.wavefront.global-prefix`](#application-properties.actuator.management.metrics.export.wavefront.global-prefix) | Global prefix to separate metrics originating from this app's instrumentation from those originating from other Wavefront integrations when viewed in the Wavefront UI. 
| | | [`management.metrics.export.wavefront.sender.flush-interval`](#application-properties.actuator.management.metrics.export.wavefront.sender.flush-interval) | | `1s` | | [`management.metrics.export.wavefront.sender.max-queue-size`](#application-properties.actuator.management.metrics.export.wavefront.sender.max-queue-size) | | `50000` | | [`management.metrics.export.wavefront.sender.message-size`](#application-properties.actuator.management.metrics.export.wavefront.sender.message-size) | | | | [`management.metrics.export.wavefront.source`](#application-properties.actuator.management.metrics.export.wavefront.source) | Unique identifier for the app instance that is the source of metrics being published to Wavefront. Defaults to the local host name. | | | [`management.metrics.export.wavefront.step`](#application-properties.actuator.management.metrics.export.wavefront.step) | Step size (i.e. reporting frequency) to use. | `1m` | | [`management.metrics.export.wavefront.uri`](#application-properties.actuator.management.metrics.export.wavefront.uri) | URI to ship metrics to. | `https://longboard.wavefront.com` | | [`management.metrics.graphql.autotime.enabled`](#application-properties.actuator.management.metrics.graphql.autotime.enabled) | | `true` | | [`management.metrics.graphql.autotime.percentiles`](#application-properties.actuator.management.metrics.graphql.autotime.percentiles) | | | | [`management.metrics.graphql.autotime.percentiles-histogram`](#application-properties.actuator.management.metrics.graphql.autotime.percentiles-histogram) | | `false` | | [`management.metrics.mongo.command.enabled`](#application-properties.actuator.management.metrics.mongo.command.enabled) | Whether to enable Mongo client command metrics. | `true` | | [`management.metrics.mongo.connectionpool.enabled`](#application-properties.actuator.management.metrics.mongo.connectionpool.enabled) | Whether to enable Mongo connection pool metrics. | `true` | | [`management.metrics.system.diskspace.paths`](#application-properties.actuator.management.metrics.system.diskspace.paths) | Comma-separated list of paths to report disk metrics for. | `[.]` | | [`management.metrics.tags.*`](#application-properties.actuator.management.metrics.tags) | Common tags that are applied to every meter. | | | [`management.metrics.use-global-registry`](#application-properties.actuator.management.metrics.use-global-registry) | Whether auto-configured MeterRegistry implementations should be bound to the global static registry on Metrics. For testing, set this to 'false' to maximize test independence. | `true` | | [`management.metrics.web.client.max-uri-tags`](#application-properties.actuator.management.metrics.web.client.max-uri-tags) | Maximum number of unique URI tag values allowed. After the max number of tag values is reached, metrics with additional tag values are denied by filter. | `100` | | [`management.metrics.web.client.request.autotime.enabled`](#application-properties.actuator.management.metrics.web.client.request.autotime.enabled) | Whether to automatically time web client requests. | `true` | | [`management.metrics.web.client.request.autotime.percentiles`](#application-properties.actuator.management.metrics.web.client.request.autotime.percentiles) | Computed non-aggregable percentiles to publish. | | | [`management.metrics.web.client.request.autotime.percentiles-histogram`](#application-properties.actuator.management.metrics.web.client.request.autotime.percentiles-histogram) | Whether percentile histograms should be published. 
| `false` | | [`management.metrics.web.client.request.metric-name`](#application-properties.actuator.management.metrics.web.client.request.metric-name) | Name of the metric for sent requests. | `http.client.requests` | | [`management.metrics.web.server.max-uri-tags`](#application-properties.actuator.management.metrics.web.server.max-uri-tags) | Maximum number of unique URI tag values allowed. After the max number of tag values is reached, metrics with additional tag values are denied by filter. | `100` | | [`management.metrics.web.server.request.autotime.enabled`](#application-properties.actuator.management.metrics.web.server.request.autotime.enabled) | Whether to automatically time web server requests. | `true` | | [`management.metrics.web.server.request.autotime.percentiles`](#application-properties.actuator.management.metrics.web.server.request.autotime.percentiles) | Computed non-aggregable percentiles to publish. | | | [`management.metrics.web.server.request.autotime.percentiles-histogram`](#application-properties.actuator.management.metrics.web.server.request.autotime.percentiles-histogram) | Whether percentile histograms should be published. | `false` | | [`management.metrics.web.server.request.ignore-trailing-slash`](#application-properties.actuator.management.metrics.web.server.request.ignore-trailing-slash) | Whether the trailing slash should be ignored when recording metrics. | `true` | | [`management.metrics.web.server.request.metric-name`](#application-properties.actuator.management.metrics.web.server.request.metric-name) | Name of the metric for received requests. | `http.server.requests` | | [`management.server.add-application-context-header`](#application-properties.actuator.management.server.add-application-context-header) | Add the "X-Application-Context" HTTP header in each response. | `false` | | [`management.server.address`](#application-properties.actuator.management.server.address) | Network address to which the management endpoints should bind. Requires a custom management.server.port. | | | [`management.server.base-path`](#application-properties.actuator.management.server.base-path) | Management endpoint base path (for instance, '/management'). Requires a custom management.server.port. | | | [`management.server.port`](#application-properties.actuator.management.server.port) | Management endpoint HTTP port (uses the same port as the application by default). Configure a different port to use management-specific SSL. | | | [`management.server.ssl.certificate`](#application-properties.actuator.management.server.ssl.certificate) | | | | [`management.server.ssl.certificate-private-key`](#application-properties.actuator.management.server.ssl.certificate-private-key) | | | | [`management.server.ssl.ciphers`](#application-properties.actuator.management.server.ssl.ciphers) | Supported SSL ciphers. | | | [`management.server.ssl.client-auth`](#application-properties.actuator.management.server.ssl.client-auth) | Client authentication mode. Requires a trust store. | | | [`management.server.ssl.enabled`](#application-properties.actuator.management.server.ssl.enabled) | Whether to enable SSL support. | `true` | | [`management.server.ssl.enabled-protocols`](#application-properties.actuator.management.server.ssl.enabled-protocols) | Enabled SSL protocols. | | | [`management.server.ssl.key-alias`](#application-properties.actuator.management.server.ssl.key-alias) | Alias that identifies the key in the key store. 
| | | [`management.server.ssl.key-password`](#application-properties.actuator.management.server.ssl.key-password) | Password used to access the key in the key store. | | | [`management.server.ssl.key-store`](#application-properties.actuator.management.server.ssl.key-store) | Path to the key store that holds the SSL certificate (typically a jks file). | | | [`management.server.ssl.key-store-password`](#application-properties.actuator.management.server.ssl.key-store-password) | Password used to access the key store. | | | [`management.server.ssl.key-store-provider`](#application-properties.actuator.management.server.ssl.key-store-provider) | Provider for the key store. | | | [`management.server.ssl.key-store-type`](#application-properties.actuator.management.server.ssl.key-store-type) | Type of the key store. | | | [`management.server.ssl.protocol`](#application-properties.actuator.management.server.ssl.protocol) | SSL protocol to use. | `TLS` | | [`management.server.ssl.trust-certificate`](#application-properties.actuator.management.server.ssl.trust-certificate) | | | | [`management.server.ssl.trust-certificate-private-key`](#application-properties.actuator.management.server.ssl.trust-certificate-private-key) | | | | [`management.server.ssl.trust-store`](#application-properties.actuator.management.server.ssl.trust-store) | Trust store that holds SSL certificates. | | | [`management.server.ssl.trust-store-password`](#application-properties.actuator.management.server.ssl.trust-store-password) | Password used to access the trust store. | | | [`management.server.ssl.trust-store-provider`](#application-properties.actuator.management.server.ssl.trust-store-provider) | Provider for the trust store. | | | [`management.server.ssl.trust-store-type`](#application-properties.actuator.management.server.ssl.trust-store-type) | Type of the trust store. | | | [`management.trace.http.enabled`](#application-properties.actuator.management.trace.http.enabled) | Whether to enable HTTP request-response tracing. | `true` | | [`management.trace.http.include`](#application-properties.actuator.management.trace.http.include) | Items to be included in the trace. Defaults to request headers (excluding Authorization and Cookie), response headers (excluding Set-Cookie), and time taken. | `[request-headers, response-headers, errors]` | 15. Devtools Properties ------------------------ | Name | Description | Default Value | | --- | --- | --- | | [`spring.devtools.add-properties`](#application-properties.devtools.spring.devtools.add-properties) | Whether to enable development property defaults. | `true` | | [`spring.devtools.livereload.enabled`](#application-properties.devtools.spring.devtools.livereload.enabled) | Whether to enable a livereload.com-compatible server. | `true` | | [`spring.devtools.livereload.port`](#application-properties.devtools.spring.devtools.livereload.port) | Server port. | `35729` | | [`spring.devtools.remote.context-path`](#application-properties.devtools.spring.devtools.remote.context-path) | Context path used to handle the remote connection. | `/.~~spring-boot!~` | | [`spring.devtools.remote.proxy.host`](#application-properties.devtools.spring.devtools.remote.proxy.host) | The host of the proxy to use to connect to the remote application. | | | [`spring.devtools.remote.proxy.port`](#application-properties.devtools.spring.devtools.remote.proxy.port) | The port of the proxy to use to connect to the remote application. 
| | | [`spring.devtools.remote.restart.enabled`](#application-properties.devtools.spring.devtools.remote.restart.enabled) | Whether to enable remote restart. | `true` | | [`spring.devtools.remote.secret`](#application-properties.devtools.spring.devtools.remote.secret) | A shared secret required to establish a connection (required to enable remote support). | | | [`spring.devtools.remote.secret-header-name`](#application-properties.devtools.spring.devtools.remote.secret-header-name) | HTTP header used to transfer the shared secret. | `X-AUTH-TOKEN` | | [`spring.devtools.restart.additional-exclude`](#application-properties.devtools.spring.devtools.restart.additional-exclude) | Additional patterns that should be excluded from triggering a full restart. | | | [`spring.devtools.restart.additional-paths`](#application-properties.devtools.spring.devtools.restart.additional-paths) | Additional paths to watch for changes. | | | [`spring.devtools.restart.enabled`](#application-properties.devtools.spring.devtools.restart.enabled) | Whether to enable automatic restart. | `true` | | [`spring.devtools.restart.exclude`](#application-properties.devtools.spring.devtools.restart.exclude) | Patterns that should be excluded from triggering a full restart. | `META-INF/maven/**,META-INF/resources/**,resources/**,static/**,public/**,templates/**,**/*Test.class,**/*Tests.class,git.properties,META-INF/build-info.properties` | | [`spring.devtools.restart.log-condition-evaluation-delta`](#application-properties.devtools.spring.devtools.restart.log-condition-evaluation-delta) | Whether to log the condition evaluation delta upon restart. | `true` | | [`spring.devtools.restart.poll-interval`](#application-properties.devtools.spring.devtools.restart.poll-interval) | Amount of time to wait between polling for classpath changes. | `1s` | | [`spring.devtools.restart.quiet-period`](#application-properties.devtools.spring.devtools.restart.quiet-period) | Amount of quiet time required without any classpath changes before a restart is triggered. | `400ms` | | [`spring.devtools.restart.trigger-file`](#application-properties.devtools.spring.devtools.restart.trigger-file) | Name of a specific file that, when changed, triggers the restart check. Must be a simple name (without any path) of a file that appears on your classpath. If not specified, any classpath file change triggers the restart. | | 16. Testing Properties ----------------------- | Name | Description | Default Value | | --- | --- | --- | | [`spring.test.database.replace`](#application-properties.testing.spring.test.database.replace) | Type of existing DataSource to replace. | `any` | | [`spring.test.mockmvc.print`](#application-properties.testing.spring.test.mockmvc.print) | MVC Print option. | `default` |
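As a brief illustration of how entries from the tables above are applied, the following sketch sets a few of the listed management and Devtools properties in `application.properties` or `application.yml`; the values shown are arbitrary examples, not recommendations:

Properties

```
management.server.port=8081
spring.devtools.restart.poll-interval=2s
spring.devtools.restart.quiet-period=1s
```

Yaml

```
management:
  server:
    port: 8081
spring:
  devtools:
    restart:
      poll-interval: "2s"
      quiet-period: "1s"
```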
spring_boot IO IO == Most applications will need to deal with input and output concerns at some point. Spring Boot provides utilities and integrations with a range of technologies to help when you need IO capabilities. This section covers standard IO features such as caching and validation as well as more advanced topics such as scheduling and distributed transactions. We will also cover calling remote REST or SOAP services and sending email. 1. Caching ----------- The Spring Framework provides support for transparently adding caching to an application. At its core, the abstraction applies caching to methods, thus reducing the number of executions based on the information available in the cache. The caching logic is applied transparently, without any interference to the invoker. Spring Boot auto-configures the cache infrastructure as long as caching support is enabled by using the `@EnableCaching` annotation. | | | | --- | --- | | | Check the [relevant section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/integration.html#cache) of the Spring Framework reference for more details. | In a nutshell, to add caching to an operation of your service add the relevant annotation to its method, as shown in the following example: Java ``` import org.springframework.cache.annotation.Cacheable; import org.springframework.stereotype.Component; @Component public class MyMathService { @Cacheable("piDecimals") public int computePiDecimal(int precision) { ... } } ``` Kotlin ``` import org.springframework.cache.annotation.Cacheable import org.springframework.stereotype.Component @Component class MyMathService { @Cacheable("piDecimals") fun computePiDecimal(precision: Int): Int { ... } } ``` This example demonstrates the use of caching on a potentially costly operation. Before invoking `computePiDecimal`, the abstraction looks for an entry in the `piDecimals` cache that matches the `i` argument. If an entry is found, the content in the cache is immediately returned to the caller, and the method is not invoked. Otherwise, the method is invoked, and the cache is updated before returning the value. | | | | --- | --- | | | You can also use the standard JSR-107 (JCache) annotations (such as `@CacheResult`) transparently. However, we strongly advise you to not mix and match the Spring Cache and JCache annotations. | If you do not add any specific cache library, Spring Boot auto-configures a [simple provider](#io.caching.provider.simple) that uses concurrent maps in memory. When a cache is required (such as `piDecimals` in the preceding example), this provider creates it for you. The simple provider is not really recommended for production usage, but it is great for getting started and making sure that you understand the features. When you have made up your mind about the cache provider to use, please make sure to read its documentation to figure out how to configure the caches that your application uses. Nearly all providers require you to explicitly configure every cache that you use in the application. Some offer a way to customize the default caches defined by the `spring.cache.cache-names` property. | | | | --- | --- | | | It is also possible to transparently [update](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/integration.html#cache-annotations-put) or [evict](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/integration.html#cache-annotations-evict) data from the cache. | ### 1.1. 
Supported Cache Providers The cache abstraction does not provide an actual store and relies on abstraction materialized by the `org.springframework.cache.Cache` and `org.springframework.cache.CacheManager` interfaces. If you have not defined a bean of type `CacheManager` or a `CacheResolver` named `cacheResolver` (see [`CachingConfigurer`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/cache/annotation/CachingConfigurer.html)), Spring Boot tries to detect the following providers (in the indicated order): 1. [Generic](#io.caching.provider.generic) 2. [JCache (JSR-107)](#io.caching.provider.jcache) (EhCache 3, Hazelcast, Infinispan, and others) 3. [EhCache 2.x](#io.caching.provider.ehcache2) 4. [Hazelcast](#io.caching.provider.hazelcast) 5. [Infinispan](#io.caching.provider.infinispan) 6. [Couchbase](#io.caching.provider.couchbase) 7. [Redis](#io.caching.provider.redis) 8. [Caffeine](#io.caching.provider.caffeine) 9. [Cache2k](#io.caching.provider.cache2k) 10. [Simple](#io.caching.provider.simple) Additionally, [Spring Boot for Apache Geode](https://github.com/spring-projects/spring-boot-data-geode) provides [auto-configuration for using Apache Geode as a cache provider](https://docs.spring.io/spring-boot-data-geode-build/1.7.x/reference/html5/#geode-caching-provider). | | | | --- | --- | | | It is also possible to *force* a particular cache provider by setting the `spring.cache.type` property. Use this property if you need to [disable caching altogether](#io.caching.provider.none) in certain environments (such as tests). | | | | | --- | --- | | | Use the `spring-boot-starter-cache` “Starter” to quickly add basic caching dependencies. The starter brings in `spring-context-support`. If you add dependencies manually, you must include `spring-context-support` in order to use the JCache, EhCache 2.x, or Caffeine support. | If the `CacheManager` is auto-configured by Spring Boot, you can further tune its configuration before it is fully initialized by exposing a bean that implements the `CacheManagerCustomizer` interface. The following example sets a flag to say that `null` values should not be passed down to the underlying map: Java ``` import org.springframework.boot.autoconfigure.cache.CacheManagerCustomizer; import org.springframework.cache.concurrent.ConcurrentMapCacheManager; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyCacheManagerConfiguration { @Bean public CacheManagerCustomizer<ConcurrentMapCacheManager> cacheManagerCustomizer() { return (cacheManager) -> cacheManager.setAllowNullValues(false); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.cache.CacheManagerCustomizer import org.springframework.cache.concurrent.ConcurrentMapCacheManager import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyCacheManagerConfiguration { @Bean fun cacheManagerCustomizer(): CacheManagerCustomizer<ConcurrentMapCacheManager> { return CacheManagerCustomizer { cacheManager -> cacheManager.isAllowNullValues = false } } } ``` | | | | --- | --- | | | In the preceding example, an auto-configured `ConcurrentMapCacheManager` is expected. If that is not the case (either you provided your own config or a different cache provider was auto-configured), the customizer is not invoked at all. 
You can have as many customizers as you want, and you can also order them by using `@Order` or `Ordered`. | #### 1.1.1. Generic Generic caching is used if the context defines *at least* one `org.springframework.cache.Cache` bean. A `CacheManager` wrapping all beans of that type is created. #### 1.1.2. JCache (JSR-107) [JCache](https://jcp.org/en/jsr/detail?id=107) is bootstrapped through the presence of a `javax.cache.spi.CachingProvider` on the classpath (that is, a JSR-107 compliant caching library exists on the classpath), and the `JCacheCacheManager` is provided by the `spring-boot-starter-cache` “Starter”. Various compliant libraries are available, and Spring Boot provides dependency management for Ehcache 3, Hazelcast, and Infinispan. Any other compliant library can be added as well. It might happen that more than one provider is present, in which case the provider must be explicitly specified. Even if the JSR-107 standard does not enforce a standardized way to define the location of the configuration file, Spring Boot does its best to accommodate setting a cache with implementation details, as shown in the following example: Properties ``` # Only necessary if more than one provider is present spring.cache.jcache.provider=com.example.MyCachingProvider spring.cache.jcache.config=classpath:example.xml ``` Yaml ``` # Only necessary if more than one provider is present spring: cache: jcache: provider: "com.example.MyCachingProvider" config: "classpath:example.xml" ``` | | | | --- | --- | | | When a cache library offers both a native implementation and JSR-107 support, Spring Boot prefers the JSR-107 support, so that the same features are available if you switch to a different JSR-107 implementation. | | | | | --- | --- | | | Spring Boot has [general support for Hazelcast](#io.hazelcast). If a single `HazelcastInstance` is available, it is automatically reused for the `CacheManager` as well, unless the `spring.cache.jcache.config` property is specified. | There are two ways to customize the underlying `javax.cache.cacheManager`: * Caches can be created on startup by setting the `spring.cache.cache-names` property. If a custom `javax.cache.configuration.Configuration` bean is defined, it is used to customize them. * `org.springframework.boot.autoconfigure.cache.JCacheManagerCustomizer` beans are invoked with the reference of the `CacheManager` for full customization. | | | | --- | --- | | | If a standard `javax.cache.CacheManager` bean is defined, it is wrapped automatically in an `org.springframework.cache.CacheManager` implementation that the abstraction expects. No further customization is applied to it. | #### 1.1.3. EhCache 2.x [EhCache](https://www.ehcache.org/) 2.x is used if a file named `ehcache.xml` can be found at the root of the classpath. If EhCache 2.x is found, the `EhCacheCacheManager` provided by the `spring-boot-starter-cache` “Starter” is used to bootstrap the cache manager. An alternate configuration file can be provided as well, as shown in the following example: Properties ``` spring.cache.ehcache.config=classpath:config/another-config.xml ``` Yaml ``` spring: cache: ehcache: config: "classpath:config/another-config.xml" ``` #### 1.1.4. Hazelcast Spring Boot has [general support for Hazelcast](#io.hazelcast). If a `HazelcastInstance` has been auto-configured, it is automatically wrapped in a `CacheManager`. #### 1.1.5. Infinispan [Infinispan](https://infinispan.org/) has no default configuration file location, so it must be specified explicitly. 
Otherwise, the default bootstrap is used. Properties ``` spring.cache.infinispan.config=infinispan.xml ``` Yaml ``` spring: cache: infinispan: config: "infinispan.xml" ``` Caches can be created on startup by setting the `spring.cache.cache-names` property. If a custom `ConfigurationBuilder` bean is defined, it is used to customize the caches. | | | | --- | --- | | | The support of Infinispan in Spring Boot is restricted to the embedded mode and is quite basic. If you want more options, you should use the official Infinispan Spring Boot starter instead. See [Infinispan’s documentation](https://github.com/infinispan/infinispan-spring-boot) for more details. | #### 1.1.6. Couchbase If Spring Data Couchbase is available and Couchbase is [configured](data#data.nosql.couchbase), a `CouchbaseCacheManager` is auto-configured. It is possible to create additional caches on startup by setting the `spring.cache.cache-names` property and cache defaults can be configured by using `spring.cache.couchbase.*` properties. For instance, the following configuration creates `cache1` and `cache2` caches with an entry *expiration* of 10 minutes: Properties ``` spring.cache.cache-names=cache1,cache2 spring.cache.couchbase.expiration=10m ``` Yaml ``` spring: cache: cache-names: "cache1,cache2" couchbase: expiration: "10m" ``` If you need more control over the configuration, consider registering a `CouchbaseCacheManagerBuilderCustomizer` bean. The following example shows a customizer that configures a specific entry expiration for `cache1` and `cache2`: Java ``` import java.time.Duration; import org.springframework.boot.autoconfigure.cache.CouchbaseCacheManagerBuilderCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.data.couchbase.cache.CouchbaseCacheConfiguration; @Configuration(proxyBeanMethods = false) public class MyCouchbaseCacheManagerConfiguration { @Bean public CouchbaseCacheManagerBuilderCustomizer myCouchbaseCacheManagerBuilderCustomizer() { return (builder) -> builder .withCacheConfiguration("cache1", CouchbaseCacheConfiguration .defaultCacheConfig().entryExpiry(Duration.ofSeconds(10))) .withCacheConfiguration("cache2", CouchbaseCacheConfiguration .defaultCacheConfig().entryExpiry(Duration.ofMinutes(1))); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.cache.CouchbaseCacheManagerBuilderCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.data.couchbase.cache.CouchbaseCacheConfiguration import java.time.Duration @Configuration(proxyBeanMethods = false) class MyCouchbaseCacheManagerConfiguration { @Bean fun myCouchbaseCacheManagerBuilderCustomizer(): CouchbaseCacheManagerBuilderCustomizer { return CouchbaseCacheManagerBuilderCustomizer { builder -> builder .withCacheConfiguration( "cache1", CouchbaseCacheConfiguration .defaultCacheConfig().entryExpiry(Duration.ofSeconds(10)) ) .withCacheConfiguration( "cache2", CouchbaseCacheConfiguration .defaultCacheConfig().entryExpiry(Duration.ofMinutes(1)) ) } } } ``` #### 1.1.7. Redis If [Redis](https://redis.io/) is available and configured, a `RedisCacheManager` is auto-configured. It is possible to create additional caches on startup by setting the `spring.cache.cache-names` property and cache defaults can be configured by using `spring.cache.redis.*` properties. 
For instance, the following configuration creates `cache1` and `cache2` caches with a *time to live* of 10 minutes: Properties ``` spring.cache.cache-names=cache1,cache2 spring.cache.redis.time-to-live=10m ``` Yaml ``` spring: cache: cache-names: "cache1,cache2" redis: time-to-live: "10m" ``` | | | | --- | --- | | | By default, a key prefix is added so that, if two separate caches use the same key, Redis does not have overlapping keys and cannot return invalid values. We strongly recommend keeping this setting enabled if you create your own `RedisCacheManager`. | | | | | --- | --- | | | You can take full control of the default configuration by adding a `RedisCacheConfiguration` `@Bean` of your own. This can be useful if you need to customize the default serialization strategy. | If you need more control over the configuration, consider registering a `RedisCacheManagerBuilderCustomizer` bean. The following example shows a customizer that configures a specific time to live for `cache1` and `cache2`: Java ``` import java.time.Duration; import org.springframework.boot.autoconfigure.cache.RedisCacheManagerBuilderCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.data.redis.cache.RedisCacheConfiguration; @Configuration(proxyBeanMethods = false) public class MyRedisCacheManagerConfiguration { @Bean public RedisCacheManagerBuilderCustomizer myRedisCacheManagerBuilderCustomizer() { return (builder) -> builder .withCacheConfiguration("cache1", RedisCacheConfiguration .defaultCacheConfig().entryTtl(Duration.ofSeconds(10))) .withCacheConfiguration("cache2", RedisCacheConfiguration .defaultCacheConfig().entryTtl(Duration.ofMinutes(1))); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.cache.RedisCacheManagerBuilderCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.data.redis.cache.RedisCacheConfiguration import java.time.Duration @Configuration(proxyBeanMethods = false) class MyRedisCacheManagerConfiguration { @Bean fun myRedisCacheManagerBuilderCustomizer(): RedisCacheManagerBuilderCustomizer { return RedisCacheManagerBuilderCustomizer { builder -> builder .withCacheConfiguration( "cache1", RedisCacheConfiguration .defaultCacheConfig().entryTtl(Duration.ofSeconds(10)) ) .withCacheConfiguration( "cache2", RedisCacheConfiguration .defaultCacheConfig().entryTtl(Duration.ofMinutes(1)) ) } } } ``` #### 1.1.8. Caffeine [Caffeine](https://github.com/ben-manes/caffeine) is a Java 8 rewrite of Guava’s cache that supersedes support for Guava. If Caffeine is present, a `CaffeineCacheManager` (provided by the `spring-boot-starter-cache` “Starter”) is auto-configured. Caches can be created on startup by setting the `spring.cache.cache-names` property and can be customized by one of the following (in the indicated order): 1. A cache spec defined by `spring.cache.caffeine.spec` 2. A `com.github.benmanes.caffeine.cache.CaffeineSpec` bean is defined 3. 
A `com.github.benmanes.caffeine.cache.Caffeine` bean is defined For instance, the following configuration creates `cache1` and `cache2` caches with a maximum size of 500 and a *time to live* of 10 minutes Properties ``` spring.cache.cache-names=cache1,cache2 spring.cache.caffeine.spec=maximumSize=500,expireAfterAccess=600s ``` Yaml ``` spring: cache: cache-names: "cache1,cache2" caffeine: spec: "maximumSize=500,expireAfterAccess=600s" ``` If a `com.github.benmanes.caffeine.cache.CacheLoader` bean is defined, it is automatically associated to the `CaffeineCacheManager`. Since the `CacheLoader` is going to be associated with *all* caches managed by the cache manager, it must be defined as `CacheLoader<Object, Object>`. The auto-configuration ignores any other generic type. #### 1.1.9. Cache2k [Cache2k](https://cache2k.org/) is an in-memory cache. If the Cache2k spring integration is present, a `SpringCache2kCacheManager` is auto-configured. Caches can be created on startup by setting the `spring.cache.cache-names` property. Cache defaults can be customized using a `Cache2kBuilderCustomizer` bean. The following example shows a customizer that configures the capacity of the cache to 200 entries, with an expiration of 5 minutes: Java ``` import java.util.concurrent.TimeUnit; import org.springframework.boot.autoconfigure.cache.Cache2kBuilderCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyCache2kDefaultsConfiguration { @Bean public Cache2kBuilderCustomizer myCache2kDefaultsCustomizer() { return (builder) -> builder.entryCapacity(200) .expireAfterWrite(5, TimeUnit.MINUTES); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.cache.Cache2kBuilderCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import java.util.concurrent.TimeUnit @Configuration(proxyBeanMethods = false) class MyCache2kDefaultsConfiguration { @Bean fun myCache2kDefaultsCustomizer(): Cache2kBuilderCustomizer { return Cache2kBuilderCustomizer { builder -> builder.entryCapacity(200) .expireAfterWrite(5, TimeUnit.MINUTES) } } } ``` #### 1.1.10. Simple If none of the other providers can be found, a simple implementation using a `ConcurrentHashMap` as the cache store is configured. This is the default if no caching library is present in your application. By default, caches are created as needed, but you can restrict the list of available caches by setting the `cache-names` property. For instance, if you want only `cache1` and `cache2` caches, set the `cache-names` property as follows: Properties ``` spring.cache.cache-names=cache1,cache2 ``` Yaml ``` spring: cache: cache-names: "cache1,cache2" ``` If you do so and your application uses a cache not listed, then it fails at runtime when the cache is needed, but not on startup. This is similar to the way the "real" cache providers behave if you use an undeclared cache. #### 1.1.11. None When `@EnableCaching` is present in your configuration, a suitable cache configuration is expected as well. If you need to disable caching altogether in certain environments, force the cache type to `none` to use a no-op implementation, as shown in the following example: Properties ``` spring.cache.type=none ``` Yaml ``` spring: cache: type: "none" ``` 2. 
Hazelcast ------------- If [Hazelcast](https://hazelcast.com/) is on the classpath and a suitable configuration is found, Spring Boot auto-configures a `HazelcastInstance` that you can inject in your application. Spring Boot first attempts to create a client by checking the following configuration options: * The presence of a `com.hazelcast.client.config.ClientConfig` bean. * A configuration file defined by the `spring.hazelcast.config` property. * The presence of the `hazelcast.client.config` system property. * A `hazelcast-client.xml` in the working directory or at the root of the classpath. * A `hazelcast-client.yaml` in the working directory or at the root of the classpath. | | | | --- | --- | | | Hazelcast 3 support is deprecated. If you still need to downgrade to Hazelcast 3, `hazelcast-client` should be added to the classpath to configure a client. | If a client can not be created, Spring Boot attempts to configure an embedded server. If you define a `com.hazelcast.config.Config` bean, Spring Boot uses that. If your configuration defines an instance name, Spring Boot tries to locate an existing instance rather than creating a new one. You could also specify the Hazelcast configuration file to use through configuration, as shown in the following example: Properties ``` spring.hazelcast.config=classpath:config/my-hazelcast.xml ``` Yaml ``` spring: hazelcast: config: "classpath:config/my-hazelcast.xml" ``` Otherwise, Spring Boot tries to find the Hazelcast configuration from the default locations: `hazelcast.xml` in the working directory or at the root of the classpath, or a `.yaml` counterpart in the same locations. We also check if the `hazelcast.config` system property is set. See the [Hazelcast documentation](https://docs.hazelcast.org/docs/latest/manual/html-single/) for more details. | | | | --- | --- | | | By default, `@SpringAware` on Hazelcast components is supported. The `ManagementContext` can be overridden by declaring a `HazelcastConfigCustomizer` bean with an `@Order` higher than zero. | | | | | --- | --- | | | Spring Boot also has [explicit caching support for Hazelcast](#io.caching.provider.hazelcast). If caching is enabled, the `HazelcastInstance` is automatically wrapped in a `CacheManager` implementation. | 3. Quartz Scheduler -------------------- Spring Boot offers several conveniences for working with the [Quartz scheduler](https://www.quartz-scheduler.org/), including the `spring-boot-starter-quartz` “Starter”. If Quartz is available, a `Scheduler` is auto-configured (through the `SchedulerFactoryBean` abstraction). Beans of the following types are automatically picked up and associated with the `Scheduler`: * `JobDetail`: defines a particular Job. `JobDetail` instances can be built with the `JobBuilder` API. * `Calendar`. * `Trigger`: defines when a particular job is triggered. By default, an in-memory `JobStore` is used. 
However, it is possible to configure a JDBC-based store if a `DataSource` bean is available in your application and if the `spring.quartz.job-store-type` property is configured accordingly, as shown in the following example: Properties ``` spring.quartz.job-store-type=jdbc ``` Yaml ``` spring: quartz: job-store-type: "jdbc" ``` When the JDBC store is used, the schema can be initialized on startup, as shown in the following example: Properties ``` spring.quartz.jdbc.initialize-schema=always ``` Yaml ``` spring: quartz: jdbc: initialize-schema: "always" ``` | | | | --- | --- | | | By default, the database is detected and initialized by using the standard scripts provided with the Quartz library. These scripts drop existing tables, deleting all triggers on every restart. It is also possible to provide a custom script by setting the `spring.quartz.jdbc.schema` property. | To have Quartz use a `DataSource` other than the application’s main `DataSource`, declare a `DataSource` bean, annotating its `@Bean` method with `@QuartzDataSource`. Doing so ensures that the Quartz-specific `DataSource` is used by both the `SchedulerFactoryBean` and for schema initialization. Similarly, to have Quartz use a `TransactionManager` other than the application’s main `TransactionManager` declare a `TransactionManager` bean, annotating its `@Bean` method with `@QuartzTransactionManager`. By default, jobs created by configuration will not overwrite already registered jobs that have been read from a persistent job store. To enable overwriting existing job definitions set the `spring.quartz.overwrite-existing-jobs` property. Quartz Scheduler configuration can be customized using `spring.quartz` properties and `SchedulerFactoryBeanCustomizer` beans, which allow programmatic `SchedulerFactoryBean` customization. Advanced Quartz configuration properties can be customized using `spring.quartz.properties.*`. | | | | --- | --- | | | In particular, an `Executor` bean is not associated with the scheduler as Quartz offers a way to configure the scheduler through `spring.quartz.properties`. If you need to customize the task executor, consider implementing `SchedulerFactoryBeanCustomizer`. | Jobs can define setters to inject data map properties. Regular beans can also be injected in a similar manner, as shown in the following example: Java ``` import org.quartz.JobExecutionContext; import org.quartz.JobExecutionException; import org.springframework.scheduling.quartz.QuartzJobBean; public class MySampleJob extends QuartzJobBean { // fields ... private MyService myService; private String name; // Inject "MyService" bean public void setMyService(MyService myService) { this.myService = myService; } // Inject the "name" job data property public void setName(String name) { this.name = name; } @Override protected void executeInternal(JobExecutionContext context) throws JobExecutionException { this.myService.someMethod(context.getFireTime(), this.name); } } ``` Kotlin ``` import org.quartz.JobExecutionContext import org.springframework.scheduling.quartz.QuartzJobBean class MySampleJob : QuartzJobBean() { // fields ... private var myService: MyService? = null private var name: String? = null // Inject "MyService" bean fun setMyService(myService: MyService?) { this.myService = myService } // Inject the "name" job data property fun setName(name: String?) { this.name = name } override fun executeInternal(context: JobExecutionContext) { myService!!.someMethod(context.fireTime, name) } } ``` 4. 
Sending Email ----------------- The Spring Framework provides an abstraction for sending email by using the `JavaMailSender` interface, and Spring Boot provides auto-configuration for it as well as a starter module. | | | | --- | --- | | | See the [reference documentation](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/integration.html#mail) for a detailed explanation of how you can use `JavaMailSender`. | If `spring.mail.host` and the relevant libraries (as defined by `spring-boot-starter-mail`) are available, a default `JavaMailSender` is created if none exists. The sender can be further customized by configuration items from the `spring.mail` namespace. See [`MailProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/mail/MailProperties.java) for more details. In particular, certain default timeout values are infinite, and you may want to change that to avoid having a thread blocked by an unresponsive mail server, as shown in the following example: Properties ``` spring.mail.properties[mail.smtp.connectiontimeout]=5000 spring.mail.properties[mail.smtp.timeout]=3000 spring.mail.properties[mail.smtp.writetimeout]=5000 ``` Yaml ``` spring: mail: properties: "[mail.smtp.connectiontimeout]": 5000 "[mail.smtp.timeout]": 3000 "[mail.smtp.writetimeout]": 5000 ``` It is also possible to configure a `JavaMailSender` with an existing `Session` from JNDI: Properties ``` spring.mail.jndi-name=mail/Session ``` Yaml ``` spring: mail: jndi-name: "mail/Session" ``` When a `jndi-name` is set, it takes precedence over all other Session-related settings. 5. Validation -------------- The method validation feature supported by Bean Validation 1.1 is automatically enabled as long as a JSR-303 implementation (such as Hibernate validator) is on the classpath. This lets bean methods be annotated with `javax.validation` constraints on their parameters and/or on their return value. Target classes with such annotated methods need to be annotated with the `@Validated` annotation at the type level for their methods to be searched for inline constraint annotations. For instance, the following service triggers the validation of the first argument, making sure its size is between 8 and 10: Java ``` import javax.validation.constraints.Size; import org.springframework.stereotype.Service; import org.springframework.validation.annotation.Validated; @Service @Validated public class MyBean { public Archive findByCodeAndAuthor(@Size(min = 8, max = 10) String code, Author author) { return ... } } ``` Kotlin ``` import org.springframework.stereotype.Service import org.springframework.validation.annotation.Validated import javax.validation.constraints.Size @Service @Validated class MyBean { fun findByCodeAndAuthor(code: @Size(min = 8, max = 10) String?, author: Author?): Archive? { return null } } ``` The application’s `MessageSource` is used when resolving `{parameters}` in constraint messages. This allows you to use [your application’s `messages.properties` files](features#features.internationalization) for Bean Validation messages. Once the parameters have been resolved, message interpolation is completed using Bean Validation’s default interpolator. 6. Calling REST Services ------------------------- If your application calls remote REST services, Spring Boot makes that very convenient using a `RestTemplate` or a `WebClient`. ### 6.1. 
RestTemplate If you need to call remote REST services from your application, you can use the Spring Framework’s [`RestTemplate`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/web/client/RestTemplate.html) class. Since `RestTemplate` instances often need to be customized before being used, Spring Boot does not provide any single auto-configured `RestTemplate` bean. It does, however, auto-configure a `RestTemplateBuilder`, which can be used to create `RestTemplate` instances when needed. The auto-configured `RestTemplateBuilder` ensures that sensible `HttpMessageConverters` are applied to `RestTemplate` instances. The following code shows a typical example: Java ``` import org.springframework.boot.web.client.RestTemplateBuilder; import org.springframework.stereotype.Service; import org.springframework.web.client.RestTemplate; @Service public class MyService { private final RestTemplate restTemplate; public MyService(RestTemplateBuilder restTemplateBuilder) { this.restTemplate = restTemplateBuilder.build(); } public Details someRestCall(String name) { return this.restTemplate.getForObject("/{name}/details", Details.class, name); } } ``` Kotlin ``` import org.springframework.boot.web.client.RestTemplateBuilder import org.springframework.stereotype.Service import org.springframework.web.client.RestTemplate @Service class MyService(restTemplateBuilder: RestTemplateBuilder) { private val restTemplate: RestTemplate init { restTemplate = restTemplateBuilder.build() } fun someRestCall(name: String): Details { return restTemplate.getForObject( "/{name}/details", Details::class.java, name )!! } } ``` | | | | --- | --- | | | `RestTemplateBuilder` includes a number of useful methods that can be used to quickly configure a `RestTemplate`. For example, to add BASIC auth support, you can use `builder.basicAuthentication("user", "password").build()`. | #### 6.1.1. RestTemplate Customization There are three main approaches to `RestTemplate` customization, depending on how broadly you want the customizations to apply. To make the scope of any customizations as narrow as possible, inject the auto-configured `RestTemplateBuilder` and then call its methods as required. Each method call returns a new `RestTemplateBuilder` instance, so the customizations only affect this use of the builder. To make an application-wide, additive customization, use a `RestTemplateCustomizer` bean. All such beans are automatically registered with the auto-configured `RestTemplateBuilder` and are applied to any templates that are built with it. 
The following example shows a customizer that configures the use of a proxy for all hosts except `192.168.0.5`: Java ``` import org.apache.http.HttpException; import org.apache.http.HttpHost; import org.apache.http.HttpRequest; import org.apache.http.client.HttpClient; import org.apache.http.conn.routing.HttpRoutePlanner; import org.apache.http.impl.client.HttpClientBuilder; import org.apache.http.impl.conn.DefaultProxyRoutePlanner; import org.apache.http.protocol.HttpContext; import org.springframework.boot.web.client.RestTemplateCustomizer; import org.springframework.http.client.HttpComponentsClientHttpRequestFactory; import org.springframework.web.client.RestTemplate; public class MyRestTemplateCustomizer implements RestTemplateCustomizer { @Override public void customize(RestTemplate restTemplate) { HttpRoutePlanner routePlanner = new CustomRoutePlanner(new HttpHost("proxy.example.com")); HttpClient httpClient = HttpClientBuilder.create().setRoutePlanner(routePlanner).build(); restTemplate.setRequestFactory(new HttpComponentsClientHttpRequestFactory(httpClient)); } static class CustomRoutePlanner extends DefaultProxyRoutePlanner { CustomRoutePlanner(HttpHost proxy) { super(proxy); } @Override public HttpHost determineProxy(HttpHost target, HttpRequest request, HttpContext context) throws HttpException { if (target.getHostName().equals("192.168.0.5")) { return null; } return super.determineProxy(target, request, context); } } } ``` Kotlin ``` import org.apache.http.HttpException import org.apache.http.HttpHost import org.apache.http.HttpRequest import org.apache.http.client.HttpClient import org.apache.http.conn.routing.HttpRoutePlanner import org.apache.http.impl.client.HttpClientBuilder import org.apache.http.impl.conn.DefaultProxyRoutePlanner import org.apache.http.protocol.HttpContext import org.springframework.boot.web.client.RestTemplateCustomizer import org.springframework.http.client.HttpComponentsClientHttpRequestFactory import org.springframework.web.client.RestTemplate import kotlin.jvm.Throws class MyRestTemplateCustomizer : RestTemplateCustomizer { override fun customize(restTemplate: RestTemplate) { val routePlanner: HttpRoutePlanner = CustomRoutePlanner(HttpHost("proxy.example.com")) val httpClient: HttpClient = HttpClientBuilder.create().setRoutePlanner(routePlanner).build() restTemplate.requestFactory = HttpComponentsClientHttpRequestFactory(httpClient) } internal class CustomRoutePlanner(proxy: HttpHost?) : DefaultProxyRoutePlanner(proxy) { @Throws(HttpException::class) public override fun determineProxy(target: HttpHost, request: HttpRequest, context: HttpContext): HttpHost? { if (target.hostName == "192.168.0.5") { return null } return super.determineProxy(target, request, context) } } } ``` Finally, you can define your own `RestTemplateBuilder` bean. Doing so will replace the auto-configured builder. If you want any `RestTemplateCustomizer` beans to be applied to your custom builder, as the auto-configuration would have done, configure it using a `RestTemplateBuilderConfigurer`. 
The following example exposes a `RestTemplateBuilder` that matches what Spring Boot’s auto-configuration would have done, except that custom connect and read timeouts are also specified: Java ``` import java.time.Duration; import org.springframework.boot.autoconfigure.web.client.RestTemplateBuilderConfigurer; import org.springframework.boot.web.client.RestTemplateBuilder; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyRestTemplateBuilderConfiguration { @Bean public RestTemplateBuilder restTemplateBuilder(RestTemplateBuilderConfigurer configurer) { return configurer.configure(new RestTemplateBuilder()).setConnectTimeout(Duration.ofSeconds(5)) .setReadTimeout(Duration.ofSeconds(2)); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.web.client.RestTemplateBuilderConfigurer import org.springframework.boot.web.client.RestTemplateBuilder import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import java.time.Duration @Configuration(proxyBeanMethods = false) class MyRestTemplateBuilderConfiguration { @Bean fun restTemplateBuilder(configurer: RestTemplateBuilderConfigurer): RestTemplateBuilder { return configurer.configure(RestTemplateBuilder()).setConnectTimeout(Duration.ofSeconds(5)) .setReadTimeout(Duration.ofSeconds(2)) } } ``` The most extreme (and rarely used) option is to create your own `RestTemplateBuilder` bean without using a configurer. In addition to replacing the auto-configured builder, this also prevents any `RestTemplateCustomizer` beans from being used. ### 6.2. WebClient If you have Spring WebFlux on your classpath, you can also choose to use `WebClient` to call remote REST services. Compared to `RestTemplate`, this client has a more functional feel and is fully reactive. You can learn more about the `WebClient` in the dedicated [section in the Spring Framework docs](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html#webflux-client). Spring Boot creates and pre-configures a `WebClient.Builder` for you. It is strongly advised to inject it in your components and use it to create `WebClient` instances. Spring Boot is configuring that builder to share HTTP resources, reflect codecs setup in the same fashion as the server ones (see [WebFlux HTTP codecs auto-configuration](web#web.reactive.webflux.httpcodecs)), and more. 
The following code shows a typical example: Java ``` import org.neo4j.cypherdsl.core.Relationship.Details; import reactor.core.publisher.Mono; import org.springframework.stereotype.Service; import org.springframework.web.reactive.function.client.WebClient; @Service public class MyService { private final WebClient webClient; public MyService(WebClient.Builder webClientBuilder) { this.webClient = webClientBuilder.baseUrl("https://example.org").build(); } public Mono<Details> someRestCall(String name) { return this.webClient.get().uri("/{name}/details", name).retrieve().bodyToMono(Details.class); } } ``` Kotlin ``` import org.neo4j.cypherdsl.core.Relationship import org.springframework.stereotype.Service import org.springframework.web.reactive.function.client.WebClient import reactor.core.publisher.Mono @Service class MyService(webClientBuilder: WebClient.Builder) { private val webClient: WebClient init { webClient = webClientBuilder.baseUrl("https://example.org").build() } fun someRestCall(name: String?): Mono<Relationship.Details> { return webClient.get().uri("/{name}/details", name).retrieve().bodyToMono( Relationship.Details::class.java ) } } ``` #### 6.2.1. WebClient Runtime Spring Boot will auto-detect which `ClientHttpConnector` to use to drive `WebClient`, depending on the libraries available on the application classpath. For now, Reactor Netty and Jetty RS client are supported. The `spring-boot-starter-webflux` starter depends on `io.projectreactor.netty:reactor-netty` by default, which brings both server and client implementations. If you choose to use Jetty as a reactive server instead, you should add a dependency on the Jetty Reactive HTTP client library, `org.eclipse.jetty:jetty-reactive-httpclient`. Using the same technology for server and client has it advantages, as it will automatically share HTTP resources between client and server. Developers can override the resource configuration for Jetty and Reactor Netty by providing a custom `ReactorResourceFactory` or `JettyResourceFactory` bean - this will be applied to both clients and servers. If you wish to override that choice for the client, you can define your own `ClientHttpConnector` bean and have full control over the client configuration. You can learn more about the [`WebClient` configuration options in the Spring Framework reference documentation](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html#webflux-client-builder). #### 6.2.2. WebClient Customization There are three main approaches to `WebClient` customization, depending on how broadly you want the customizations to apply. To make the scope of any customizations as narrow as possible, inject the auto-configured `WebClient.Builder` and then call its methods as required. `WebClient.Builder` instances are stateful: Any change on the builder is reflected in all clients subsequently created with it. If you want to create several clients with the same builder, you can also consider cloning the builder with `WebClient.Builder other = builder.clone();`. To make an application-wide, additive customization to all `WebClient.Builder` instances, you can declare `WebClientCustomizer` beans and change the `WebClient.Builder` locally at the point of injection. Finally, you can fall back to the original API and use `WebClient.create()`. In that case, no auto-configuration or `WebClientCustomizer` is applied. 7. Web Services ---------------- Spring Boot provides Web Services auto-configuration so that all you must do is define your `Endpoints`. 
The [Spring Web Services features](https://docs.spring.io/spring-ws/docs/3.1.3/reference/html/) can be easily accessed with the `spring-boot-starter-webservices` module. `SimpleWsdl11Definition` and `SimpleXsdSchema` beans can be automatically created for your WSDLs and XSDs respectively. To do so, configure their location, as shown in the following example: Properties ``` spring.webservices.wsdl-locations=classpath:/wsdl ``` Yaml ``` spring: webservices: wsdl-locations: "classpath:/wsdl" ``` ### 7.1. Calling Web Services with WebServiceTemplate If you need to call remote Web services from your application, you can use the [`WebServiceTemplate`](https://docs.spring.io/spring-ws/docs/3.1.3/reference/html/#client-web-service-template) class. Since `WebServiceTemplate` instances often need to be customized before being used, Spring Boot does not provide any single auto-configured `WebServiceTemplate` bean. It does, however, auto-configure a `WebServiceTemplateBuilder`, which can be used to create `WebServiceTemplate` instances when needed. The following code shows a typical example: Java ``` import org.springframework.boot.webservices.client.WebServiceTemplateBuilder; import org.springframework.stereotype.Service; import org.springframework.ws.client.core.WebServiceTemplate; import org.springframework.ws.soap.client.core.SoapActionCallback; @Service public class MyService { private final WebServiceTemplate webServiceTemplate; public MyService(WebServiceTemplateBuilder webServiceTemplateBuilder) { this.webServiceTemplate = webServiceTemplateBuilder.build(); } public SomeResponse someWsCall(SomeRequest detailsReq) { return (SomeResponse) this.webServiceTemplate.marshalSendAndReceive(detailsReq, new SoapActionCallback("https://ws.example.com/action")); } } ``` Kotlin ``` import org.springframework.boot.webservices.client.WebServiceTemplateBuilder import org.springframework.stereotype.Service import org.springframework.ws.client.core.WebServiceTemplate import org.springframework.ws.soap.client.core.SoapActionCallback @Service class MyService(webServiceTemplateBuilder: WebServiceTemplateBuilder) { private val webServiceTemplate: WebServiceTemplate init { webServiceTemplate = webServiceTemplateBuilder.build() } fun someWsCall(detailsReq: SomeRequest?): SomeResponse { return webServiceTemplate.marshalSendAndReceive( detailsReq, SoapActionCallback("https://ws.example.com/action") ) as SomeResponse } } ``` By default, `WebServiceTemplateBuilder` detects a suitable HTTP-based `WebServiceMessageSender` using the available HTTP client libraries on the classpath. 
You can also customize read and connection timeouts as follows: Java ``` import java.time.Duration; import org.springframework.boot.webservices.client.HttpWebServiceMessageSenderBuilder; import org.springframework.boot.webservices.client.WebServiceTemplateBuilder; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.ws.client.core.WebServiceTemplate; import org.springframework.ws.transport.WebServiceMessageSender; @Configuration(proxyBeanMethods = false) public class MyWebServiceTemplateConfiguration { @Bean public WebServiceTemplate webServiceTemplate(WebServiceTemplateBuilder builder) { WebServiceMessageSender sender = new HttpWebServiceMessageSenderBuilder() .setConnectTimeout(Duration.ofSeconds(5)) .setReadTimeout(Duration.ofSeconds(2)) .build(); return builder.messageSenders(sender).build(); } } ``` Kotlin ``` import org.springframework.boot.webservices.client.HttpWebServiceMessageSenderBuilder import org.springframework.boot.webservices.client.WebServiceTemplateBuilder import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.ws.client.core.WebServiceTemplate import java.time.Duration @Configuration(proxyBeanMethods = false) class MyWebServiceTemplateConfiguration { @Bean fun webServiceTemplate(builder: WebServiceTemplateBuilder): WebServiceTemplate { val sender = HttpWebServiceMessageSenderBuilder() .setConnectTimeout(Duration.ofSeconds(5)) .setReadTimeout(Duration.ofSeconds(2)) .build() return builder.messageSenders(sender).build() } } ``` 8. Distributed Transactions with JTA ------------------------------------- Spring Boot supports distributed JTA transactions across multiple XA resources by using an [Atomikos](https://www.atomikos.com/) embedded transaction manager. JTA transactions are also supported when deploying to a suitable Java EE Application Server. When a JTA environment is detected, Spring’s `JtaTransactionManager` is used to manage transactions. Auto-configured JMS, DataSource, and JPA beans are upgraded to support XA transactions. You can use standard Spring idioms, such as `@Transactional`, to participate in a distributed transaction. If you are within a JTA environment and still want to use local transactions, you can set the `spring.jta.enabled` property to `false` to disable the JTA auto-configuration. ### 8.1. Using an Atomikos Transaction Manager [Atomikos](https://www.atomikos.com/) is a popular open source transaction manager which can be embedded into your Spring Boot application. You can use the `spring-boot-starter-jta-atomikos` starter to pull in the appropriate Atomikos libraries. Spring Boot auto-configures Atomikos and ensures that appropriate `depends-on` settings are applied to your Spring beans for correct startup and shutdown ordering. By default, Atomikos transaction logs are written to a `transaction-logs` directory in your application’s home directory (the directory in which your application jar file resides). You can customize the location of this directory by setting a `spring.jta.log-dir` property in your `application.properties` file. Properties starting with `spring.jta.atomikos.properties` can also be used to customize the Atomikos `UserTransactionServiceImp`. See the [`AtomikosProperties` Javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/jta/atomikos/AtomikosProperties.html) for complete details. 
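As a hedged illustration of the Atomikos settings described above, the following example sets the transaction-log directory and one Atomikos property; the `max-timeout` key is assumed from `AtomikosProperties` and the values are arbitrary:

Properties

```
spring.jta.log-dir=target/transaction-logs
spring.jta.atomikos.properties.max-timeout=600000
```

Yaml

```
spring:
  jta:
    log-dir: "target/transaction-logs"
    atomikos:
      properties:
        max-timeout: "600000"
```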
| | | | --- | --- | | | To ensure that multiple transaction managers can safely coordinate the same resource managers, each Atomikos instance must be configured with a unique ID. By default, this ID is the IP address of the machine on which Atomikos is running. To ensure uniqueness in production, you should configure the `spring.jta.transaction-manager-id` property with a different value for each instance of your application. | ### 8.2. Using a Java EE Managed Transaction Manager If you package your Spring Boot application as a `war` or `ear` file and deploy it to a Java EE application server, you can use your application server’s built-in transaction manager. Spring Boot tries to auto-configure a transaction manager by looking at common JNDI locations (`java:comp/UserTransaction`, `java:comp/TransactionManager`, and so on). If you use a transaction service provided by your application server, you generally also want to ensure that all resources are managed by the server and exposed over JNDI. Spring Boot tries to auto-configure JMS by looking for a `ConnectionFactory` at the JNDI path (`java:/JmsXA` or `java:/XAConnectionFactory`), and you can use the [`spring.datasource.jndi-name` property](data#data.sql.datasource.jndi) to configure your `DataSource`. ### 8.3. Mixing XA and Non-XA JMS Connections When using JTA, the primary JMS `ConnectionFactory` bean is XA-aware and participates in distributed transactions. You can inject it into your bean without needing to use any `@Qualifier`: Java ``` public MyBean(ConnectionFactory connectionFactory) { // ... } ``` Kotlin In some situations, you might want to process certain JMS messages by using a non-XA `ConnectionFactory`. For example, your JMS processing logic might take longer than the XA timeout. If you want to use a non-XA `ConnectionFactory`, you can inject the `nonXaJmsConnectionFactory` bean: Java ``` public MyBean(@Qualifier("nonXaJmsConnectionFactory") ConnectionFactory connectionFactory) { // ... } ``` Kotlin For consistency, the `jmsConnectionFactory` bean is also provided by using the bean alias `xaJmsConnectionFactory`: Java ``` public MyBean(@Qualifier("xaJmsConnectionFactory") ConnectionFactory connectionFactory) { // ... } ``` Kotlin ### 8.4. Supporting an Alternative Embedded Transaction Manager The [`XAConnectionFactoryWrapper`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/jms/XAConnectionFactoryWrapper.java) and [`XADataSourceWrapper`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/jdbc/XADataSourceWrapper.java) interfaces can be used to support alternative embedded transaction managers. The interfaces are responsible for wrapping `XAConnectionFactory` and `XADataSource` beans and exposing them as regular `ConnectionFactory` and `DataSource` beans, which transparently enroll in the distributed transaction. DataSource and JMS auto-configuration use JTA variants, provided you have a `JtaTransactionManager` bean and appropriate XA wrapper beans registered within your `ApplicationContext`. 
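As a minimal sketch of that contract (not how the Atomikos wrappers are actually implemented), the following hypothetical `XADataSourceWrapper` shows the shape of the interface. A real implementation would return the vendor's pooling `DataSource`, which enlists every connection with the JTA transaction manager; the stub below only delegates and performs no pooling or enlistment:

Java

```java
import java.sql.Connection;
import java.sql.SQLException;

import javax.sql.DataSource;
import javax.sql.XADataSource;

import org.springframework.boot.jdbc.XADataSourceWrapper;
import org.springframework.jdbc.datasource.AbstractDataSource;

// Hypothetical example class; a real wrapper would hand the XADataSource to the
// transaction manager's own pooling DataSource so that connections are enlisted.
public class MyXADataSourceWrapper implements XADataSourceWrapper {

    @Override
    public DataSource wrapDataSource(XADataSource dataSource) throws Exception {
        // Trivial delegate that only illustrates the contract: it exposes the
        // XADataSource as a plain DataSource without pooling or XA enlistment.
        return new AbstractDataSource() {

            @Override
            public Connection getConnection() throws SQLException {
                return dataSource.getXAConnection().getConnection();
            }

            @Override
            public Connection getConnection(String username, String password) throws SQLException {
                return dataSource.getXAConnection(username, password).getConnection();
            }

        };
    }

}
```

Exposing such a wrapper (together with a matching `XAConnectionFactoryWrapper` and a `JtaTransactionManager` bean) lets the DataSource and JMS auto-configuration use the JTA variants described above.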
The [AtomikosXAConnectionFactoryWrapper](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/jta/atomikos/AtomikosXAConnectionFactoryWrapper.java) and [AtomikosXADataSourceWrapper](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/jta/atomikos/AtomikosXADataSourceWrapper.java) provide good examples of how to write XA wrappers. 9. What to Read Next --------------------- You should now have a good understanding of Spring Boot’s [core features](features#features) and the various technologies that Spring Boot provides support for via auto-configuration. The next few sections go into detail about deploying applications to cloud platforms. You can read about [building container images](container-images#container-images) in the next section or skip to the [production-ready features](actuator#actuator) section.
spring_boot Auto-configuration Classes Auto-configuration Classes ========================== This appendix contains details of all of the auto-configuration classes provided by Spring Boot, with links to documentation and source code. Remember to also look at the conditions report in your application for more details of which features are switched on. (To do so, start the app with `--debug` or `-Ddebug` or, in an Actuator application, use the `conditions` endpoint). 1. spring-boot-autoconfigure ----------------------------- The following auto-configuration classes are from the `spring-boot-autoconfigure` module: | Configuration Class | Links | | --- | --- | | [`ActiveMQAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jms/activemq/ActiveMQAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jms/activemq/ActiveMQAutoConfiguration.html) | | [`AopAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/aop/AopAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/aop/AopAutoConfiguration.html) | | [`ApplicationAvailabilityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/availability/ApplicationAvailabilityAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/availability/ApplicationAvailabilityAutoConfiguration.html) | | [`ArtemisAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jms/artemis/ArtemisAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jms/artemis/ArtemisAutoConfiguration.html) | | [`BatchAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/batch/BatchAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/batch/BatchAutoConfiguration.html) | | [`CacheAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/cache/CacheAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/cache/CacheAutoConfiguration.html) | | [`CassandraAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/cassandra/CassandraAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/cassandra/CassandraAutoConfiguration.html) | | [`CassandraDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/cassandra/CassandraDataAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/cassandra/CassandraDataAutoConfiguration.html) | | [`CassandraReactiveDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/cassandra/CassandraReactiveDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/cassandra/CassandraReactiveDataAutoConfiguration.html) | | [`CassandraReactiveRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/cassandra/CassandraReactiveRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/cassandra/CassandraReactiveRepositoriesAutoConfiguration.html) | | [`CassandraRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/cassandra/CassandraRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/cassandra/CassandraRepositoriesAutoConfiguration.html) | | [`ClientHttpConnectorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/function/client/ClientHttpConnectorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/function/client/ClientHttpConnectorAutoConfiguration.html) | | [`CodecsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/http/codec/CodecsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/http/codec/CodecsAutoConfiguration.html) | | [`ConfigurationPropertiesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/context/ConfigurationPropertiesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/context/ConfigurationPropertiesAutoConfiguration.html) | | [`CouchbaseAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/couchbase/CouchbaseAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/couchbase/CouchbaseAutoConfiguration.html) | | [`CouchbaseDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseDataAutoConfiguration.html) | | 
[`CouchbaseReactiveDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseReactiveDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseReactiveDataAutoConfiguration.html) | | [`CouchbaseReactiveRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseReactiveRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseReactiveRepositoriesAutoConfiguration.html) | | [`CouchbaseRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/couchbase/CouchbaseRepositoriesAutoConfiguration.html) | | [`DataSourceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/DataSourceAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jdbc/DataSourceAutoConfiguration.html) | | [`DataSourceTransactionManagerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/DataSourceTransactionManagerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jdbc/DataSourceTransactionManagerAutoConfiguration.html) | | [`DispatcherServletAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/DispatcherServletAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/servlet/DispatcherServletAutoConfiguration.html) | | [`ElasticsearchDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/elasticsearch/ElasticsearchDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/elasticsearch/ElasticsearchDataAutoConfiguration.html) | | [`ElasticsearchRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/elasticsearch/ElasticsearchRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/elasticsearch/ElasticsearchRepositoriesAutoConfiguration.html) | | 
[`ElasticsearchRestClientAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/elasticsearch/ElasticsearchRestClientAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/elasticsearch/ElasticsearchRestClientAutoConfiguration.html) | | [`EmbeddedLdapAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/ldap/embedded/EmbeddedLdapAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/ldap/embedded/EmbeddedLdapAutoConfiguration.html) | | [`EmbeddedMongoAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/mongo/embedded/EmbeddedMongoAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/mongo/embedded/EmbeddedMongoAutoConfiguration.html) | | [`EmbeddedWebServerFactoryCustomizerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/embedded/EmbeddedWebServerFactoryCustomizerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/embedded/EmbeddedWebServerFactoryCustomizerAutoConfiguration.html) | | [`ErrorMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/error/ErrorMvcAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/servlet/error/ErrorMvcAutoConfiguration.html) | | [`ErrorWebFluxAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/error/ErrorWebFluxAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/error/ErrorWebFluxAutoConfiguration.html) | | [`FlywayAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration.html) | | [`FreeMarkerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/freemarker/FreeMarkerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/freemarker/FreeMarkerAutoConfiguration.html) | | [`GraphQlAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/GraphQlAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/GraphQlAutoConfiguration.html) | | [`GraphQlQueryByExampleAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/data/GraphQlQueryByExampleAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/data/GraphQlQueryByExampleAutoConfiguration.html) | | [`GraphQlQuerydslAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/data/GraphQlQuerydslAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/data/GraphQlQuerydslAutoConfiguration.html) | | [`GraphQlRSocketAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/rsocket/GraphQlRSocketAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/rsocket/GraphQlRSocketAutoConfiguration.html) | | [`GraphQlReactiveQueryByExampleAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/data/GraphQlReactiveQueryByExampleAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/data/GraphQlReactiveQueryByExampleAutoConfiguration.html) | | [`GraphQlReactiveQuerydslAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/data/GraphQlReactiveQuerydslAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/data/GraphQlReactiveQuerydslAutoConfiguration.html) | | [`GraphQlWebFluxAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/reactive/GraphQlWebFluxAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/reactive/GraphQlWebFluxAutoConfiguration.html) | | [`GraphQlWebFluxSecurityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/security/GraphQlWebFluxSecurityAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/security/GraphQlWebFluxSecurityAutoConfiguration.html) | | [`GraphQlWebMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/servlet/GraphQlWebMvcAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/servlet/GraphQlWebMvcAutoConfiguration.html) | | 
[`GraphQlWebMvcSecurityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/security/GraphQlWebMvcSecurityAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/security/GraphQlWebMvcSecurityAutoConfiguration.html) | | [`GroovyTemplateAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/groovy/template/GroovyTemplateAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/groovy/template/GroovyTemplateAutoConfiguration.html) | | [`GsonAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/gson/GsonAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/gson/GsonAutoConfiguration.html) | | [`H2ConsoleAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/h2/H2ConsoleAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/h2/H2ConsoleAutoConfiguration.html) | | [`HazelcastAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/hazelcast/HazelcastAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/hazelcast/HazelcastAutoConfiguration.html) | | [`HazelcastJpaDependencyAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/hazelcast/HazelcastJpaDependencyAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/hazelcast/HazelcastJpaDependencyAutoConfiguration.html) | | [`HibernateJpaAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaAutoConfiguration.html) | | [`HttpEncodingAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/HttpEncodingAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/servlet/HttpEncodingAutoConfiguration.html) | | [`HttpHandlerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/HttpHandlerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/HttpHandlerAutoConfiguration.html) | | 
[`HttpMessageConvertersAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/http/HttpMessageConvertersAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/http/HttpMessageConvertersAutoConfiguration.html) | | [`HypermediaAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/hateoas/HypermediaAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/hateoas/HypermediaAutoConfiguration.html) | | [`InfluxDbAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/influx/InfluxDbAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/influx/InfluxDbAutoConfiguration.html) | | [`IntegrationAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/integration/IntegrationAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/integration/IntegrationAutoConfiguration.html) | | [`JacksonAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jackson/JacksonAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jackson/JacksonAutoConfiguration.html) | | [`JdbcRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration.html) | | [`JdbcTemplateAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/JdbcTemplateAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jdbc/JdbcTemplateAutoConfiguration.html) | | [`JerseyAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jersey/JerseyAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jersey/JerseyAutoConfiguration.html) | | [`JmsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jms/JmsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jms/JmsAutoConfiguration.html) | | 
[`JmxAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jmx/JmxAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jmx/JmxAutoConfiguration.html) | | [`JndiConnectionFactoryAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jms/JndiConnectionFactoryAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jms/JndiConnectionFactoryAutoConfiguration.html) | | [`JndiDataSourceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/JndiDataSourceAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jdbc/JndiDataSourceAutoConfiguration.html) | | [`JooqAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jooq/JooqAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jooq/JooqAutoConfiguration.html) | | [`JpaRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/jpa/JpaRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/jpa/JpaRepositoriesAutoConfiguration.html) | | [`JsonbAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jsonb/JsonbAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jsonb/JsonbAutoConfiguration.html) | | [`JtaAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/transaction/jta/JtaAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/transaction/jta/JtaAutoConfiguration.html) | | [`KafkaAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/kafka/KafkaAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/kafka/KafkaAutoConfiguration.html) | | [`LdapAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/ldap/LdapAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/ldap/LdapAutoConfiguration.html) | | 
[`LdapRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/ldap/LdapRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/ldap/LdapRepositoriesAutoConfiguration.html) | | [`LifecycleAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/context/LifecycleAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/context/LifecycleAutoConfiguration.html) | | [`LiquibaseAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/liquibase/LiquibaseAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/liquibase/LiquibaseAutoConfiguration.html) | | [`MailSenderAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/mail/MailSenderAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/mail/MailSenderAutoConfiguration.html) | | [`MailSenderValidatorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/mail/MailSenderValidatorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/mail/MailSenderValidatorAutoConfiguration.html) | | [`MessageSourceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/context/MessageSourceAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/context/MessageSourceAutoConfiguration.html) | | [`MongoAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/mongo/MongoAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/mongo/MongoAutoConfiguration.html) | | [`MongoDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/mongo/MongoDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/mongo/MongoDataAutoConfiguration.html) | | [`MongoReactiveAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/mongo/MongoReactiveAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/mongo/MongoReactiveAutoConfiguration.html) | | 
[`MongoReactiveDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/mongo/MongoReactiveDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/mongo/MongoReactiveDataAutoConfiguration.html) | | [`MongoReactiveRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/mongo/MongoReactiveRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/mongo/MongoReactiveRepositoriesAutoConfiguration.html) | | [`MongoRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/mongo/MongoRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/mongo/MongoRepositoriesAutoConfiguration.html) | | [`MultipartAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/MultipartAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/servlet/MultipartAutoConfiguration.html) | | [`MustacheAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/mustache/MustacheAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/mustache/MustacheAutoConfiguration.html) | | [`Neo4jAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/neo4j/Neo4jAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/neo4j/Neo4jAutoConfiguration.html) | | [`Neo4jDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/neo4j/Neo4jDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/neo4j/Neo4jDataAutoConfiguration.html) | | [`Neo4jReactiveDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/neo4j/Neo4jReactiveDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/neo4j/Neo4jReactiveDataAutoConfiguration.html) | | [`Neo4jReactiveRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/neo4j/Neo4jReactiveRepositoriesAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/neo4j/Neo4jReactiveRepositoriesAutoConfiguration.html) | | [`Neo4jRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/neo4j/Neo4jRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/neo4j/Neo4jRepositoriesAutoConfiguration.html) | | [`NettyAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/netty/NettyAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/netty/NettyAutoConfiguration.html) | | [`OAuth2ClientAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/oauth2/client/servlet/OAuth2ClientAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/oauth2/client/servlet/OAuth2ClientAutoConfiguration.html) | | [`OAuth2ResourceServerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/oauth2/resource/servlet/OAuth2ResourceServerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/oauth2/resource/servlet/OAuth2ResourceServerAutoConfiguration.html) | | [`PersistenceExceptionTranslationAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/dao/PersistenceExceptionTranslationAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/dao/PersistenceExceptionTranslationAutoConfiguration.html) | | [`ProjectInfoAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/info/ProjectInfoAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/info/ProjectInfoAutoConfiguration.html) | | [`PropertyPlaceholderAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/context/PropertyPlaceholderAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/context/PropertyPlaceholderAutoConfiguration.html) | | [`QuartzAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/quartz/QuartzAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/quartz/QuartzAutoConfiguration.html) | | 
[`R2dbcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/r2dbc/R2dbcAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/r2dbc/R2dbcAutoConfiguration.html) | | [`R2dbcDataAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/r2dbc/R2dbcDataAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/r2dbc/R2dbcDataAutoConfiguration.html) | | [`R2dbcRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/r2dbc/R2dbcRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/r2dbc/R2dbcRepositoriesAutoConfiguration.html) | | [`R2dbcTransactionManagerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/r2dbc/R2dbcTransactionManagerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/r2dbc/R2dbcTransactionManagerAutoConfiguration.html) | | [`RSocketGraphQlClientAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/graphql/rsocket/RSocketGraphQlClientAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/graphql/rsocket/RSocketGraphQlClientAutoConfiguration.html) | | [`RSocketMessagingAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/rsocket/RSocketMessagingAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/rsocket/RSocketMessagingAutoConfiguration.html) | | [`RSocketRequesterAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/rsocket/RSocketRequesterAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/rsocket/RSocketRequesterAutoConfiguration.html) | | [`RSocketSecurityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/rsocket/RSocketSecurityAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/rsocket/RSocketSecurityAutoConfiguration.html) | | [`RSocketServerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/rsocket/RSocketServerAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/rsocket/RSocketServerAutoConfiguration.html) | | [`RSocketStrategiesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/rsocket/RSocketStrategiesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/rsocket/RSocketStrategiesAutoConfiguration.html) | | [`RabbitAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/amqp/RabbitAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/amqp/RabbitAutoConfiguration.html) | | [`ReactiveElasticsearchRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/elasticsearch/ReactiveElasticsearchRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/elasticsearch/ReactiveElasticsearchRepositoriesAutoConfiguration.html) | | [`ReactiveElasticsearchRestClientAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/elasticsearch/ReactiveElasticsearchRestClientAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/elasticsearch/ReactiveElasticsearchRestClientAutoConfiguration.html) | | [`ReactiveMultipartAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/ReactiveMultipartAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/ReactiveMultipartAutoConfiguration.html) | | [`ReactiveOAuth2ClientAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/oauth2/client/reactive/ReactiveOAuth2ClientAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/oauth2/client/reactive/ReactiveOAuth2ClientAutoConfiguration.html) | | [`ReactiveOAuth2ResourceServerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/oauth2/resource/reactive/ReactiveOAuth2ResourceServerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/oauth2/resource/reactive/ReactiveOAuth2ResourceServerAutoConfiguration.html) | | [`ReactiveSecurityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/reactive/ReactiveSecurityAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/reactive/ReactiveSecurityAutoConfiguration.html) | | [`ReactiveUserDetailsServiceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/reactive/ReactiveUserDetailsServiceAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/reactive/ReactiveUserDetailsServiceAutoConfiguration.html) | | [`ReactiveWebServerFactoryAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/ReactiveWebServerFactoryAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/ReactiveWebServerFactoryAutoConfiguration.html) | | [`RedisAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/redis/RedisAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/redis/RedisAutoConfiguration.html) | | [`RedisReactiveAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/redis/RedisReactiveAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/redis/RedisReactiveAutoConfiguration.html) | | [`RedisRepositoriesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/redis/RedisRepositoriesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/redis/RedisRepositoriesAutoConfiguration.html) | | [`RepositoryRestMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/rest/RepositoryRestMvcAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/rest/RepositoryRestMvcAutoConfiguration.html) | | [`RestTemplateAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/client/RestTemplateAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/client/RestTemplateAutoConfiguration.html) | | [`Saml2RelyingPartyAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/saml2/Saml2RelyingPartyAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/saml2/Saml2RelyingPartyAutoConfiguration.html) | | 
[`SecurityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/servlet/SecurityAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/servlet/SecurityAutoConfiguration.html) | | [`SecurityFilterAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/servlet/SecurityFilterAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/servlet/SecurityFilterAutoConfiguration.html) | | [`SendGridAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/sendgrid/SendGridAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/sendgrid/SendGridAutoConfiguration.html) | | [`ServletWebServerFactoryAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/ServletWebServerFactoryAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/servlet/ServletWebServerFactoryAutoConfiguration.html) | | [`SessionAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/session/SessionAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/session/SessionAutoConfiguration.html) | | [`SolrAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/solr/SolrAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/solr/SolrAutoConfiguration.html) | | [`SpringApplicationAdminJmxAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/admin/SpringApplicationAdminJmxAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/admin/SpringApplicationAdminJmxAutoConfiguration.html) | | [`SpringDataWebAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/web/SpringDataWebAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/data/web/SpringDataWebAutoConfiguration.html) | | [`SqlInitializationAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/sql/init/SqlInitializationAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/sql/init/SqlInitializationAutoConfiguration.html) | | 
[`TaskExecutionAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/task/TaskExecutionAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/task/TaskExecutionAutoConfiguration.html) | | [`TaskSchedulingAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/task/TaskSchedulingAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/task/TaskSchedulingAutoConfiguration.html) | | [`ThymeleafAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/thymeleaf/ThymeleafAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/thymeleaf/ThymeleafAutoConfiguration.html) | | [`TransactionAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/transaction/TransactionAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/transaction/TransactionAutoConfiguration.html) | | [`UserDetailsServiceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/security/servlet/UserDetailsServiceAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/servlet/UserDetailsServiceAutoConfiguration.html) | | [`ValidationAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/validation/ValidationAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/validation/ValidationAutoConfiguration.html) | | [`WebClientAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/function/client/WebClientAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/function/client/WebClientAutoConfiguration.html) | | [`WebFluxAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/WebFluxAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/WebFluxAutoConfiguration.html) | | [`WebMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/WebMvcAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/servlet/WebMvcAutoConfiguration.html) | | 
[`WebServiceTemplateAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/webservices/client/WebServiceTemplateAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/webservices/client/WebServiceTemplateAutoConfiguration.html) | | [`WebServicesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/webservices/WebServicesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/webservices/WebServicesAutoConfiguration.html) | | [`WebSessionIdResolverAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/reactive/WebSessionIdResolverAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/web/reactive/WebSessionIdResolverAutoConfiguration.html) | | [`WebSocketMessagingAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/websocket/servlet/WebSocketMessagingAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/websocket/servlet/WebSocketMessagingAutoConfiguration.html) | | [`WebSocketReactiveAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/websocket/reactive/WebSocketReactiveAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/websocket/reactive/WebSocketReactiveAutoConfiguration.html) | | [`WebSocketServletAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/websocket/servlet/WebSocketServletAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/websocket/servlet/WebSocketServletAutoConfiguration.html) | | [`XADataSourceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/XADataSourceAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/jdbc/XADataSourceAutoConfiguration.html) |
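For readers who want to act on this list rather than only browse it, the minimal sketch below shows how the table above is typically put to use: excluding one of the listed auto-configuration classes via `@SpringBootApplication(exclude = …)` and enabling the conditions report mentioned in the introduction programmatically instead of via the `--debug` flag. The application class name is hypothetical, and `DataSourceAutoConfiguration` is just an arbitrary pick from the table; setting the `debug` property has the same effect as launching with `--debug`.

```java
import java.util.Collections;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

// Hypothetical application class; DataSourceAutoConfiguration is one example
// taken from the table above and can be replaced by any listed class.
@SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(DemoApplication.class);
        // Same effect as starting the app with --debug: the condition
        // evaluation report is logged at startup, showing which of the
        // auto-configurations listed in this appendix matched and why.
        app.setDefaultProperties(Collections.singletonMap("debug", "true"));
        app.run(args);
    }
}
```

In an Actuator-based application the same information is also available at runtime from the `conditions` endpoint, provided that endpoint is exposed over the web (for example with `management.endpoints.web.exposure.include=conditions`).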
2. spring-boot-actuator-autoconfigure -------------------------------------- The following auto-configuration classes are from the `spring-boot-actuator-autoconfigure` module: | Configuration Class | Links | | --- | --- | | [`AppOpticsMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/appoptics/AppOpticsMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/appoptics/AppOpticsMetricsExportAutoConfiguration.html) | | [`AtlasMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/atlas/AtlasMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/atlas/AtlasMetricsExportAutoConfiguration.html) | | [`AuditAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/audit/AuditAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/audit/AuditAutoConfiguration.html) | | [`AuditEventsEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/audit/AuditEventsEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/audit/AuditEventsEndpointAutoConfiguration.html) | | [`AvailabilityHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/availability/AvailabilityHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/availability/AvailabilityHealthContributorAutoConfiguration.html) | | [`AvailabilityProbesAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/availability/AvailabilityProbesAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/availability/AvailabilityProbesAutoConfiguration.html) | | [`BeansEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/beans/BeansEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/beans/BeansEndpointAutoConfiguration.html) | | [`CacheMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/cache/CacheMetricsAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/cache/CacheMetricsAutoConfiguration.html) | | [`CachesEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/cache/CachesEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/cache/CachesEndpointAutoConfiguration.html) | | [`CassandraHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/cassandra/CassandraHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/cassandra/CassandraHealthContributorAutoConfiguration.html) | | [`CassandraReactiveHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/cassandra/CassandraReactiveHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/cassandra/CassandraReactiveHealthContributorAutoConfiguration.html) | | [`CloudFoundryActuatorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/cloudfoundry/servlet/CloudFoundryActuatorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/cloudfoundry/servlet/CloudFoundryActuatorAutoConfiguration.html) | | [`CompositeMeterRegistryAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/CompositeMeterRegistryAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/CompositeMeterRegistryAutoConfiguration.html) | | [`ConditionsReportEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/condition/ConditionsReportEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/condition/ConditionsReportEndpointAutoConfiguration.html) | | [`ConfigurationPropertiesReportEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/context/properties/ConfigurationPropertiesReportEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/context/properties/ConfigurationPropertiesReportEndpointAutoConfiguration.html) | | 
[`ConnectionFactoryHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/r2dbc/ConnectionFactoryHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/r2dbc/ConnectionFactoryHealthContributorAutoConfiguration.html) | | [`ConnectionPoolMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/r2dbc/ConnectionPoolMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/r2dbc/ConnectionPoolMetricsAutoConfiguration.html) | | [`CouchbaseHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/couchbase/CouchbaseHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/couchbase/CouchbaseHealthContributorAutoConfiguration.html) | | [`CouchbaseReactiveHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/couchbase/CouchbaseReactiveHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/couchbase/CouchbaseReactiveHealthContributorAutoConfiguration.html) | | [`DataSourceHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/jdbc/DataSourceHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/jdbc/DataSourceHealthContributorAutoConfiguration.html) | | [`DataSourcePoolMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/jdbc/DataSourcePoolMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/jdbc/DataSourcePoolMetricsAutoConfiguration.html) | | [`DatadogMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/datadog/DatadogMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/datadog/DatadogMetricsExportAutoConfiguration.html) | | [`DiskSpaceHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/system/DiskSpaceHealthContributorAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/system/DiskSpaceHealthContributorAutoConfiguration.html) | | [`DynatraceMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/dynatrace/DynatraceMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/dynatrace/DynatraceMetricsExportAutoConfiguration.html) | | [`ElasticMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/elastic/ElasticMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/elastic/ElasticMetricsExportAutoConfiguration.html) | | [`ElasticSearchReactiveHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/elasticsearch/ElasticSearchReactiveHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/elasticsearch/ElasticSearchReactiveHealthContributorAutoConfiguration.html) | | [`ElasticSearchRestHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/elasticsearch/ElasticSearchRestHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/elasticsearch/ElasticSearchRestHealthContributorAutoConfiguration.html) | | [`EndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/endpoint/EndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/endpoint/EndpointAutoConfiguration.html) | | [`EnvironmentEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/env/EnvironmentEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/env/EnvironmentEndpointAutoConfiguration.html) | | [`FlywayEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/flyway/FlywayEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/flyway/FlywayEndpointAutoConfiguration.html) | | 
[`GangliaMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/ganglia/GangliaMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/ganglia/GangliaMetricsExportAutoConfiguration.html) | | [`GraphQlMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/graphql/GraphQlMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/graphql/GraphQlMetricsAutoConfiguration.html) | | [`GraphiteMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/graphite/GraphiteMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/graphite/GraphiteMetricsExportAutoConfiguration.html) | | [`HazelcastHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/hazelcast/HazelcastHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/hazelcast/HazelcastHealthContributorAutoConfiguration.html) | | [`HealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/health/HealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/health/HealthContributorAutoConfiguration.html) | | [`HealthEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/health/HealthEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/health/HealthEndpointAutoConfiguration.html) | | [`HeapDumpWebEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/management/HeapDumpWebEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/management/HeapDumpWebEndpointAutoConfiguration.html) | | [`HibernateMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/orm/jpa/HibernateMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/orm/jpa/HibernateMetricsAutoConfiguration.html) | | 
[`HttpClientMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/web/client/HttpClientMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/web/client/HttpClientMetricsAutoConfiguration.html) | | [`HttpTraceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/trace/http/HttpTraceAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/trace/http/HttpTraceAutoConfiguration.html) | | [`HttpTraceEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/trace/http/HttpTraceEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/trace/http/HttpTraceEndpointAutoConfiguration.html) | | [`HumioMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/humio/HumioMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/humio/HumioMetricsExportAutoConfiguration.html) | | [`InfluxDbHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/influx/InfluxDbHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/influx/InfluxDbHealthContributorAutoConfiguration.html) | | [`InfluxMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/influx/InfluxMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/influx/InfluxMetricsExportAutoConfiguration.html) | | [`InfoContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/info/InfoContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/info/InfoContributorAutoConfiguration.html) | | [`InfoEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/info/InfoEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/info/InfoEndpointAutoConfiguration.html) | | 
[`IntegrationGraphEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/integration/IntegrationGraphEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/integration/IntegrationGraphEndpointAutoConfiguration.html) | | [`JerseyServerMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/jersey/JerseyServerMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/jersey/JerseyServerMetricsAutoConfiguration.html) | | [`JettyMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/web/jetty/JettyMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/web/jetty/JettyMetricsAutoConfiguration.html) | | [`JmsHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/jms/JmsHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/jms/JmsHealthContributorAutoConfiguration.html) | | [`JmxEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/endpoint/jmx/JmxEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/endpoint/jmx/JmxEndpointAutoConfiguration.html) | | [`JmxMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/jmx/JmxMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/jmx/JmxMetricsExportAutoConfiguration.html) | | [`JolokiaEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/jolokia/JolokiaEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/jolokia/JolokiaEndpointAutoConfiguration.html) | | [`JvmMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/JvmMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/JvmMetricsAutoConfiguration.html) | | 
[`KafkaMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/KafkaMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/KafkaMetricsAutoConfiguration.html) | | [`KairosMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/kairos/KairosMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/kairos/KairosMetricsExportAutoConfiguration.html) | | [`LdapHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/ldap/LdapHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/ldap/LdapHealthContributorAutoConfiguration.html) | | [`LettuceMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/redis/LettuceMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/redis/LettuceMetricsAutoConfiguration.html) | | [`LiquibaseEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/liquibase/LiquibaseEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/liquibase/LiquibaseEndpointAutoConfiguration.html) | | [`Log4J2MetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/Log4J2MetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/Log4J2MetricsAutoConfiguration.html) | | [`LogFileWebEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/logging/LogFileWebEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/logging/LogFileWebEndpointAutoConfiguration.html) | | [`LogbackMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/LogbackMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/LogbackMetricsAutoConfiguration.html) | | 
[`LoggersEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/logging/LoggersEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/logging/LoggersEndpointAutoConfiguration.html) | | [`MailHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/mail/MailHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/mail/MailHealthContributorAutoConfiguration.html) | | [`ManagementContextAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/web/server/ManagementContextAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/web/server/ManagementContextAutoConfiguration.html) | | [`ManagementWebSecurityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/security/servlet/ManagementWebSecurityAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/security/servlet/ManagementWebSecurityAutoConfiguration.html) | | [`MappingsEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/web/mappings/MappingsEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/web/mappings/MappingsEndpointAutoConfiguration.html) | | [`MetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/MetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/MetricsAutoConfiguration.html) | | [`MetricsEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/MetricsEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/MetricsEndpointAutoConfiguration.html) | | [`MongoHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/mongo/MongoHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/mongo/MongoHealthContributorAutoConfiguration.html) | | 
[`MongoMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/mongo/MongoMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/mongo/MongoMetricsAutoConfiguration.html) | | [`MongoReactiveHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/mongo/MongoReactiveHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/mongo/MongoReactiveHealthContributorAutoConfiguration.html) | | [`Neo4jHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/neo4j/Neo4jHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/neo4j/Neo4jHealthContributorAutoConfiguration.html) | | [`NewRelicMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/newrelic/NewRelicMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/newrelic/NewRelicMetricsExportAutoConfiguration.html) | | [`PrometheusMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/prometheus/PrometheusMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/prometheus/PrometheusMetricsExportAutoConfiguration.html) | | [`QuartzEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/quartz/QuartzEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/quartz/QuartzEndpointAutoConfiguration.html) | | [`RabbitHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/amqp/RabbitHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/amqp/RabbitHealthContributorAutoConfiguration.html) | | [`RabbitMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/amqp/RabbitMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/amqp/RabbitMetricsAutoConfiguration.html) | | 
[`ReactiveCloudFoundryActuatorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/cloudfoundry/reactive/ReactiveCloudFoundryActuatorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/cloudfoundry/reactive/ReactiveCloudFoundryActuatorAutoConfiguration.html) | | [`ReactiveManagementContextAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/web/reactive/ReactiveManagementContextAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/web/reactive/ReactiveManagementContextAutoConfiguration.html) | | [`ReactiveManagementWebSecurityAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/security/reactive/ReactiveManagementWebSecurityAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/security/reactive/ReactiveManagementWebSecurityAutoConfiguration.html) | | [`RedisHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/redis/RedisHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/redis/RedisHealthContributorAutoConfiguration.html) | | [`RedisReactiveHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/redis/RedisReactiveHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/redis/RedisReactiveHealthContributorAutoConfiguration.html) | | [`RepositoryMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/data/RepositoryMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/data/RepositoryMetricsAutoConfiguration.html) | | [`ScheduledTasksEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/scheduling/ScheduledTasksEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/scheduling/ScheduledTasksEndpointAutoConfiguration.html) | | [`ServletManagementContextAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/web/servlet/ServletManagementContextAutoConfiguration.java) | 
[javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/web/servlet/ServletManagementContextAutoConfiguration.html) | | [`SessionsEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/session/SessionsEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/session/SessionsEndpointAutoConfiguration.html) | | [`ShutdownEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/context/ShutdownEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/context/ShutdownEndpointAutoConfiguration.html) | | [`SignalFxMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/signalfx/SignalFxMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/signalfx/SignalFxMetricsExportAutoConfiguration.html) | | [`SimpleMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/simple/SimpleMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/simple/SimpleMetricsExportAutoConfiguration.html) | | [`SolrHealthContributorAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/solr/SolrHealthContributorAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/solr/SolrHealthContributorAutoConfiguration.html) | | [`StackdriverMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/stackdriver/StackdriverMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/stackdriver/StackdriverMetricsExportAutoConfiguration.html) | | [`StartupEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/startup/StartupEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/startup/StartupEndpointAutoConfiguration.html) | | 
[`StartupTimeMetricsListenerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/startup/StartupTimeMetricsListenerAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/startup/StartupTimeMetricsListenerAutoConfiguration.html) | | [`StatsdMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/statsd/StatsdMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/statsd/StatsdMetricsExportAutoConfiguration.html) | | [`SystemMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/SystemMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/SystemMetricsAutoConfiguration.html) | | [`TaskExecutorMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/task/TaskExecutorMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/task/TaskExecutorMetricsAutoConfiguration.html) | | [`ThreadDumpEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/management/ThreadDumpEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/management/ThreadDumpEndpointAutoConfiguration.html) | | [`TomcatMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/web/tomcat/TomcatMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/web/tomcat/TomcatMetricsAutoConfiguration.html) | | [`WavefrontMetricsExportAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/export/wavefront/WavefrontMetricsExportAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/export/wavefront/WavefrontMetricsExportAutoConfiguration.html) | | [`WebEndpointAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/endpoint/web/WebEndpointAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/endpoint/web/WebEndpointAutoConfiguration.html) | | 
[`WebFluxMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/web/reactive/WebFluxMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/web/reactive/WebFluxMetricsAutoConfiguration.html) | | [`WebMvcMetricsAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/metrics/web/servlet/WebMvcMetricsAutoConfiguration.java) | [javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/actuate/autoconfigure/metrics/web/servlet/WebMvcMetricsAutoConfiguration.html) |
spring_boot Web Web === Spring Boot is well suited for web application development. You can create a self-contained HTTP server by using embedded Tomcat, Jetty, Undertow, or Netty. Most web applications use the `spring-boot-starter-web` module to get up and running quickly. You can also choose to build reactive web applications by using the `spring-boot-starter-webflux` module. If you have not yet developed a Spring Boot web application, you can follow the "Hello World!" example in the *[Getting started](getting-started#getting-started.first-application)* section. 1. Servlet Web Applications ---------------------------- If you want to build servlet-based web applications, you can take advantage of Spring Boot’s auto-configuration for Spring MVC or Jersey. ### 1.1. The “Spring Web MVC Framework” The [Spring Web MVC framework](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc) (often referred to as “Spring MVC”) is a rich “model view controller” web framework. Spring MVC lets you create special `@Controller` or `@RestController` beans to handle incoming HTTP requests. Methods in your controller are mapped to HTTP by using `@RequestMapping` annotations. The following code shows a typical `@RestController` that serves JSON data: Java ``` import java.util.List; import org.springframework.web.bind.annotation.DeleteMapping; import org.springframework.web.bind.annotation.GetMapping; import org.springframework.web.bind.annotation.PathVariable; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RestController; @RestController @RequestMapping("/users") public class MyRestController { private final UserRepository userRepository; private final CustomerRepository customerRepository; public MyRestController(UserRepository userRepository, CustomerRepository customerRepository) { this.userRepository = userRepository; this.customerRepository = customerRepository; } @GetMapping("/{userId}") public User getUser(@PathVariable Long userId) { return this.userRepository.findById(userId).get(); } @GetMapping("/{userId}/customers") public List<Customer> getUserCustomers(@PathVariable Long userId) { return this.userRepository.findById(userId).map(this.customerRepository::findByUser).get(); } @DeleteMapping("/{userId}") public void deleteUser(@PathVariable Long userId) { this.userRepository.deleteById(userId); } } ``` Kotlin ``` import org.springframework.web.bind.annotation.DeleteMapping import org.springframework.web.bind.annotation.GetMapping import org.springframework.web.bind.annotation.PathVariable import org.springframework.web.bind.annotation.RequestMapping import org.springframework.web.bind.annotation.RestController @RestController @RequestMapping("/users") class MyRestController(private val userRepository: UserRepository, private val customerRepository: CustomerRepository) { @GetMapping("/{userId}") fun getUser(@PathVariable userId: Long): User { return userRepository.findById(userId).get() } @GetMapping("/{userId}/customers") fun getUserCustomers(@PathVariable userId: Long): List<Customer> { return userRepository.findById(userId).map(customerRepository::findByUser).get() } @DeleteMapping("/{userId}") fun deleteUser(@PathVariable userId: Long) { userRepository.deleteById(userId) } } ``` “WebMvc.fn”, the functional variant, separates the routing configuration from the actual handling of the requests, as shown in the following example: Java ``` import org.springframework.context.annotation.Bean; import 
org.springframework.context.annotation.Configuration; import org.springframework.http.MediaType; import org.springframework.web.servlet.function.RequestPredicate; import org.springframework.web.servlet.function.RouterFunction; import org.springframework.web.servlet.function.ServerResponse; import static org.springframework.web.servlet.function.RequestPredicates.accept; import static org.springframework.web.servlet.function.RouterFunctions.route; @Configuration(proxyBeanMethods = false) public class MyRoutingConfiguration { private static final RequestPredicate ACCEPT\_JSON = accept(MediaType.APPLICATION\_JSON); @Bean public RouterFunction<ServerResponse> routerFunction(MyUserHandler userHandler) { return route() .GET("/{user}", ACCEPT\_JSON, userHandler::getUser) .GET("/{user}/customers", ACCEPT\_JSON, userHandler::getUserCustomers) .DELETE("/{user}", ACCEPT\_JSON, userHandler::deleteUser) .build(); } } ``` Kotlin ``` import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.http.MediaType import org.springframework.web.servlet.function.RequestPredicates.accept import org.springframework.web.servlet.function.RouterFunction import org.springframework.web.servlet.function.RouterFunctions import org.springframework.web.servlet.function.ServerResponse @Configuration(proxyBeanMethods = false) class MyRoutingConfiguration { @Bean fun routerFunction(userHandler: MyUserHandler): RouterFunction<ServerResponse> { return RouterFunctions.route() .GET("/{user}", ACCEPT\_JSON, userHandler::getUser) .GET("/{user}/customers", ACCEPT\_JSON, userHandler::getUserCustomers) .DELETE("/{user}", ACCEPT\_JSON, userHandler::deleteUser) .build() } companion object { private val ACCEPT\_JSON = accept(MediaType.APPLICATION\_JSON) } } ``` Java ``` import org.springframework.stereotype.Component; import org.springframework.web.servlet.function.ServerRequest; import org.springframework.web.servlet.function.ServerResponse; @Component public class MyUserHandler { public ServerResponse getUser(ServerRequest request) { ... return ServerResponse.ok().build(); } public ServerResponse getUserCustomers(ServerRequest request) { ... return ServerResponse.ok().build(); } public ServerResponse deleteUser(ServerRequest request) { ... return ServerResponse.ok().build(); } } ``` Kotlin ``` import org.springframework.stereotype.Component import org.springframework.web.servlet.function.ServerRequest import org.springframework.web.servlet.function.ServerResponse @Component class MyUserHandler { fun getUser(request: ServerRequest?): ServerResponse { return ServerResponse.ok().build() } fun getUserCustomers(request: ServerRequest?): ServerResponse { return ServerResponse.ok().build() } fun deleteUser(request: ServerRequest?): ServerResponse { return ServerResponse.ok().build() } } ``` Spring MVC is part of the core Spring Framework, and detailed information is available in the [reference documentation](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc). There are also several guides that cover Spring MVC available at [spring.io/guides](https://spring.io/guides). | | | | --- | --- | | | You can define as many `RouterFunction` beans as you like to modularize the definition of the router. Beans can be ordered if you need to apply a precedence. | #### 1.1.1. Spring MVC Auto-configuration Spring Boot provides auto-configuration for Spring MVC that works well with most applications. 
The auto-configuration adds the following features on top of Spring’s defaults: * Inclusion of `ContentNegotiatingViewResolver` and `BeanNameViewResolver` beans. * Support for serving static resources, including support for WebJars (covered [later in this document](features#web.servlet.spring-mvc.static-content)). * Automatic registration of `Converter`, `GenericConverter`, and `Formatter` beans. * Support for `HttpMessageConverters` (covered [later in this document](features#web.servlet.spring-mvc.message-converters)). * Automatic registration of `MessageCodesResolver` (covered [later in this document](features#web.servlet.spring-mvc.message-codes)). * Static `index.html` support. * Automatic use of a `ConfigurableWebBindingInitializer` bean (covered [later in this document](features#web.servlet.spring-mvc.binding-initializer)). If you want to keep those Spring Boot MVC customizations and make more [MVC customizations](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc) (interceptors, formatters, view controllers, and other features), you can add your own `@Configuration` class of type `WebMvcConfigurer` but **without** `@EnableWebMvc`. If you want to provide custom instances of `RequestMappingHandlerMapping`, `RequestMappingHandlerAdapter`, or `ExceptionHandlerExceptionResolver`, and still keep the Spring Boot MVC customizations, you can declare a bean of type `WebMvcRegistrations` and use it to provide custom instances of those components. If you want to take complete control of Spring MVC, you can add your own `@Configuration` annotated with `@EnableWebMvc`, or alternatively add your own `@Configuration`-annotated `DelegatingWebMvcConfiguration` as described in the Javadoc of `@EnableWebMvc`. | | | | --- | --- | | | Spring MVC uses a different `ConversionService` to the one used to convert values from your `application.properties` or `application.yaml` file. It means that `Period`, `Duration` and `DataSize` converters are not available and that `@DurationUnit` and `@DataSizeUnit` annotations will be ignored. If you want to customize the `ConversionService` used by Spring MVC, you can provide a `WebMvcConfigurer` bean with an `addFormatters` method. From this method you can register any converter that you like, or you can delegate to the static methods available on `ApplicationConversionService`. | #### 1.1.2. HttpMessageConverters Spring MVC uses the `HttpMessageConverter` interface to convert HTTP requests and responses. Sensible defaults are included out of the box. For example, objects can be automatically converted to JSON (by using the Jackson library) or XML (by using the Jackson XML extension, if available, or by using JAXB if the Jackson XML extension is not available). By default, strings are encoded in `UTF-8`. 
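For example, with those defaults in place a `@RestController` method can return a value directly and let message conversion produce the response body. The following is a minimal, illustrative sketch (the `MyGreetingController` and `Greeting` types are not from the Spring Boot sources): an object is rendered as JSON by the Jackson-based converter, while a plain `String` is written as text using `UTF-8`.

Java

```
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MyGreetingController {

    // Serialized to JSON by the auto-configured Jackson-based HttpMessageConverter.
    @GetMapping("/greeting")
    public Greeting greeting() {
        return new Greeting("Hello, World");
    }

    // Plain strings are written as text, encoded in UTF-8 by default.
    @GetMapping("/greeting-text")
    public String greetingText() {
        return "Hello, World";
    }

    public static class Greeting {

        private final String message;

        Greeting(String message) {
            this.message = message;
        }

        public String getMessage() {
            return this.message;
        }

    }

}
```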
If you need to add or customize converters, you can use Spring Boot’s `HttpMessageConverters` class, as shown in the following listing: Java ``` import org.springframework.boot.autoconfigure.http.HttpMessageConverters; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.http.converter.HttpMessageConverter; @Configuration(proxyBeanMethods = false) public class MyHttpMessageConvertersConfiguration { @Bean public HttpMessageConverters customConverters() { HttpMessageConverter<?> additional = new AdditionalHttpMessageConverter(); HttpMessageConverter<?> another = new AnotherHttpMessageConverter(); return new HttpMessageConverters(additional, another); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.http.HttpMessageConverters import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.http.converter.HttpMessageConverter @Configuration(proxyBeanMethods = false) class MyHttpMessageConvertersConfiguration { @Bean fun customConverters(): HttpMessageConverters { val additional: HttpMessageConverter<\*> = AdditionalHttpMessageConverter() val another: HttpMessageConverter<\*> = AnotherHttpMessageConverter() return HttpMessageConverters(additional, another) } } ``` Any `HttpMessageConverter` bean that is present in the context is added to the list of converters. You can also override default converters in the same way. #### 1.1.3. MessageCodesResolver Spring MVC has a strategy for generating error codes for rendering error messages from binding errors: `MessageCodesResolver`. If you set the `spring.mvc.message-codes-resolver-format` property `PREFIX_ERROR_CODE` or `POSTFIX_ERROR_CODE`, Spring Boot creates one for you (see the enumeration in [`DefaultMessageCodesResolver.Format`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/validation/DefaultMessageCodesResolver.Format.html)). #### 1.1.4. Static Content By default, Spring Boot serves static content from a directory called `/static` (or `/public` or `/resources` or `/META-INF/resources`) in the classpath or from the root of the `ServletContext`. It uses the `ResourceHttpRequestHandler` from Spring MVC so that you can modify that behavior by adding your own `WebMvcConfigurer` and overriding the `addResourceHandlers` method. In a stand-alone web application, the default servlet from the container is also enabled and acts as a fallback, serving content from the root of the `ServletContext` if Spring decides not to handle it. Most of the time, this does not happen (unless you modify the default MVC configuration), because Spring can always handle requests through the `DispatcherServlet`. By default, resources are mapped on `/**`, but you can tune that with the `spring.mvc.static-path-pattern` property. For instance, relocating all resources to `/resources/**` can be achieved as follows: Properties ``` spring.mvc.static-path-pattern=/resources/** ``` Yaml ``` spring: mvc: static-path-pattern: "/resources/**" ``` You can also customize the static resource locations by using the `spring.web.resources.static-locations` property (replacing the default values with a list of directory locations). The root servlet context path, `"/"`, is automatically added as a location as well. In addition to the “standard” static resource locations mentioned earlier, a special case is made for [Webjars content](https://www.webjars.org/). 
Any resources with a path in `/webjars/**` are served from jar files if they are packaged in the Webjars format. | | | | --- | --- | | | Do not use the `src/main/webapp` directory if your application is packaged as a jar. Although this directory is a common standard, it works **only** with war packaging, and it is silently ignored by most build tools if you generate a jar. | Spring Boot also supports the advanced resource handling features provided by Spring MVC, allowing use cases such as cache-busting static resources or using version agnostic URLs for Webjars. To use version agnostic URLs for Webjars, add the `webjars-locator-core` dependency. Then declare your Webjar. Using jQuery as an example, adding `"/webjars/jquery/jquery.min.js"` results in `"/webjars/jquery/x.y.z/jquery.min.js"` where `x.y.z` is the Webjar version. | | | | --- | --- | | | If you use JBoss, you need to declare the `webjars-locator-jboss-vfs` dependency instead of the `webjars-locator-core`. Otherwise, all Webjars resolve as a `404`. | To use cache busting, the following configuration configures a cache busting solution for all static resources, effectively adding a content hash, such as `<link href="/css/spring-2a2d595e6ed9a0b24f027f2b63b134d6.css"/>`, in URLs: Properties ``` spring.web.resources.chain.strategy.content.enabled=true spring.web.resources.chain.strategy.content.paths=/** ``` Yaml ``` spring: web: resources: chain: strategy: content: enabled: true paths: "/**" ``` | | | | --- | --- | | | Links to resources are rewritten in templates at runtime, thanks to a `ResourceUrlEncodingFilter` that is auto-configured for Thymeleaf and FreeMarker. You should manually declare this filter when using JSPs. Other template engines are currently not automatically supported but can be with custom template macros/helpers and the use of the [`ResourceUrlProvider`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/web/servlet/resource/ResourceUrlProvider.html). | When loading resources dynamically with, for example, a JavaScript module loader, renaming files is not an option. That is why other strategies are also supported and can be combined. A "fixed" strategy adds a static version string in the URL without changing the file name, as shown in the following example: Properties ``` spring.web.resources.chain.strategy.content.enabled=true spring.web.resources.chain.strategy.content.paths=/** spring.web.resources.chain.strategy.fixed.enabled=true spring.web.resources.chain.strategy.fixed.paths=/js/lib/ spring.web.resources.chain.strategy.fixed.version=v12 ``` Yaml ``` spring: web: resources: chain: strategy: content: enabled: true paths: "/**" fixed: enabled: true paths: "/js/lib/" version: "v12" ``` With this configuration, JavaScript modules located under `"/js/lib/"` use a fixed versioning strategy (`"/v12/js/lib/mymodule.js"`), while other resources still use the content one (`<link href="/css/spring-2a2d595e6ed9a0b24f027f2b63b134d6.css"/>`). See [`WebProperties.Resources`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/WebProperties.java) for more supported options. 
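The same resource handling can also be tuned in code, as mentioned earlier, by providing a `WebMvcConfigurer` and overriding its `addResourceHandlers` method. The following minimal sketch assumes a hypothetical `/assets/**` mapping backed by `classpath:/assets/` and applies a content-based version strategy to it, roughly mirroring the property-based configuration shown above:

Java

```
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
import org.springframework.web.servlet.resource.VersionResourceResolver;

@Configuration(proxyBeanMethods = false)
public class MyResourceHandlerConfiguration implements WebMvcConfigurer {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        // Serve an extra (hypothetical) classpath location and add a content-hash
        // version to its URLs, similar to the property-based configuration above.
        registry.addResourceHandler("/assets/**")
                .addResourceLocations("classpath:/assets/")
                .resourceChain(true)
                .addResolver(new VersionResourceResolver().addContentVersionStrategy("/**"));
    }

}
```

Handlers registered this way are added alongside the static resource handling that Spring Boot configures by default.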
| | | | --- | --- | | | This feature has been thoroughly described in a dedicated [blog post](https://spring.io/blog/2014/07/24/spring-framework-4-1-handling-static-web-resources) and in Spring Framework’s [reference documentation](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-config-static-resources). | #### 1.1.5. Welcome Page Spring Boot supports both static and templated welcome pages. It first looks for an `index.html` file in the configured static content locations. If one is not found, it then looks for an `index` template. If either is found, it is automatically used as the welcome page of the application. #### 1.1.6. Path Matching and Content Negotiation Spring MVC can map incoming HTTP requests to handlers by looking at the request path and matching it to the mappings defined in your application (for example, `@GetMapping` annotations on Controller methods). Spring Boot chooses to disable suffix pattern matching by default, which means that requests like `"GET /projects/spring-boot.json"` will not be matched to `@GetMapping("/projects/spring-boot")` mappings. This is considered as a [best practice for Spring MVC applications](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-ann-requestmapping-suffix-pattern-match). This feature was mainly useful in the past for HTTP clients which did not send proper "Accept" request headers; we needed to make sure to send the correct Content Type to the client. Nowadays, Content Negotiation is much more reliable. There are other ways to deal with HTTP clients that do not consistently send proper "Accept" request headers. Instead of using suffix matching, we can use a query parameter to ensure that requests like `"GET /projects/spring-boot?format=json"` will be mapped to `@GetMapping("/projects/spring-boot")`: Properties ``` spring.mvc.contentnegotiation.favor-parameter=true ``` Yaml ``` spring: mvc: contentnegotiation: favor-parameter: true ``` Or if you prefer to use a different parameter name: Properties ``` spring.mvc.contentnegotiation.favor-parameter=true spring.mvc.contentnegotiation.parameter-name=myparam ``` Yaml ``` spring: mvc: contentnegotiation: favor-parameter: true parameter-name: "myparam" ``` Most standard media types are supported out-of-the-box, but you can also define new ones: Properties ``` spring.mvc.contentnegotiation.media-types.markdown=text/markdown ``` Yaml ``` spring: mvc: contentnegotiation: media-types: markdown: "text/markdown" ``` Suffix pattern matching is deprecated and will be removed in a future release. If you understand the caveats and would still like your application to use suffix pattern matching, the following configuration is required: Properties ``` spring.mvc.contentnegotiation.favor-path-extension=true spring.mvc.pathmatch.use-suffix-pattern=true ``` Yaml ``` spring: mvc: contentnegotiation: favor-path-extension: true pathmatch: use-suffix-pattern: true ``` Alternatively, rather than open all suffix patterns, it is more secure to only support registered suffix patterns: Properties ``` spring.mvc.contentnegotiation.favor-path-extension=true spring.mvc.pathmatch.use-registered-suffix-pattern=true ``` Yaml ``` spring: mvc: contentnegotiation: favor-path-extension: true pathmatch: use-registered-suffix-pattern: true ``` As of Spring Framework 5.3, Spring MVC supports several implementation strategies for matching request paths to Controller handlers. 
Previously, it supported only the `AntPathMatcher` strategy, but it now also offers `PathPatternParser`. Spring Boot now provides a configuration property to choose and opt in to the new strategy: Properties ``` spring.mvc.pathmatch.matching-strategy=path-pattern-parser ``` Yaml ``` spring: mvc: pathmatch: matching-strategy: "path-pattern-parser" ``` For more details on why you should consider this new implementation, see the [dedicated blog post](https://spring.io/blog/2020/06/30/url-matching-with-pathpattern-in-spring-mvc). | | | | --- | --- | | | `PathPatternParser` is an optimized implementation but restricts usage of [some path pattern variants](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-ann-requestmapping-uri-templates) and is incompatible with suffix pattern matching (`spring.mvc.pathmatch.use-suffix-pattern`, `spring.mvc.pathmatch.use-registered-suffix-pattern`) or mapping the `DispatcherServlet` with a servlet prefix (`spring.mvc.servlet.path`). | #### 1.1.7. ConfigurableWebBindingInitializer Spring MVC uses a `WebBindingInitializer` to initialize a `WebDataBinder` for a particular request. If you create your own `ConfigurableWebBindingInitializer` `@Bean`, Spring Boot automatically configures Spring MVC to use it. #### 1.1.8. Template Engines As well as REST web services, you can also use Spring MVC to serve dynamic HTML content. Spring MVC supports a variety of templating technologies, including Thymeleaf, FreeMarker, and JSPs. Also, many other templating engines include their own Spring MVC integrations. Spring Boot includes auto-configuration support for the following templating engines: * [FreeMarker](https://freemarker.apache.org/docs/) * [Groovy](https://docs.groovy-lang.org/docs/next/html/documentation/template-engines.html#_the_markuptemplateengine) * [Thymeleaf](https://www.thymeleaf.org) * [Mustache](https://mustache.github.io/) | | | | --- | --- | | | If possible, JSPs should be avoided. There are several [known limitations](#web.servlet.embedded-container.jsp-limitations) when using them with embedded servlet containers. | When you use one of these templating engines with the default configuration, your templates are picked up automatically from `src/main/resources/templates`. | | | | --- | --- | | | Depending on how you run your application, your IDE may order the classpath differently. Running your application in the IDE from its main method results in a different ordering than when you run your application by using Maven or Gradle or from its packaged jar. This can cause Spring Boot to fail to find the expected template. If you have this problem, you can reorder the classpath in the IDE to place the module’s classes and resources first. | #### 1.1.9. Error Handling By default, Spring Boot provides an `/error` mapping that handles all errors in a sensible way, and it is registered as a “global” error page in the servlet container. For machine clients, it produces a JSON response with details of the error, the HTTP status, and the exception message. For browser clients, there is a “whitelabel” error view that renders the same data in HTML format (to customize it, add a `View` that resolves to `error`). There are a number of `server.error` properties that can be set if you want to customize the default error handling behavior. See the [“Server Properties”](application-properties#appendix.application-properties.server) section of the Appendix.
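For example (a minimal illustration using properties from the `server.error` namespace; pick values that are appropriate for your environment), the error path can be relocated and the exception message and binding errors included in the error response:

Properties

```
server.error.path=/error
server.error.include-message=always
server.error.include-binding-errors=always
```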
To replace the default behavior completely, you can implement `ErrorController` and register a bean definition of that type or add a bean of type `ErrorAttributes` to use the existing mechanism but replace the contents. | | | | --- | --- | | | The `BasicErrorController` can be used as a base class for a custom `ErrorController`. This is particularly useful if you want to add a handler for a new content type (the default is to handle `text/html` specifically and provide a fallback for everything else). To do so, extend `BasicErrorController`, add a public method with a `@RequestMapping` that has a `produces` attribute, and create a bean of your new type. | You can also define a class annotated with `@ControllerAdvice` to customize the JSON document to return for a particular controller and/or exception type, as shown in the following example: Java ``` import javax.servlet.RequestDispatcher; import javax.servlet.http.HttpServletRequest; import org.springframework.http.HttpStatus; import org.springframework.http.ResponseEntity; import org.springframework.web.bind.annotation.ControllerAdvice; import org.springframework.web.bind.annotation.ExceptionHandler; import org.springframework.web.bind.annotation.ResponseBody; import org.springframework.web.servlet.mvc.method.annotation.ResponseEntityExceptionHandler; @ControllerAdvice(basePackageClasses = SomeController.class) public class MyControllerAdvice extends ResponseEntityExceptionHandler { @ResponseBody @ExceptionHandler(MyException.class) public ResponseEntity<?> handleControllerException(HttpServletRequest request, Throwable ex) { HttpStatus status = getStatus(request); return new ResponseEntity<>(new MyErrorBody(status.value(), ex.getMessage()), status); } private HttpStatus getStatus(HttpServletRequest request) { Integer code = (Integer) request.getAttribute(RequestDispatcher.ERROR_STATUS_CODE); HttpStatus status = HttpStatus.resolve(code); return (status != null) ? status : HttpStatus.INTERNAL_SERVER_ERROR; } } ``` Kotlin ``` import org.springframework.http.HttpStatus import org.springframework.http.ResponseEntity import org.springframework.web.bind.annotation.ControllerAdvice import org.springframework.web.bind.annotation.ExceptionHandler import org.springframework.web.bind.annotation.ResponseBody import org.springframework.web.servlet.mvc.method.annotation.ResponseEntityExceptionHandler import javax.servlet.RequestDispatcher import javax.servlet.http.HttpServletRequest @ControllerAdvice(basePackageClasses = [SomeController::class]) class MyControllerAdvice : ResponseEntityExceptionHandler() { @ResponseBody @ExceptionHandler(MyException::class) fun handleControllerException(request: HttpServletRequest, ex: Throwable): ResponseEntity<*> { val status = getStatus(request) return ResponseEntity(MyErrorBody(status.value(), ex.message), status) } private fun getStatus(request: HttpServletRequest): HttpStatus { val code = request.getAttribute(RequestDispatcher.ERROR_STATUS_CODE) as Int val status = HttpStatus.resolve(code) return status ?: HttpStatus.INTERNAL_SERVER_ERROR } } ``` In the preceding example, if `MyException` is thrown by a controller defined in the same package as `SomeController`, a JSON representation of the `MyErrorBody` POJO is used instead of the `ErrorAttributes` representation. In some cases, errors handled at the controller level are not recorded by the [metrics infrastructure](actuator#actuator.metrics.supported.spring-mvc).
Applications can ensure that such exceptions are recorded with the request metrics by setting the handled exception as a request attribute: Java ``` import javax.servlet.http.HttpServletRequest; import org.springframework.boot.web.servlet.error.ErrorAttributes; import org.springframework.stereotype.Controller; import org.springframework.web.bind.annotation.ExceptionHandler; @Controller public class MyController { @ExceptionHandler(CustomException.class) String handleCustomException(HttpServletRequest request, CustomException ex) { request.setAttribute(ErrorAttributes.ERROR_ATTRIBUTE, ex); return "errorView"; } } ``` Kotlin ``` import org.springframework.boot.web.servlet.error.ErrorAttributes import org.springframework.stereotype.Controller import org.springframework.web.bind.annotation.ExceptionHandler import javax.servlet.http.HttpServletRequest @Controller class MyController { @ExceptionHandler(CustomException::class) fun handleCustomException(request: HttpServletRequest, ex: CustomException?): String { request.setAttribute(ErrorAttributes.ERROR_ATTRIBUTE, ex) return "errorView" } } ``` ##### Custom Error Pages If you want to display a custom HTML error page for a given status code, you can add a file to an `/error` directory. Error pages can either be static HTML (that is, added under any of the static resource directories) or be built by using templates. The name of the file should be the exact status code or a series mask. For example, to map `404` to a static HTML file, your directory structure would be as follows: ``` src/ +- main/ +- java/ | + <source code> +- resources/ +- public/ +- error/ | +- 404.html +- <other public assets> ``` To map all `5xx` errors by using a FreeMarker template, your directory structure would be as follows: ``` src/ +- main/ +- java/ | + <source code> +- resources/ +- templates/ +- error/ | +- 5xx.ftlh +- <other templates> ``` For more complex mappings, you can also add beans that implement the `ErrorViewResolver` interface, as shown in the following example: Java ``` import java.util.Map; import javax.servlet.http.HttpServletRequest; import org.springframework.boot.autoconfigure.web.servlet.error.ErrorViewResolver; import org.springframework.http.HttpStatus; import org.springframework.web.servlet.ModelAndView; public class MyErrorViewResolver implements ErrorViewResolver { @Override public ModelAndView resolveErrorView(HttpServletRequest request, HttpStatus status, Map<String, Object> model) { // Use the request or status to optionally return a ModelAndView if (status == HttpStatus.INSUFFICIENT_STORAGE) { // We could add custom model values here return new ModelAndView("myview"); } return null; } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.web.servlet.error.ErrorViewResolver import org.springframework.http.HttpStatus import org.springframework.web.servlet.ModelAndView import javax.servlet.http.HttpServletRequest class MyErrorViewResolver : ErrorViewResolver { override fun resolveErrorView(request: HttpServletRequest, status: HttpStatus, model: Map<String, Any>): ModelAndView?
{ // Use the request or status to optionally return a ModelAndView if (status == HttpStatus.INSUFFICIENT\_STORAGE) { // We could add custom model values here return ModelAndView("myview") } return null } } ``` You can also use regular Spring MVC features such as [`@ExceptionHandler` methods](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-exceptionhandlers) and [`@ControllerAdvice`](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-ann-controller-advice). The `ErrorController` then picks up any unhandled exceptions. ##### Mapping Error Pages outside of Spring MVC For applications that do not use Spring MVC, you can use the `ErrorPageRegistrar` interface to directly register `ErrorPages`. This abstraction works directly with the underlying embedded servlet container and works even if you do not have a Spring MVC `DispatcherServlet`. Java ``` import org.springframework.boot.web.server.ErrorPage; import org.springframework.boot.web.server.ErrorPageRegistrar; import org.springframework.boot.web.server.ErrorPageRegistry; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.http.HttpStatus; @Configuration(proxyBeanMethods = false) public class MyErrorPagesConfiguration { @Bean public ErrorPageRegistrar errorPageRegistrar() { return this::registerErrorPages; } private void registerErrorPages(ErrorPageRegistry registry) { registry.addErrorPages(new ErrorPage(HttpStatus.BAD\_REQUEST, "/400")); } } ``` Kotlin ``` import org.springframework.boot.web.server.ErrorPage import org.springframework.boot.web.server.ErrorPageRegistrar import org.springframework.boot.web.server.ErrorPageRegistry import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.http.HttpStatus @Configuration(proxyBeanMethods = false) class MyErrorPagesConfiguration { @Bean fun errorPageRegistrar(): ErrorPageRegistrar { return ErrorPageRegistrar { registry: ErrorPageRegistry -> registerErrorPages(registry) } } private fun registerErrorPages(registry: ErrorPageRegistry) { registry.addErrorPages(ErrorPage(HttpStatus.BAD\_REQUEST, "/400")) } } ``` | | | | --- | --- | | | If you register an `ErrorPage` with a path that ends up being handled by a `Filter` (as is common with some non-Spring web frameworks, like Jersey and Wicket), then the `Filter` has to be explicitly registered as an `ERROR` dispatcher, as shown in the following example: | Java ``` import java.util.EnumSet; import javax.servlet.DispatcherType; import org.springframework.boot.web.servlet.FilterRegistrationBean; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyFilterConfiguration { @Bean public FilterRegistrationBean<MyFilter> myFilter() { FilterRegistrationBean<MyFilter> registration = new FilterRegistrationBean<>(new MyFilter()); // ... 
registration.setDispatcherTypes(EnumSet.allOf(DispatcherType.class)); return registration; } } ``` Kotlin ``` import org.springframework.boot.web.servlet.FilterRegistrationBean import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import java.util.EnumSet import javax.servlet.DispatcherType @Configuration(proxyBeanMethods = false) class MyFilterConfiguration { @Bean fun myFilter(): FilterRegistrationBean<MyFilter> { val registration = FilterRegistrationBean(MyFilter()) // ... registration.setDispatcherTypes(EnumSet.allOf(DispatcherType::class.java)) return registration } } ``` Note that the default `FilterRegistrationBean` does not include the `ERROR` dispatcher type. ##### Error handling in a war deployment When deployed to a servlet container, Spring Boot uses its error page filter to forward a request with an error status to the appropriate error page. This is necessary as the servlet specification does not provide an API for registering error pages. Depending on the container that you are deploying your war file to and the technologies that your application uses, some additional configuration may be required. The error page filter can only forward the request to the correct error page if the response has not already been committed. By default, WebSphere Application Server 8.0 and later commits the response upon successful completion of a servlet’s service method. You should disable this behavior by setting `com.ibm.ws.webcontainer.invokeFlushAfterService` to `false`. If you are using Spring Security and want to access the principal in an error page, you must configure Spring Security’s filter to be invoked on error dispatches. To do so, set the `spring.security.filter.dispatcher-types` property to `async, error, forward, request`. #### 1.1.10. CORS Support [Cross-origin resource sharing](https://en.wikipedia.org/wiki/Cross-origin_resource_sharing) (CORS) is a [W3C specification](https://www.w3.org/TR/cors/) implemented by [most browsers](https://caniuse.com/#feat=cors) that lets you specify in a flexible way what kind of cross-domain requests are authorized, instead of using some less secure and less powerful approaches such as IFRAME or JSONP. As of version 4.2, Spring MVC [supports CORS](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-cors). Using [controller method CORS configuration](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-cors-controller) with [`@CrossOrigin`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/web/bind/annotation/CrossOrigin.html) annotations in your Spring Boot application does not require any specific configuration. 
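For illustration, a minimal sketch of method-level CORS configuration (the controller, path, and origin below are hypothetical):

Java

```java
import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MyAccountController {

	// Cross-origin requests from the listed origin are allowed for this handler only.
	@CrossOrigin(origins = "https://example.com", maxAge = 3600)
	@GetMapping("/accounts/{id}")
	public String retrieve(@PathVariable Long id) {
		return "account " + id;
	}

}
```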
[Global CORS configuration](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-cors-global) can be defined by registering a `WebMvcConfigurer` bean with a customized `addCorsMappings(CorsRegistry)` method, as shown in the following example: Java ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.web.servlet.config.annotation.CorsRegistry; import org.springframework.web.servlet.config.annotation.WebMvcConfigurer; @Configuration(proxyBeanMethods = false) public class MyCorsConfiguration { @Bean public WebMvcConfigurer corsConfigurer() { return new WebMvcConfigurer() { @Override public void addCorsMappings(CorsRegistry registry) { registry.addMapping("/api/\*\*"); } }; } } ``` Kotlin ``` import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.web.servlet.config.annotation.CorsRegistry import org.springframework.web.servlet.config.annotation.WebMvcConfigurer @Configuration(proxyBeanMethods = false) class MyCorsConfiguration { @Bean fun corsConfigurer(): WebMvcConfigurer { return object : WebMvcConfigurer { override fun addCorsMappings(registry: CorsRegistry) { registry.addMapping("/api/\*\*") } } } } ``` ### 1.2. JAX-RS and Jersey If you prefer the JAX-RS programming model for REST endpoints, you can use one of the available implementations instead of Spring MVC. [Jersey](https://jersey.github.io/) and [Apache CXF](https://cxf.apache.org/) work quite well out of the box. CXF requires you to register its `Servlet` or `Filter` as a `@Bean` in your application context. Jersey has some native Spring support, so we also provide auto-configuration support for it in Spring Boot, together with a starter. To get started with Jersey, include the `spring-boot-starter-jersey` as a dependency and then you need one `@Bean` of type `ResourceConfig` in which you register all the endpoints, as shown in the following example: Java ``` import org.glassfish.jersey.server.ResourceConfig; import org.springframework.stereotype.Component; @Component public class MyJerseyConfig extends ResourceConfig { public MyJerseyConfig() { register(MyEndpoint.class); } } ``` Kotlin ``` import org.glassfish.jersey.server.ResourceConfig import org.springframework.stereotype.Component @Component class MyJerseyConfig : ResourceConfig() { init { register(MyEndpoint::class.java) } } ``` | | | | --- | --- | | | Jersey’s support for scanning executable archives is rather limited. For example, it cannot scan for endpoints in a package found in a [fully executable jar file](deployment#deployment.installing) or in `WEB-INF/classes` when running an executable war file. To avoid this limitation, the `packages` method should not be used, and endpoints should be registered individually by using the `register` method, as shown in the preceding example. | For more advanced customizations, you can also register an arbitrary number of beans that implement `ResourceConfigCustomizer`. 
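A minimal sketch of such a customizer is shown below (the `MyReaderWriterProvider` class registered here is hypothetical; register your own providers or resources instead):

Java

```java
import org.glassfish.jersey.server.ResourceConfig;
import org.springframework.boot.autoconfigure.jersey.ResourceConfigCustomizer;
import org.springframework.stereotype.Component;

@Component
public class MyResourceConfigCustomizer implements ResourceConfigCustomizer {

	@Override
	public void customize(ResourceConfig config) {
		// Register additional JAX-RS providers or resources on the auto-configured ResourceConfig
		config.register(MyReaderWriterProvider.class);
	}

}
```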
All the registered endpoints should be `@Components` with HTTP resource annotations (`@GET` and others), as shown in the following example: Java ``` import javax.ws.rs.GET; import javax.ws.rs.Path; import org.springframework.stereotype.Component; @Component @Path("/hello") public class MyEndpoint { @GET public String message() { return "Hello"; } } ``` Kotlin ``` import org.springframework.stereotype.Component import javax.ws.rs.GET import javax.ws.rs.Path @Component @Path("/hello") class MyEndpoint { @GET fun message(): String { return "Hello" } } ``` Since the `Endpoint` is a Spring `@Component`, its lifecycle is managed by Spring and you can use the `@Autowired` annotation to inject dependencies and use the `@Value` annotation to inject external configuration. By default, the Jersey servlet is registered and mapped to `/*`. You can change the mapping by adding `@ApplicationPath` to your `ResourceConfig`. By default, Jersey is set up as a servlet in a `@Bean` of type `ServletRegistrationBean` named `jerseyServletRegistration`. By default, the servlet is initialized lazily, but you can customize that behavior by setting `spring.jersey.servlet.load-on-startup`. You can disable or override that bean by creating one of your own with the same name. You can also use a filter instead of a servlet by setting `spring.jersey.type=filter` (in which case, the `@Bean` to replace or override is `jerseyFilterRegistration`). The filter has an `@Order`, which you can set with `spring.jersey.filter.order`. When using Jersey as a filter, a servlet that will handle any requests that are not intercepted by Jersey must be present. If your application does not contain such a servlet, you may want to enable the default servlet by setting `server.servlet.register-default-servlet` to `true`. Both the servlet and the filter registrations can be given init parameters by using `spring.jersey.init.*` to specify a map of properties. ### 1.3. Embedded Servlet Container Support For servlet application, Spring Boot includes support for embedded [Tomcat](https://tomcat.apache.org/), [Jetty](https://www.eclipse.org/jetty/), and [Undertow](https://github.com/undertow-io/undertow) servers. Most developers use the appropriate “Starter” to obtain a fully configured instance. By default, the embedded server listens for HTTP requests on port `8080`. #### 1.3.1. Servlets, Filters, and listeners When using an embedded servlet container, you can register servlets, filters, and all the listeners (such as `HttpSessionListener`) from the servlet spec, either by using Spring beans or by scanning for servlet components. ##### Registering Servlets, Filters, and Listeners as Spring Beans Any `Servlet`, `Filter`, or servlet `*Listener` instance that is a Spring bean is registered with the embedded container. This can be particularly convenient if you want to refer to a value from your `application.properties` during configuration. By default, if the context contains only a single Servlet, it is mapped to `/`. In the case of multiple servlet beans, the bean name is used as a path prefix. Filters map to `/*`. If convention-based mapping is not flexible enough, you can use the `ServletRegistrationBean`, `FilterRegistrationBean`, and `ServletListenerRegistrationBean` classes for complete control. It is usually safe to leave filter beans unordered. If a specific order is required, you should annotate the `Filter` with `@Order` or make it implement `Ordered`. 
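For illustration, a minimal sketch (using a hypothetical `MyFilter` implementation of `javax.servlet.Filter`) that takes complete control of a filter registration, including its URL pattern and order:

Java

```java
import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.Ordered;

@Configuration(proxyBeanMethods = false)
public class MyFilterRegistrationConfiguration {

	@Bean
	public FilterRegistrationBean<MyFilter> myFilterRegistration() {
		FilterRegistrationBean<MyFilter> registration = new FilterRegistrationBean<>(new MyFilter());
		// Map the filter to a specific path rather than the default /*
		registration.addUrlPatterns("/api/*");
		// The order is set on the registration bean, not with @Order on the bean method
		registration.setOrder(Ordered.LOWEST_PRECEDENCE - 10);
		return registration;
	}

}
```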
You cannot configure the order of a `Filter` by annotating its bean method with `@Order`. If you cannot change the `Filter` class to add `@Order` or implement `Ordered`, you must define a `FilterRegistrationBean` for the `Filter` and set the registration bean’s order using the `setOrder(int)` method. Avoid configuring a filter that reads the request body at `Ordered.HIGHEST_PRECEDENCE`, since it might go against the character encoding configuration of your application. If a servlet filter wraps the request, it should be configured with an order that is less than or equal to `OrderedFilter.REQUEST_WRAPPER_FILTER_MAX_ORDER`. | | | | --- | --- | | | To see the order of every `Filter` in your application, enable debug level logging for the `web` [logging group](features#features.logging.log-groups) (`logging.level.web=debug`). Details of the registered filters, including their order and URL patterns, will then be logged at startup. | | | | | --- | --- | | | Take care when registering `Filter` beans since they are initialized very early in the application lifecycle. If you need to register a `Filter` that interacts with other beans, consider using a [`DelegatingFilterProxyRegistrationBean`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/web/servlet/DelegatingFilterProxyRegistrationBean.html) instead. | #### 1.3.2. Servlet Context Initialization Embedded servlet containers do not directly execute the servlet 3.0+ `javax.servlet.ServletContainerInitializer` interface or Spring’s `org.springframework.web.WebApplicationInitializer` interface. This is an intentional design decision intended to reduce the risk that third party libraries designed to run inside a war may break Spring Boot applications. If you need to perform servlet context initialization in a Spring Boot application, you should register a bean that implements the `org.springframework.boot.web.servlet.ServletContextInitializer` interface. The single `onStartup` method provides access to the `ServletContext` and, if necessary, can easily be used as an adapter to an existing `WebApplicationInitializer`. ##### Scanning for Servlets, Filters, and listeners When using an embedded container, automatic registration of classes annotated with `@WebServlet`, `@WebFilter`, and `@WebListener` can be enabled by using `@ServletComponentScan`. | | | | --- | --- | | | `@ServletComponentScan` has no effect in a standalone container, where the container’s built-in discovery mechanisms are used instead. | #### 1.3.3. The ServletWebServerApplicationContext Under the hood, Spring Boot uses a different type of `ApplicationContext` for embedded servlet container support. The `ServletWebServerApplicationContext` is a special type of `WebApplicationContext` that bootstraps itself by searching for a single `ServletWebServerFactory` bean. Usually a `TomcatServletWebServerFactory`, `JettyServletWebServerFactory`, or `UndertowServletWebServerFactory` has been auto-configured. | | | | --- | --- | | | You usually do not need to be aware of these implementation classes. Most applications are auto-configured, and the appropriate `ApplicationContext` and `ServletWebServerFactory` are created on your behalf. | In an embedded container setup, the `ServletContext` is set as part of server startup which happens during application context initialization. Because of this beans in the `ApplicationContext` cannot be reliably initialized with a `ServletContext`. 
One way to get around this is to inject `ApplicationContext` as a dependency of the bean and access the `ServletContext` only when it is needed. Another way is to use a callback once the server has started. This can be done using an `ApplicationListener` which listens for the `ApplicationStartedEvent` as follows: ``` import javax.servlet.ServletContext; import org.springframework.boot.context.event.ApplicationStartedEvent; import org.springframework.context.ApplicationContext; import org.springframework.context.ApplicationListener; import org.springframework.web.context.WebApplicationContext; public class MyDemoBean implements ApplicationListener<ApplicationStartedEvent> { private ServletContext servletContext; @Override public void onApplicationEvent(ApplicationStartedEvent event) { ApplicationContext applicationContext = event.getApplicationContext(); this.servletContext = ((WebApplicationContext) applicationContext).getServletContext(); } } ``` #### 1.3.4. Customizing Embedded Servlet Containers Common servlet container settings can be configured by using Spring `Environment` properties. Usually, you would define the properties in your `application.properties` or `application.yaml` file. Common server settings include: * Network settings: Listen port for incoming HTTP requests (`server.port`), interface address to bind to `server.address`, and so on. * Session settings: Whether the session is persistent (`server.servlet.session.persistent`), session timeout (`server.servlet.session.timeout`), location of session data (`server.servlet.session.store-dir`), and session-cookie configuration (`server.servlet.session.cookie.*`). * Error management: Location of the error page (`server.error.path`) and so on. * [SSL](howto#howto.webserver.configure-ssl) * [HTTP compression](howto#howto.webserver.enable-response-compression) Spring Boot tries as much as possible to expose common settings, but this is not always possible. For those cases, dedicated namespaces offer server-specific customizations (see `server.tomcat` and `server.undertow`). For instance, [access logs](howto#howto.webserver.configure-access-logs) can be configured with specific features of the embedded servlet container. | | | | --- | --- | | | See the [`ServerProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/ServerProperties.java) class for a complete list. | ##### SameSite Cookies The `SameSite` cookie attribute can be used by web browsers to control if and how cookies are submitted in cross-site requests. The attribute is particularly relevant for modern web browsers which have started to change the default value that is used when the attribute is missing. If you want to change the `SameSite` attribute of your session cookie, you can use the `server.servlet.session.cookie.same-site` property. This property is supported by auto-configured Tomcat, Jetty and Undertow servers. It is also used to configure Spring Session servlet based `SessionRepository` beans. For example, if you want your session cookie to have a `SameSite` attribute of `None`, you can add the following to your `application.properties` or `application.yaml` file: Properties ``` server.servlet.session.cookie.same-site=none ``` Yaml ``` server: servlet: session: cookie: same-site: "none" ``` If you want to change the `SameSite` attribute on other cookies added to your `HttpServletResponse`, you can use a `CookieSameSiteSupplier`. 
The `CookieSameSiteSupplier` is passed a `Cookie` and may return a `SameSite` value, or `null`. There are a number of convenience factory and filter methods that you can use to quickly match specific cookies. For example, adding the following bean will automatically apply a `SameSite` of `Lax` for all cookies with a name that matches the regular expression `myapp.*`. Java ``` import org.springframework.boot.web.servlet.server.CookieSameSiteSupplier; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MySameSiteConfiguration { @Bean public CookieSameSiteSupplier applicationCookieSameSiteSupplier() { return CookieSameSiteSupplier.ofLax().whenHasNameMatching("myapp.\*"); } } ``` Kotlin ``` import org.springframework.boot.web.servlet.server.CookieSameSiteSupplier import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MySameSiteConfiguration { @Bean fun applicationCookieSameSiteSupplier(): CookieSameSiteSupplier { return CookieSameSiteSupplier.ofLax().whenHasNameMatching("myapp.\*") } } ``` ##### Programmatic Customization If you need to programmatically configure your embedded servlet container, you can register a Spring bean that implements the `WebServerFactoryCustomizer` interface. `WebServerFactoryCustomizer` provides access to the `ConfigurableServletWebServerFactory`, which includes numerous customization setter methods. The following example shows programmatically setting the port: Java ``` import org.springframework.boot.web.server.WebServerFactoryCustomizer; import org.springframework.boot.web.servlet.server.ConfigurableServletWebServerFactory; import org.springframework.stereotype.Component; @Component public class MyWebServerFactoryCustomizer implements WebServerFactoryCustomizer<ConfigurableServletWebServerFactory> { @Override public void customize(ConfigurableServletWebServerFactory server) { server.setPort(9000); } } ``` Kotlin ``` import org.springframework.boot.web.server.WebServerFactoryCustomizer import org.springframework.boot.web.servlet.server.ConfigurableServletWebServerFactory import org.springframework.stereotype.Component @Component class MyWebServerFactoryCustomizer : WebServerFactoryCustomizer<ConfigurableServletWebServerFactory> { override fun customize(server: ConfigurableServletWebServerFactory) { server.setPort(9000) } } ``` `TomcatServletWebServerFactory`, `JettyServletWebServerFactory` and `UndertowServletWebServerFactory` are dedicated variants of `ConfigurableServletWebServerFactory` that have additional customization setter methods for Tomcat, Jetty and Undertow respectively. 
The following example shows how to customize `TomcatServletWebServerFactory` that provides access to Tomcat-specific configuration options: Java ``` import java.time.Duration; import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory; import org.springframework.boot.web.server.WebServerFactoryCustomizer; import org.springframework.stereotype.Component; @Component public class MyTomcatWebServerFactoryCustomizer implements WebServerFactoryCustomizer<TomcatServletWebServerFactory> { @Override public void customize(TomcatServletWebServerFactory server) { server.addConnectorCustomizers((connector) -> connector.setAsyncTimeout(Duration.ofSeconds(20).toMillis())); } } ``` Kotlin ``` import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory import org.springframework.boot.web.server.WebServerFactoryCustomizer import org.springframework.stereotype.Component import java.time.Duration @Component class MyTomcatWebServerFactoryCustomizer : WebServerFactoryCustomizer<TomcatServletWebServerFactory> { override fun customize(server: TomcatServletWebServerFactory) { server.addConnectorCustomizers({ connector -> connector.asyncTimeout = Duration.ofSeconds(20).toMillis() }) } } ``` ##### Customizing ConfigurableServletWebServerFactory Directly For more advanced use cases that require you to extend from `ServletWebServerFactory`, you can expose a bean of such type yourself. Setters are provided for many configuration options. Several protected method “hooks” are also provided should you need to do something more exotic. See the [source code documentation](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/web/servlet/server/ConfigurableServletWebServerFactory.html) for details. | | | | --- | --- | | | Auto-configured customizers are still applied on your custom factory, so use that option carefully. | #### 1.3.5. JSP Limitations When running a Spring Boot application that uses an embedded servlet container (and is packaged as an executable archive), there are some limitations in the JSP support. * With Jetty and Tomcat, it should work if you use war packaging. An executable war will work when launched with `java -jar`, and will also be deployable to any standard container. JSPs are not supported when using an executable jar. * Undertow does not support JSPs. * Creating a custom `error.jsp` page does not override the default view for [error handling](#web.servlet.spring-mvc.error-handling). [Custom error pages](#web.servlet.spring-mvc.error-handling.error-pages) should be used instead. 2. Reactive Web Applications ----------------------------- Spring Boot simplifies development of reactive web applications by providing auto-configuration for Spring Webflux. ### 2.1. The “Spring WebFlux Framework” Spring WebFlux is the new reactive web framework introduced in Spring Framework 5.0. Unlike Spring MVC, it does not require the servlet API, is fully asynchronous and non-blocking, and implements the [Reactive Streams](https://www.reactive-streams.org/) specification through [the Reactor project](https://projectreactor.io/). Spring WebFlux comes in two flavors: functional and annotation-based. 
The annotation-based one is quite close to the Spring MVC model, as shown in the following example: Java ``` import reactor.core.publisher.Flux; import reactor.core.publisher.Mono; import org.springframework.web.bind.annotation.DeleteMapping; import org.springframework.web.bind.annotation.GetMapping; import org.springframework.web.bind.annotation.PathVariable; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RestController; @RestController @RequestMapping("/users") public class MyRestController { private final UserRepository userRepository; private final CustomerRepository customerRepository; public MyRestController(UserRepository userRepository, CustomerRepository customerRepository) { this.userRepository = userRepository; this.customerRepository = customerRepository; } @GetMapping("/{userId}") public Mono<User> getUser(@PathVariable Long userId) { return this.userRepository.findById(userId); } @GetMapping("/{userId}/customers") public Flux<Customer> getUserCustomers(@PathVariable Long userId) { return this.userRepository.findById(userId).flatMapMany(this.customerRepository::findByUser); } @DeleteMapping("/{userId}") public Mono<Void> deleteUser(@PathVariable Long userId) { return this.userRepository.deleteById(userId); } } ``` Kotlin ``` import org.springframework.web.bind.annotation.DeleteMapping import org.springframework.web.bind.annotation.GetMapping import org.springframework.web.bind.annotation.PathVariable import org.springframework.web.bind.annotation.RequestMapping import org.springframework.web.bind.annotation.RestController import reactor.core.publisher.Flux import reactor.core.publisher.Mono @RestController @RequestMapping("/users") class MyRestController(private val userRepository: UserRepository, private val customerRepository: CustomerRepository) { @GetMapping("/{userId}") fun getUser(@PathVariable userId: Long): Mono<User?> { return userRepository.findById(userId) } @GetMapping("/{userId}/customers") fun getUserCustomers(@PathVariable userId: Long): Flux<Customer> { return userRepository.findById(userId).flatMapMany { user: User? 
-> customerRepository.findByUser(user) } } @DeleteMapping("/{userId}") fun deleteUser(@PathVariable userId: Long): Mono<Void> { return userRepository.deleteById(userId) } } ``` “WebFlux.fn”, the functional variant, separates the routing configuration from the actual handling of the requests, as shown in the following example: Java ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.http.MediaType; import org.springframework.web.reactive.function.server.RequestPredicate; import org.springframework.web.reactive.function.server.RouterFunction; import org.springframework.web.reactive.function.server.ServerResponse; import static org.springframework.web.reactive.function.server.RequestPredicates.accept; import static org.springframework.web.reactive.function.server.RouterFunctions.route; @Configuration(proxyBeanMethods = false) public class MyRoutingConfiguration { private static final RequestPredicate ACCEPT\_JSON = accept(MediaType.APPLICATION\_JSON); @Bean public RouterFunction<ServerResponse> monoRouterFunction(MyUserHandler userHandler) { return route() .GET("/{user}", ACCEPT\_JSON, userHandler::getUser) .GET("/{user}/customers", ACCEPT\_JSON, userHandler::getUserCustomers) .DELETE("/{user}", ACCEPT\_JSON, userHandler::deleteUser) .build(); } } ``` Kotlin ``` import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.http.MediaType import org.springframework.web.reactive.function.server.RequestPredicates.DELETE import org.springframework.web.reactive.function.server.RequestPredicates.GET import org.springframework.web.reactive.function.server.RequestPredicates.accept import org.springframework.web.reactive.function.server.RouterFunction import org.springframework.web.reactive.function.server.RouterFunctions import org.springframework.web.reactive.function.server.ServerResponse @Configuration(proxyBeanMethods = false) class MyRoutingConfiguration { @Bean fun monoRouterFunction(userHandler: MyUserHandler): RouterFunction<ServerResponse> { return RouterFunctions.route( GET("/{user}").and(ACCEPT\_JSON), userHandler::getUser).andRoute( GET("/{user}/customers").and(ACCEPT\_JSON), userHandler::getUserCustomers).andRoute( DELETE("/{user}").and(ACCEPT\_JSON), userHandler::deleteUser) } companion object { private val ACCEPT\_JSON = accept(MediaType.APPLICATION\_JSON) } } ``` Java ``` import reactor.core.publisher.Mono; import org.springframework.stereotype.Component; import org.springframework.web.reactive.function.server.ServerRequest; import org.springframework.web.reactive.function.server.ServerResponse; @Component public class MyUserHandler { public Mono<ServerResponse> getUser(ServerRequest request) { ... } public Mono<ServerResponse> getUserCustomers(ServerRequest request) { ... } public Mono<ServerResponse> deleteUser(ServerRequest request) { ... 
} } ``` Kotlin ``` import org.springframework.stereotype.Component import org.springframework.web.reactive.function.server.ServerRequest import org.springframework.web.reactive.function.server.ServerResponse import reactor.core.publisher.Mono @Component class MyUserHandler { fun getUser(request: ServerRequest?): Mono<ServerResponse> { return ServerResponse.ok().build() } fun getUserCustomers(request: ServerRequest?): Mono<ServerResponse> { return ServerResponse.ok().build() } fun deleteUser(request: ServerRequest?): Mono<ServerResponse> { return ServerResponse.ok().build() } } ``` WebFlux is part of the Spring Framework and detailed information is available in its [reference documentation](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html#webflux-fn). | | | | --- | --- | | | You can define as many `RouterFunction` beans as you like to modularize the definition of the router. Beans can be ordered if you need to apply a precedence. | To get started, add the `spring-boot-starter-webflux` module to your application. | | | | --- | --- | | | Adding both `spring-boot-starter-web` and `spring-boot-starter-webflux` modules in your application results in Spring Boot auto-configuring Spring MVC, not WebFlux. This behavior has been chosen because many Spring developers add `spring-boot-starter-webflux` to their Spring MVC application to use the reactive `WebClient`. You can still enforce your choice by setting the chosen application type to `SpringApplication.setWebApplicationType(WebApplicationType.REACTIVE)`. | #### 2.1.1. Spring WebFlux Auto-configuration Spring Boot provides auto-configuration for Spring WebFlux that works well with most applications. The auto-configuration adds the following features on top of Spring’s defaults: * Configuring codecs for `HttpMessageReader` and `HttpMessageWriter` instances (described [later in this document](#web.reactive.webflux.httpcodecs)). * Support for serving static resources, including support for WebJars (described [later in this document](#web.servlet.spring-mvc.static-content)). If you want to keep Spring Boot WebFlux features and you want to add additional [WebFlux configuration](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html#webflux-config), you can add your own `@Configuration` class of type `WebFluxConfigurer` but **without** `@EnableWebFlux`. If you want to take complete control of Spring WebFlux, you can add your own `@Configuration` annotated with `@EnableWebFlux`. #### 2.1.2. HTTP Codecs with HttpMessageReaders and HttpMessageWriters Spring WebFlux uses the `HttpMessageReader` and `HttpMessageWriter` interfaces to convert HTTP requests and responses.
They are configured with `CodecConfigurer` to have sensible defaults by looking at the libraries available in your classpath. Spring Boot provides dedicated configuration properties for codecs, `spring.codec.*`. It also applies further customization by using `CodecCustomizer` instances. For example, `spring.jackson.*` configuration keys are applied to the Jackson codec. If you need to add or customize codecs, you can create a custom `CodecCustomizer` component, as shown in the following example: Java ``` import org.springframework.boot.web.codec.CodecCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.http.codec.ServerSentEventHttpMessageReader; @Configuration(proxyBeanMethods = false) public class MyCodecsConfiguration { @Bean public CodecCustomizer myCodecCustomizer() { return (configurer) -> { configurer.registerDefaults(false); configurer.customCodecs().register(new ServerSentEventHttpMessageReader()); // ... }; } } ``` Kotlin ``` import org.springframework.boot.web.codec.CodecCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.http.codec.CodecConfigurer import org.springframework.http.codec.ServerSentEventHttpMessageReader @Configuration(proxyBeanMethods = false) class MyCodecsConfiguration { @Bean fun myCodecCustomizer(): CodecCustomizer { return CodecCustomizer { configurer: CodecConfigurer -> configurer.registerDefaults(false) configurer.customCodecs().register(ServerSentEventHttpMessageReader()) } } } ``` You can also leverage [Boot’s custom JSON serializers and deserializers](features#features.json.jackson.custom-serializers-and-deserializers). #### 2.1.3. Static Content By default, Spring Boot serves static content from a directory called `/static` (or `/public` or `/resources` or `/META-INF/resources`) in the classpath. It uses the `ResourceWebHandler` from Spring WebFlux so that you can modify that behavior by adding your own `WebFluxConfigurer` and overriding the `addResourceHandlers` method. By default, resources are mapped on `/**`, but you can tune that by setting the `spring.webflux.static-path-pattern` property. For instance, relocating all resources to `/resources/**` can be achieved as follows: Properties ``` spring.webflux.static-path-pattern=/resources/** ``` Yaml ``` spring: webflux: static-path-pattern: "/resources/**" ``` You can also customize the static resource locations by using `spring.web.resources.static-locations`. Doing so replaces the default values with a list of directory locations. If you do so, the default welcome page detection switches to your custom locations. So, if there is an `index.html` in any of your locations on startup, it is the home page of the application. In addition to the “standard” static resource locations listed earlier, a special case is made for [Webjars content](https://www.webjars.org/). Any resources with a path in `/webjars/**` are served from jar files if they are packaged in the Webjars format. | | | | --- | --- | | | Spring WebFlux applications do not strictly depend on the servlet API, so they cannot be deployed as war files and do not use the `src/main/webapp` directory. | #### 2.1.4. Welcome Page Spring Boot supports both static and templated welcome pages. It first looks for an `index.html` file in the configured static content locations. If one is not found, it then looks for an `index` template. If either is found, it is automatically used as the welcome page of the application. #### 2.1.5.
Template Engines As well as REST web services, you can also use Spring WebFlux to serve dynamic HTML content. Spring WebFlux supports a variety of templating technologies, including Thymeleaf, FreeMarker, and Mustache. Spring Boot includes auto-configuration support for the following templating engines: * [FreeMarker](https://freemarker.apache.org/docs/) * [Thymeleaf](https://www.thymeleaf.org) * [Mustache](https://mustache.github.io/) When you use one of these templating engines with the default configuration, your templates are picked up automatically from `src/main/resources/templates`. #### 2.1.6. Error Handling Spring Boot provides a `WebExceptionHandler` that handles all errors in a sensible way. Its position in the processing order is immediately before the handlers provided by WebFlux, which are considered last. For machine clients, it produces a JSON response with details of the error, the HTTP status, and the exception message. For browser clients, there is a “whitelabel” error handler that renders the same data in HTML format. You can also provide your own HTML templates to display errors (see the [next section](#web.reactive.webflux.error-handling.error-pages)). The first step to customizing this feature often involves using the existing mechanism but replacing or augmenting the error contents. For that, you can add a bean of type `ErrorAttributes`. To change the error handling behavior, you can implement `ErrorWebExceptionHandler` and register a bean definition of that type. Because an `ErrorWebExceptionHandler` is quite low-level, Spring Boot also provides a convenient `AbstractErrorWebExceptionHandler` to let you handle errors in a WebFlux functional way, as shown in the following example: Java ``` import reactor.core.publisher.Mono; import org.springframework.boot.autoconfigure.web.WebProperties.Resources; import org.springframework.boot.autoconfigure.web.reactive.error.AbstractErrorWebExceptionHandler; import org.springframework.boot.web.reactive.error.ErrorAttributes; import org.springframework.context.ApplicationContext; import org.springframework.http.HttpStatus; import org.springframework.http.MediaType; import org.springframework.stereotype.Component; import org.springframework.web.reactive.function.server.RouterFunction; import org.springframework.web.reactive.function.server.RouterFunctions; import org.springframework.web.reactive.function.server.ServerRequest; import org.springframework.web.reactive.function.server.ServerResponse; import org.springframework.web.reactive.function.server.ServerResponse.BodyBuilder; @Component public class MyErrorWebExceptionHandler extends AbstractErrorWebExceptionHandler { public MyErrorWebExceptionHandler(ErrorAttributes errorAttributes, Resources resources, ApplicationContext applicationContext) { super(errorAttributes, resources, applicationContext); } @Override protected RouterFunction<ServerResponse> getRoutingFunction(ErrorAttributes errorAttributes) { return RouterFunctions.route(this::acceptsXml, this::handleErrorAsXml); } private boolean acceptsXml(ServerRequest request) { return request.headers().accept().contains(MediaType.APPLICATION\_XML); } public Mono<ServerResponse> handleErrorAsXml(ServerRequest request) { BodyBuilder builder = ServerResponse.status(HttpStatus.INTERNAL\_SERVER\_ERROR); // ... 
additional builder calls return builder.build(); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.web.WebProperties import org.springframework.boot.autoconfigure.web.reactive.error.AbstractErrorWebExceptionHandler import org.springframework.boot.web.reactive.error.ErrorAttributes import org.springframework.context.ApplicationContext import org.springframework.http.HttpStatus import org.springframework.http.MediaType import org.springframework.stereotype.Component import org.springframework.web.reactive.function.server.RouterFunction import org.springframework.web.reactive.function.server.RouterFunctions import org.springframework.web.reactive.function.server.ServerRequest import org.springframework.web.reactive.function.server.ServerResponse import reactor.core.publisher.Mono @Component class MyErrorWebExceptionHandler(errorAttributes: ErrorAttributes?, resources: WebProperties.Resources?, applicationContext: ApplicationContext?) : AbstractErrorWebExceptionHandler(errorAttributes, resources, applicationContext) { override fun getRoutingFunction(errorAttributes: ErrorAttributes): RouterFunction<ServerResponse> { return RouterFunctions.route(this::acceptsXml, this::handleErrorAsXml) } private fun acceptsXml(request: ServerRequest): Boolean { return request.headers().accept().contains(MediaType.APPLICATION\_XML) } fun handleErrorAsXml(request: ServerRequest?): Mono<ServerResponse> { val builder = ServerResponse.status(HttpStatus.INTERNAL\_SERVER\_ERROR) // ... additional builder calls return builder.build() } } ``` For a more complete picture, you can also subclass `DefaultErrorWebExceptionHandler` directly and override specific methods. In some cases, errors handled at the controller or handler function level are not recorded by the [metrics infrastructure](actuator#actuator.metrics.supported.spring-webflux). Applications can ensure that such exceptions are recorded with the request metrics by setting the handled exception as a request attribute: Java ``` import org.springframework.boot.web.reactive.error.ErrorAttributes; import org.springframework.stereotype.Controller; import org.springframework.web.bind.annotation.ExceptionHandler; import org.springframework.web.bind.annotation.GetMapping; import org.springframework.web.reactive.result.view.Rendering; import org.springframework.web.server.ServerWebExchange; @Controller public class MyExceptionHandlingController { @GetMapping("/profile") public Rendering userProfile() { // ... throw new IllegalStateException(); } @ExceptionHandler(IllegalStateException.class) public Rendering handleIllegalState(ServerWebExchange exchange, IllegalStateException exc) { exchange.getAttributes().putIfAbsent(ErrorAttributes.ERROR\_ATTRIBUTE, exc); return Rendering.view("errorView").modelAttribute("message", exc.getMessage()).build(); } } ``` Kotlin ``` import org.springframework.boot.web.reactive.error.ErrorAttributes import org.springframework.stereotype.Controller import org.springframework.web.bind.annotation.ExceptionHandler import org.springframework.web.bind.annotation.GetMapping import org.springframework.web.reactive.result.view.Rendering import org.springframework.web.server.ServerWebExchange @Controller class MyExceptionHandlingController { @GetMapping("/profile") fun userProfile(): Rendering { // ... 
throw IllegalStateException() } @ExceptionHandler(IllegalStateException::class) fun handleIllegalState(exchange: ServerWebExchange, exc: IllegalStateException): Rendering { exchange.attributes.putIfAbsent(ErrorAttributes.ERROR\_ATTRIBUTE, exc) return Rendering.view("errorView").modelAttribute("message", exc.message ?: "").build() } } ``` ##### Custom Error Pages If you want to display a custom HTML error page for a given status code, you can add a file to an `/error` directory. Error pages can either be static HTML (that is, added under any of the static resource directories) or built with templates. The name of the file should be the exact status code or a series mask. For example, to map `404` to a static HTML file, your directory structure would be as follows: ``` src/ +- main/ +- java/ | + <source code> +- resources/ +- public/ +- error/ | +- 404.html +- <other public assets> ``` To map all `5xx` errors by using a Mustache template, your directory structure would be as follows: ``` src/ +- main/ +- java/ | + <source code> +- resources/ +- templates/ +- error/ | +- 5xx.mustache +- <other templates> ``` #### 2.1.7. Web Filters Spring WebFlux provides a `WebFilter` interface that can be implemented to filter HTTP request-response exchanges. `WebFilter` beans found in the application context will be automatically used to filter each exchange. Where the order of the filters is important they can implement `Ordered` or be annotated with `@Order`. Spring Boot auto-configuration may configure web filters for you. When it does so, the orders shown in the following table will be used: | Web Filter | Order | | --- | --- | | `MetricsWebFilter` | `Ordered.HIGHEST_PRECEDENCE + 1` | | `WebFilterChainProxy` (Spring Security) | `-100` | | `HttpTraceWebFilter` | `Ordered.LOWEST_PRECEDENCE - 10` | ### 2.2. Embedded Reactive Server Support Spring Boot includes support for the following embedded reactive web servers: Reactor Netty, Tomcat, Jetty, and Undertow. Most developers use the appropriate “Starter” to obtain a fully configured instance. By default, the embedded server listens for HTTP requests on port 8080. ### 2.3. Reactive Server Resources Configuration When auto-configuring a Reactor Netty or Jetty server, Spring Boot will create specific beans that will provide HTTP resources to the server instance: `ReactorResourceFactory` or `JettyResourceFactory`. By default, those resources will be also shared with the Reactor Netty and Jetty clients for optimal performances, given: * the same technology is used for server and client * the client instance is built using the `WebClient.Builder` bean auto-configured by Spring Boot Developers can override the resource configuration for Jetty and Reactor Netty by providing a custom `ReactorResourceFactory` or `JettyResourceFactory` bean - this will be applied to both clients and servers. You can learn more about the resource configuration on the client side in the [WebClient Runtime section](io#io.rest-client.webclient.runtime). 3. Graceful Shutdown --------------------- Graceful shutdown is supported with all four embedded web servers (Jetty, Reactor Netty, Tomcat, and Undertow) and with both reactive and servlet-based web applications. It occurs as part of closing the application context and is performed in the earliest phase of stopping `SmartLifecycle` beans. This stop processing uses a timeout which provides a grace period during which existing requests will be allowed to complete but no new requests will be permitted. 
The exact way in which new requests are not permitted varies depending on the web server that is being used. Jetty, Reactor Netty, and Tomcat will stop accepting requests at the network layer. Undertow will accept requests but respond immediately with a service unavailable (503) response. | | | | --- | --- | | | Graceful shutdown with Tomcat requires Tomcat 9.0.33 or later. | To enable graceful shutdown, configure the `server.shutdown` property, as shown in the following example: Properties ``` server.shutdown=graceful ``` Yaml ``` server: shutdown: "graceful" ``` To configure the timeout period, configure the `spring.lifecycle.timeout-per-shutdown-phase` property, as shown in the following example: Properties ``` spring.lifecycle.timeout-per-shutdown-phase=20s ``` Yaml ``` spring: lifecycle: timeout-per-shutdown-phase: "20s" ``` | | | | --- | --- | | | Using graceful shutdown with your IDE may not work properly if it does not send a proper `SIGTERM` signal. See the documentation of your IDE for more details. | 4. Spring Security ------------------- If [Spring Security](https://spring.io/projects/spring-security) is on the classpath, then web applications are secured by default. Spring Boot relies on Spring Security’s content-negotiation strategy to determine whether to use `httpBasic` or `formLogin`. To add method-level security to a web application, you can also add `@EnableGlobalMethodSecurity` with your desired settings. Additional information can be found in the [Spring Security Reference Guide](https://docs.spring.io/spring-security/reference/5.7.1/servlet/authorization/method-security.html). The default `UserDetailsService` has a single user. The user name is `user`, and the password is random and is printed at WARN level when the application starts, as shown in the following example: ``` Using generated security password: 78fa095d-3f4c-48b1-ad50-e24c31d5cf35 This generated password is for development use only. Your security configuration must be updated before running your application in production. ``` | | | | --- | --- | | | If you fine-tune your logging configuration, ensure that the `org.springframework.boot.autoconfigure.security` category is set to log `WARN`-level messages. Otherwise, the default password is not printed. | You can change the username and password by providing a `spring.security.user.name` and `spring.security.user.password`. The basic features you get by default in a web application are: * A `UserDetailsService` (or `ReactiveUserDetailsService` in case of a WebFlux application) bean with in-memory store and a single user with a generated password (see [`SecurityProperties.User`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/autoconfigure/security/SecurityProperties.User.html) for the properties of the user). * Form-based login or HTTP Basic security (depending on the `Accept` header in the request) for the entire application (including actuator endpoints if actuator is on the classpath). * A `DefaultAuthenticationEventPublisher` for publishing authentication events. You can provide a different `AuthenticationEventPublisher` by adding a bean for it. ### 4.1. MVC Security The default security configuration is implemented in `SecurityAutoConfiguration` and `UserDetailsServiceAutoConfiguration`. `SecurityAutoConfiguration` imports `SpringBootWebSecurityConfiguration` for web security and `UserDetailsServiceAutoConfiguration` configures authentication, which is also relevant in non-web applications. 
To switch off the default web application security configuration completely or to combine multiple Spring Security components such as OAuth2 Client and Resource Server, add a bean of type `SecurityFilterChain` (doing so does not disable the `UserDetailsService` configuration or Actuator’s security). To also switch off the `UserDetailsService` configuration, you can add a bean of type `UserDetailsService`, `AuthenticationProvider`, or `AuthenticationManager`. Access rules can be overridden by adding a custom `SecurityFilterChain` or `WebSecurityConfigurerAdapter` bean. Spring Boot provides convenience methods that can be used to override access rules for actuator endpoints and static resources. `EndpointRequest` can be used to create a `RequestMatcher` that is based on the `management.endpoints.web.base-path` property. `PathRequest` can be used to create a `RequestMatcher` for resources in commonly used locations. ### 4.2. WebFlux Security Similar to Spring MVC applications, you can secure your WebFlux applications by adding the `spring-boot-starter-security` dependency. The default security configuration is implemented in `ReactiveSecurityAutoConfiguration` and `UserDetailsServiceAutoConfiguration`. `ReactiveSecurityAutoConfiguration` imports `WebFluxSecurityConfiguration` for web security and `UserDetailsServiceAutoConfiguration` configures authentication, which is also relevant in non-web applications. To switch off the default web application security configuration completely, you can add a bean of type `WebFilterChainProxy` (doing so does not disable the `UserDetailsService` configuration or Actuator’s security). To also switch off the `UserDetailsService` configuration, you can add a bean of type `ReactiveUserDetailsService` or `ReactiveAuthenticationManager`. Access rules and the use of multiple Spring Security components such as OAuth 2 Client and Resource Server can be configured by adding a custom `SecurityWebFilterChain` bean. Spring Boot provides convenience methods that can be used to override access rules for actuator endpoints and static resources. `EndpointRequest` can be used to create a `ServerWebExchangeMatcher` that is based on the `management.endpoints.web.base-path` property. `PathRequest` can be used to create a `ServerWebExchangeMatcher` for resources in commonly used locations. 
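If you only need to replace the auto-configured in-memory user rather than the whole security configuration, a `ReactiveUserDetailsService` bean is enough. The following is a minimal sketch; the user name, password, and role are placeholder values:

Java

```
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.core.userdetails.MapReactiveUserDetailsService;
import org.springframework.security.core.userdetails.User;
import org.springframework.security.core.userdetails.UserDetails;

@Configuration(proxyBeanMethods = false)
public class MyUserDetailsConfiguration {

    @Bean
    public MapReactiveUserDetailsService userDetailsService() {
        // Defining this bean causes the auto-configured default user to back off.
        // "{noop}" stores the password in plain text and is for illustration only.
        UserDetails user = User.withUsername("alice").password("{noop}secret").roles("USER").build();
        return new MapReactiveUserDetailsService(user);
    }

}
```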
For example, you can customize your security configuration by adding something like: Java ``` import org.springframework.boot.autoconfigure.security.reactive.PathRequest; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.web.server.ServerHttpSecurity; import org.springframework.security.web.server.SecurityWebFilterChain; @Configuration(proxyBeanMethods = false) public class MyWebFluxSecurityConfiguration { @Bean public SecurityWebFilterChain springSecurityFilterChain(ServerHttpSecurity http) { http.authorizeExchange((spec) -> { spec.matchers(PathRequest.toStaticResources().atCommonLocations()).permitAll(); spec.pathMatchers("/foo", "/bar").authenticated(); }); http.formLogin(); return http.build(); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.security.reactive.PathRequest import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.security.config.web.server.ServerHttpSecurity import org.springframework.security.web.server.SecurityWebFilterChain @Configuration(proxyBeanMethods = false) class MyWebFluxSecurityConfiguration { @Bean fun springSecurityFilterChain(http: ServerHttpSecurity): SecurityWebFilterChain { http.authorizeExchange { spec -> spec.matchers(PathRequest.toStaticResources().atCommonLocations()).permitAll() spec.pathMatchers("/foo", "/bar").authenticated() } http.formLogin() return http.build() } } ``` ### 4.3. OAuth2 [OAuth2](https://oauth.net/2/) is a widely used authorization framework that is supported by Spring. #### 4.3.1. Client If you have `spring-security-oauth2-client` on your classpath, you can take advantage of some auto-configuration to set up an OAuth2/Open ID Connect clients. This configuration makes use of the properties under `OAuth2ClientProperties`. The same properties are applicable to both servlet and reactive applications. 
You can register multiple OAuth2 clients and providers under the `spring.security.oauth2.client` prefix, as shown in the following example: Properties ``` spring.security.oauth2.client.registration.my-client-1.client-id=abcd spring.security.oauth2.client.registration.my-client-1.client-secret=password spring.security.oauth2.client.registration.my-client-1.client-name=Client for user scope spring.security.oauth2.client.registration.my-client-1.provider=my-oauth-provider spring.security.oauth2.client.registration.my-client-1.scope=user spring.security.oauth2.client.registration.my-client-1.redirect-uri=https://my-redirect-uri.com spring.security.oauth2.client.registration.my-client-1.client-authentication-method=basic spring.security.oauth2.client.registration.my-client-1.authorization-grant-type=authorization-code spring.security.oauth2.client.registration.my-client-2.client-id=abcd spring.security.oauth2.client.registration.my-client-2.client-secret=password spring.security.oauth2.client.registration.my-client-2.client-name=Client for email scope spring.security.oauth2.client.registration.my-client-2.provider=my-oauth-provider spring.security.oauth2.client.registration.my-client-2.scope=email spring.security.oauth2.client.registration.my-client-2.redirect-uri=https://my-redirect-uri.com spring.security.oauth2.client.registration.my-client-2.client-authentication-method=basic spring.security.oauth2.client.registration.my-client-2.authorization-grant-type=authorization_code spring.security.oauth2.client.provider.my-oauth-provider.authorization-uri=https://my-auth-server/oauth/authorize spring.security.oauth2.client.provider.my-oauth-provider.token-uri=https://my-auth-server/oauth/token spring.security.oauth2.client.provider.my-oauth-provider.user-info-uri=https://my-auth-server/userinfo spring.security.oauth2.client.provider.my-oauth-provider.user-info-authentication-method=header spring.security.oauth2.client.provider.my-oauth-provider.jwk-set-uri=https://my-auth-server/token_keys spring.security.oauth2.client.provider.my-oauth-provider.user-name-attribute=name ``` Yaml ``` spring: security: oauth2: client: registration: my-client-1: client-id: "abcd" client-secret: "password" client-name: "Client for user scope" provider: "my-oauth-provider" scope: "user" redirect-uri: "https://my-redirect-uri.com" client-authentication-method: "basic" authorization-grant-type: "authorization-code" my-client-2: client-id: "abcd" client-secret: "password" client-name: "Client for email scope" provider: "my-oauth-provider" scope: "email" redirect-uri: "https://my-redirect-uri.com" client-authentication-method: "basic" authorization-grant-type: "authorization_code" provider: my-oauth-provider: authorization-uri: "https://my-auth-server/oauth/authorize" token-uri: "https://my-auth-server/oauth/token" user-info-uri: "https://my-auth-server/userinfo" user-info-authentication-method: "header" jwk-set-uri: "https://my-auth-server/token_keys" user-name-attribute: "name" ``` For OpenID Connect providers that support [OpenID Connect discovery](https://openid.net/specs/openid-connect-discovery-1_0.html), the configuration can be further simplified. The provider needs to be configured with an `issuer-uri` which is the URI that the it asserts as its Issuer Identifier. For example, if the `issuer-uri` provided is "https://example.com", then an `OpenID Provider Configuration Request` will be made to "https://example.com/.well-known/openid-configuration". 
The result is expected to be an `OpenID Provider Configuration Response`. The following example shows how an OpenID Connect Provider can be configured with the `issuer-uri`: Properties ``` spring.security.oauth2.client.provider.oidc-provider.issuer-uri=https://dev-123456.oktapreview.com/oauth2/default/ ``` Yaml ``` spring: security: oauth2: client: provider: oidc-provider: issuer-uri: "https://dev-123456.oktapreview.com/oauth2/default/" ``` By default, Spring Security’s `OAuth2LoginAuthenticationFilter` only processes URLs matching `/login/oauth2/code/*`. If you want to customize the `redirect-uri` to use a different pattern, you need to provide configuration to process that custom pattern. For example, for servlet applications, you can add your own `SecurityFilterChain` that resembles the following: Java ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Configuration(proxyBeanMethods = false) public class MyOAuthClientConfiguration { @Bean public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception { http.authorizeRequests().anyRequest().authenticated(); http.oauth2Login().redirectionEndpoint().baseUri("custom-callback"); return http.build(); } } ``` Kotlin ``` import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.security.config.annotation.web.builders.HttpSecurity import org.springframework.security.web.SecurityFilterChain @Configuration(proxyBeanMethods = false) class MyOAuthClientConfiguration { @Bean fun securityFilterChain(http: HttpSecurity): SecurityFilterChain { http.authorizeRequests().anyRequest().authenticated() http.oauth2Login().redirectionEndpoint().baseUri("custom-callback") return http.build() } } ``` | | | | --- | --- | | | Spring Boot auto-configures an `InMemoryOAuth2AuthorizedClientService` which is used by Spring Security for the management of client registrations. The `InMemoryOAuth2AuthorizedClientService` has limited capabilities and we recommend using it only for development environments. For production environments, consider using a `JdbcOAuth2AuthorizedClientService` or creating your own implementation of `OAuth2AuthorizedClientService`. | ##### OAuth2 client registration for common providers For common OAuth2 and OpenID providers, including Google, Github, Facebook, and Okta, we provide a set of provider defaults (`google`, `github`, `facebook`, and `okta`, respectively). If you do not need to customize these providers, you can set the `provider` attribute to the one for which you need to infer defaults. Also, if the key for the client registration matches a default supported provider, Spring Boot infers that as well. 
In other words, the two configurations in the following example use the Google provider: Properties ``` spring.security.oauth2.client.registration.my-client.client-id=abcd spring.security.oauth2.client.registration.my-client.client-secret=password spring.security.oauth2.client.registration.my-client.provider=google spring.security.oauth2.client.registration.google.client-id=abcd spring.security.oauth2.client.registration.google.client-secret=password ``` Yaml ``` spring: security: oauth2: client: registration: my-client: client-id: "abcd" client-secret: "password" provider: "google" google: client-id: "abcd" client-secret: "password" ``` #### 4.3.2. Resource Server If you have `spring-security-oauth2-resource-server` on your classpath, Spring Boot can set up an OAuth2 Resource Server. For JWT configuration, a JWK Set URI or OIDC Issuer URI needs to be specified, as shown in the following examples: Properties ``` spring.security.oauth2.resourceserver.jwt.jwk-set-uri=https://example.com/oauth2/default/v1/keys ``` Yaml ``` spring: security: oauth2: resourceserver: jwt: jwk-set-uri: "https://example.com/oauth2/default/v1/keys" ``` Properties ``` spring.security.oauth2.resourceserver.jwt.issuer-uri=https://dev-123456.oktapreview.com/oauth2/default/ ``` Yaml ``` spring: security: oauth2: resourceserver: jwt: issuer-uri: "https://dev-123456.oktapreview.com/oauth2/default/" ``` | | | | --- | --- | | | If the authorization server does not support a JWK Set URI, you can configure the resource server with the Public Key used for verifying the signature of the JWT. This can be done using the `spring.security.oauth2.resourceserver.jwt.public-key-location` property, where the value needs to point to a file containing the public key in the PEM-encoded x509 format. | The same properties are applicable for both servlet and reactive applications. Alternatively, you can define your own `JwtDecoder` bean for servlet applications or a `ReactiveJwtDecoder` for reactive applications. In cases where opaque tokens are used instead of JWTs, you can configure the following properties to validate tokens through introspection: Properties ``` spring.security.oauth2.resourceserver.opaquetoken.introspection-uri=https://example.com/check-token spring.security.oauth2.resourceserver.opaquetoken.client-id=my-client-id spring.security.oauth2.resourceserver.opaquetoken.client-secret=my-client-secret ``` Yaml ``` spring: security: oauth2: resourceserver: opaquetoken: introspection-uri: "https://example.com/check-token" client-id: "my-client-id" client-secret: "my-client-secret" ``` Again, the same properties are applicable for both servlet and reactive applications. Alternatively, you can define your own `OpaqueTokenIntrospector` bean for servlet applications or a `ReactiveOpaqueTokenIntrospector` for reactive applications. #### 4.3.3. Authorization Server Currently, Spring Security does not provide support for implementing an OAuth 2.0 Authorization Server. However, this functionality is available from the [Spring Security OAuth](https://spring.io/projects/spring-security-oauth) project, which will eventually be superseded by Spring Security completely. Until then, you can use the `spring-security-oauth2-autoconfigure` module to easily set up an OAuth 2.0 authorization server; see its [documentation](https://docs.spring.io/spring-security-oauth2-boot/) for instructions. ### 4.4. SAML 2.0 #### 4.4.1. 
Relying Party If you have `spring-security-saml2-service-provider` on your classpath, you can take advantage of some auto-configuration to set up a SAML 2.0 Relying Party. This configuration makes use of the properties under `Saml2RelyingPartyProperties`. A relying party registration represents a paired configuration between an Identity Provider, IDP, and a Service Provider, SP. You can register multiple relying parties under the `spring.security.saml2.relyingparty` prefix, as shown in the following example: Properties ``` spring.security.saml2.relyingparty.registration.my-relying-party1.signing.credentials[0].private-key-location=path-to-private-key spring.security.saml2.relyingparty.registration.my-relying-party1.signing.credentials[0].certificate-location=path-to-certificate spring.security.saml2.relyingparty.registration.my-relying-party1.decryption.credentials[0].private-key-location=path-to-private-key spring.security.saml2.relyingparty.registration.my-relying-party1.decryption.credentials[0].certificate-location=path-to-certificate spring.security.saml2.relyingparty.registration.my-relying-party1.singlelogout.url=https://myapp/logout/saml2/slo spring.security.saml2.relyingparty.registration.my-relying-party1.singlelogout.reponse-url=https://remoteidp2.slo.url spring.security.saml2.relyingparty.registration.my-relying-party1.singlelogout.binding=POST spring.security.saml2.relyingparty.registration.my-relying-party1.assertingparty.verification.credentials[0].certificate-location=path-to-verification-cert spring.security.saml2.relyingparty.registration.my-relying-party1.assertingparty.entity-id=remote-idp-entity-id1 spring.security.saml2.relyingparty.registration.my-relying-party1.assertingparty.sso-url=https://remoteidp1.sso.url spring.security.saml2.relyingparty.registration.my-relying-party2.signing.credentials[0].private-key-location=path-to-private-key spring.security.saml2.relyingparty.registration.my-relying-party2.signing.credentials[0].certificate-location=path-to-certificate spring.security.saml2.relyingparty.registration.my-relying-party2.decryption.credentials[0].private-key-location=path-to-private-key spring.security.saml2.relyingparty.registration.my-relying-party2.decryption.credentials[0].certificate-location=path-to-certificate spring.security.saml2.relyingparty.registration.my-relying-party2.assertingparty.verification.credentials[0].certificate-location=path-to-other-verification-cert spring.security.saml2.relyingparty.registration.my-relying-party2.assertingparty.entity-id=remote-idp-entity-id2 spring.security.saml2.relyingparty.registration.my-relying-party2.assertingparty.sso-url=https://remoteidp2.sso.url spring.security.saml2.relyingparty.registration.my-relying-party2.assertingparty.singlelogout.url=https://remoteidp2.slo.url spring.security.saml2.relyingparty.registration.my-relying-party2.assertingparty.singlelogout.reponse-url=https://myapp/logout/saml2/slo spring.security.saml2.relyingparty.registration.my-relying-party2.assertingparty.singlelogout.binding=POST ``` Yaml ``` spring: security: saml2: relyingparty: registration: my-relying-party1: signing: credentials: - private-key-location: "path-to-private-key" certificate-location: "path-to-certificate" decryption: credentials: - private-key-location: "path-to-private-key" certificate-location: "path-to-certificate" singlelogout: url: "https://myapp/logout/saml2/slo" reponse-url: "https://remoteidp2.slo.url" binding: "POST" assertingparty: verification: credentials: - certificate-location: 
"path-to-verification-cert" entity-id: "remote-idp-entity-id1" sso-url: "https://remoteidp1.sso.url" my-relying-party2: signing: credentials: - private-key-location: "path-to-private-key" certificate-location: "path-to-certificate" decryption: credentials: - private-key-location: "path-to-private-key" certificate-location: "path-to-certificate" assertingparty: verification: credentials: - certificate-location: "path-to-other-verification-cert" entity-id: "remote-idp-entity-id2" sso-url: "https://remoteidp2.sso.url" singlelogout: url: "https://remoteidp2.slo.url" reponse-url: "https://myapp/logout/saml2/slo" binding: "POST" ``` For SAML2 logout, by default, Spring Security’s `Saml2LogoutRequestFilter` and `Saml2LogoutResponseFilter` only process URLs matching `/logout/saml2/slo`. If you want to customize the `url` to which AP-initiated logout requests get sent to or the `response-url` to which an AP sends logout responses to, to use a different pattern, you need to provide configuration to process that custom pattern. For example, for servlet applications, you can add your own `SecurityFilterChain` that resembles the following: ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Configuration(proxyBeanMethods = false) public class MySamlRelyingPartyConfiguration { @Bean public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception { http.authorizeRequests().anyRequest().authenticated(); http.saml2Login(); http.saml2Logout((saml2) -> saml2.logoutRequest((request) -> request.logoutUrl("/SLOService.saml2")) .logoutResponse((response) -> response.logoutUrl("/SLOService.saml2"))); return http.build(); } } ``` 5. Spring Session ------------------ Spring Boot provides [Spring Session](https://spring.io/projects/spring-session) auto-configuration for a wide range of data stores. When building a servlet web application, the following stores can be auto-configured: * JDBC * Redis * Hazelcast * MongoDB Additionally, [Spring Boot for Apache Geode](https://github.com/spring-projects/spring-boot-data-geode) provides [auto-configuration for using Apache Geode as a session store](https://docs.spring.io/spring-boot-data-geode-build/1.7.x/reference/html5/#geode-session). The servlet auto-configuration replaces the need to use `@Enable*HttpSession`. When building a reactive web application, the following stores can be auto-configured: * Redis * MongoDB The reactive auto-configuration replaces the need to use `@Enable*WebSession`. If a single Spring Session module is present on the classpath, Spring Boot uses that store implementation automatically. If you have more than one implementation, you must choose the [`StoreType`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/session/StoreType.java) that you wish to use to store the sessions. For instance, to use JDBC as the back-end store, you can configure your application as follows: Properties ``` spring.session.store-type=jdbc ``` Yaml ``` spring: session: store-type: "jdbc" ``` | | | | --- | --- | | | You can disable Spring Session by setting the `store-type` to `none`. | Each store has specific additional settings. 
For instance, it is possible to customize the name of the table for the JDBC store, as shown in the following example: Properties ``` spring.session.jdbc.table-name=SESSIONS ``` Yaml ``` spring: session: jdbc: table-name: "SESSIONS" ``` For setting the timeout of the session you can use the `spring.session.timeout` property. If that property is not set with a servlet web application, the auto-configuration falls back to the value of `server.servlet.session.timeout`. You can take control over Spring Session’s configuration using `@Enable*HttpSession` (servlet) or `@Enable*WebSession` (reactive). This will cause the auto-configuration to back off. Spring Session can then be configured using the annotation’s attributes rather than the previously described configuration properties. 6. Spring for GraphQL ---------------------- If you want to build GraphQL applications, you can take advantage of Spring Boot’s auto-configuration for [Spring for GraphQL](https://spring.io/projects/spring-graphql). The Spring for GraphQL project is based on [GraphQL Java](https://github.com/graphql-java/graphql-java). You’ll need the `spring-boot-starter-graphql` starter at a minimum. Because GraphQL is transport-agnostic, you’ll also need to have one or more additional starters in your application to expose your GraphQL API over the web: | Starter | Transport | Implementation | | --- | --- | --- | | `spring-boot-starter-web` | HTTP | Spring MVC | | `spring-boot-starter-websocket` | WebSocket | WebSocket for Servlet apps | | `spring-boot-starter-webflux` | HTTP, WebSocket | Spring WebFlux | | `spring-boot-starter-rsocket` | TCP, WebSocket | Spring WebFlux on Reactor Netty | ### 6.1. GraphQL Schema A Spring GraphQL application requires a defined schema at startup. By default, you can write ".graphqls" or ".gqls" schema files under `src/main/resources/graphql/**` and Spring Boot will pick them up automatically. You can customize the locations with `spring.graphql.schema.locations` and the file extensions with `spring.graphql.schema.file-extensions`. In the following sections, we’ll consider this sample GraphQL schema, defining two types and two queries: ``` type Query { greeting(name: String! = "Spring"): String! project(slug: ID!): Project } """ A Project in the Spring portfolio """ type Project { """ Unique string id used in URLs """ slug: ID! """ Project name """ name: String! """ URL of the git repository """ repositoryUrl: String! """ Current support status """ status: ProjectStatus! } enum ProjectStatus { """ Actively supported by the Spring team """ ACTIVE """ Supported by the community """ COMMUNITY """ Prototype, not officially supported yet """ INCUBATING """ Project being retired, in maintenance mode """ ATTIC """ End-Of-Lifed """ EOL } ``` | | | | --- | --- | | | By default, [field introspection](https://spec.graphql.org/draft/#sec-Introspection) will be allowed on the schema as it is required for tools such as GraphiQL. If you wish to not expose information about the schema, you can disable introspection by setting `spring.graphql.schema.introspection.enabled` to `false`. | ### 6.2. GraphQL RuntimeWiring The GraphQL Java `RuntimeWiring.Builder` can be used to register custom scalar types, directives, type resolvers, `DataFetcher`s, and more. You can declare `RuntimeWiringConfigurer` beans in your Spring config to get access to the `RuntimeWiring.Builder`. 
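The following is a minimal sketch of such a bean, registering a `DataFetcher` for the `greeting` query from the sample schema shown earlier (the configuration class and bean method names are illustrative):

Java

```
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.graphql.execution.RuntimeWiringConfigurer;

@Configuration(proxyBeanMethods = false)
public class MyGraphQlWiringConfiguration {

    @Bean
    public RuntimeWiringConfigurer greetingWiringConfigurer() {
        // Registers a DataFetcher for the "greeting" field of the "Query" type.
        return (builder) -> builder.type("Query",
                (wiring) -> wiring.dataFetcher("greeting", (env) -> "Hello, " + env.getArgument("name") + "!"));
    }

}
```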
Spring Boot detects such beans and adds them to the [GraphQlSource builder](https://docs.spring.io/spring-graphql/docs/1.0.0/reference/html/#execution-graphqlsource). Typically, however, applications will not implement `DataFetcher` directly and will instead create [annotated controllers](https://docs.spring.io/spring-graphql/docs/1.0.0/reference/html/#controllers). Spring Boot will automatically detect `@Controller` classes with annotated handler methods and register those as `DataFetcher`s. Here’s a sample implementation for our greeting query with a `@Controller` class: Java ``` import org.springframework.graphql.data.method.annotation.Argument; import org.springframework.graphql.data.method.annotation.QueryMapping; import org.springframework.stereotype.Controller; @Controller public class GreetingController { @QueryMapping public String greeting(@Argument String name) { return "Hello, " + name + "!"; } } ``` Kotlin ``` import org.springframework.graphql.data.method.annotation.Argument import org.springframework.graphql.data.method.annotation.QueryMapping import org.springframework.stereotype.Controller @Controller class GreetingController { @QueryMapping fun greeting(@Argument name: String): String { return "Hello, $name!" } } ``` ### 6.3. Querydsl and QueryByExample Repositories Support Spring Data offers support for both Querydsl and QueryByExample repositories. Spring GraphQL can [configure Querydsl and QueryByExample repositories as `DataFetcher`](https://docs.spring.io/spring-graphql/docs/1.0.0/reference/html/#data). Spring Data repositories annotated with `@GraphQlRepository` and extending one of: * `QuerydslPredicateExecutor` * `ReactiveQuerydslPredicateExecutor` * `QueryByExampleExecutor` * `ReactiveQueryByExampleExecutor` are detected by Spring Boot and considered as candidates for `DataFetcher` for matching top-level queries. ### 6.4. Transports #### 6.4.1. HTTP and WebSocket The GraphQL HTTP endpoint is at HTTP POST "/graphql" by default. The path can be customized with `spring.graphql.path`. The GraphQL WebSocket endpoint is off by default. To enable it: * For a Servlet application, add the WebSocket starter `spring-boot-starter-websocket` * For a WebFlux application, no additional dependency is required * For both, the `spring.graphql.websocket.path` application property must be set Spring GraphQL provides a [Web Interception](https://docs.spring.io/spring-graphql/docs/1.0.0/reference/html/#web-interception) model. This is quite useful for retrieving information from an HTTP request header and setting it in the GraphQL context, or for fetching information from the same context and writing it to a response header. With Spring Boot, you can declare a `WebInterceptor` bean to have it registered with the web transport. [Spring MVC](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc-cors) and [Spring WebFlux](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html#webflux-cors) support CORS (Cross-Origin Resource Sharing) requests. CORS is a critical part of the web config for GraphQL applications that are accessed from browsers using different domains.
Spring Boot supports many configuration properties under the `spring.graphql.cors.*` namespace; here’s a short configuration sample: Properties ``` spring.graphql.cors.allowed-origins=https://example.org spring.graphql.cors.allowed-methods=GET,POST spring.graphql.cors.max-age=1800s ``` Yaml ``` spring: graphql: cors: allowed-origins: "https://example.org" allowed-methods: GET,POST max-age: 1800s ``` #### 6.4.2. RSocket RSocket is also supported as a transport, on top of WebSocket or TCP. Once the [RSocket server is configured](messaging#messaging.rsocket.server-auto-configuration), we can configure our GraphQL handler on a particular route using `spring.graphql.rsocket.mapping`. For example, configuring that mapping as `"graphql"` means we can use that as a route when sending requests with the `RSocketGraphQlClient`. Spring Boot auto-configures an `RSocketGraphQlClient.Builder<?>` bean that you can inject into your components: Java ``` @Component public class RSocketGraphQlClientExample { private final RSocketGraphQlClient graphQlClient; public RSocketGraphQlClientExample(RSocketGraphQlClient.Builder<?> builder) { this.graphQlClient = builder.tcp("example.spring.io", 8181).route("graphql").build(); } ``` Kotlin ``` @Component class RSocketGraphQlClientExample(private val builder: RSocketGraphQlClient.Builder<*>) { ``` And then send a request: Java ``` Mono<Book> book = this.graphQlClient.document("{ bookById(id: \"book-1\"){ id name pageCount author } }") .retrieve("bookById").toEntity(Book.class); ``` Kotlin ``` val book = graphQlClient.document( """ { bookById(id: "book-1"){ id name pageCount author } } """ ) .retrieve("bookById").toEntity(Book::class.java) ``` ### 6.5. Exception Handling Spring GraphQL enables applications to register one or more Spring `DataFetcherExceptionResolver` components that are invoked sequentially. The exception must be resolved to a list of `graphql.GraphQLError` objects; see the [Spring GraphQL exception handling documentation](https://docs.spring.io/spring-graphql/docs/1.0.0/reference/html/#execution-exceptions). Spring Boot will automatically detect `DataFetcherExceptionResolver` beans and register them with the `GraphQlSource.Builder`. ### 6.6. GraphiQL and Schema printer Spring GraphQL offers infrastructure for helping developers when consuming or developing a GraphQL API. Spring GraphQL ships with a default [GraphiQL](https://github.com/graphql/graphiql) page that is exposed at `"/graphiql"`. This page is disabled by default and can be turned on with the `spring.graphql.graphiql.enabled` property. Many applications exposing such a page will prefer a custom build. A default implementation is very useful during development, which is why it is exposed automatically with [`spring-boot-devtools`](using#using.devtools). You can also choose to expose the GraphQL schema in text format at `/graphql/schema` when the `spring.graphql.schema.printer.enabled` property is enabled. 7. Spring HATEOAS ------------------ If you develop a RESTful API that makes use of hypermedia, Spring Boot provides auto-configuration for Spring HATEOAS that works well with most applications. The auto-configuration replaces the need to use `@EnableHypermediaSupport` and registers a number of beans to ease building hypermedia-based applications, including a `LinkDiscoverers` (for client side support) and an `ObjectMapper` configured to correctly marshal responses into the desired representation.
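As a brief sketch of what a hypermedia-enabled controller might look like with this auto-configuration in place (the `Customer` type and the controller below are hypothetical and not part of Spring Boot):

Java

```
import org.springframework.hateoas.EntityModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import static org.springframework.hateoas.server.mvc.WebMvcLinkBuilder.linkTo;
import static org.springframework.hateoas.server.mvc.WebMvcLinkBuilder.methodOn;

@RestController
public class CustomerController {

    @GetMapping("/customers/{id}")
    public EntityModel<Customer> customer(@PathVariable long id) {
        // Look up the customer (a fixed value here for brevity) and add a self link;
        // the auto-configured ObjectMapper renders the links in the response.
        Customer customer = new Customer(id, "Alice");
        return EntityModel.of(customer, linkTo(methodOn(CustomerController.class).customer(id)).withSelfRel());
    }

    public static class Customer {

        private final long id;

        private final String name;

        Customer(long id, String name) {
            this.id = id;
            this.name = name;
        }

        public long getId() {
            return this.id;
        }

        public String getName() {
            return this.name;
        }

    }

}
```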
The `ObjectMapper` is customized by setting the various `spring.jackson.*` properties or, if one exists, by a `Jackson2ObjectMapperBuilder` bean. You can take control of Spring HATEOAS’s configuration by using `@EnableHypermediaSupport`. Note that doing so disables the `ObjectMapper` customization described earlier. | | | | --- | --- | | | `spring-boot-starter-hateoas` is specific to Spring MVC and should not be combined with Spring WebFlux. In order to use Spring HATEOAS with Spring WebFlux, you can add a direct dependency on `org.springframework.hateoas:spring-hateoas` along with `spring-boot-starter-webflux`. | 8. What to Read Next --------------------- You should now have a good understanding of how to develop web applications with Spring Boot. The next few sections describe how Spring Boot integrates with various [data technologies](data#data), [messaging systems](messaging#messaging), and other IO capabilities. You can pick any of these based on your application’s needs.
spring_boot Data Data ==== Spring Boot integrates with a number of data technologies, both SQL and NoSQL. 1. SQL Databases ----------------- The [Spring Framework](https://spring.io/projects/spring-framework) provides extensive support for working with SQL databases, from direct JDBC access using `JdbcTemplate` to complete “object relational mapping” technologies such as Hibernate. [Spring Data](https://spring.io/projects/spring-data) provides an additional level of functionality: creating `Repository` implementations directly from interfaces and using conventions to generate queries from your method names. ### 1.1. Configure a DataSource Java’s `javax.sql.DataSource` interface provides a standard method of working with database connections. Traditionally, a 'DataSource' uses a `URL` along with some credentials to establish a database connection. | | | | --- | --- | | | See [the “How-to” section](howto#howto.data-access.configure-custom-datasource) for more advanced examples, typically to take full control over the configuration of the DataSource. | #### 1.1.1. Embedded Database Support It is often convenient to develop applications by using an in-memory embedded database. Obviously, in-memory databases do not provide persistent storage. You need to populate your database when your application starts and be prepared to throw away data when your application ends. | | | | --- | --- | | | The “How-to” section includes a [section on how to initialize a database](howto#howto.data-initialization). | Spring Boot can auto-configure embedded [H2](https://www.h2database.com), [HSQL](http://hsqldb.org/), and [Derby](https://db.apache.org/derby/) databases. You need not provide any connection URLs. You need only include a build dependency to the embedded database that you want to use. If there are multiple embedded databases on the classpath, set the `spring.datasource.embedded-database-connection` configuration property to control which one is used. Setting the property to `none` disables auto-configuration of an embedded database. | | | | --- | --- | | | If you are using this feature in your tests, you may notice that the same database is reused by your whole test suite regardless of the number of application contexts that you use. If you want to make sure that each context has a separate embedded database, you should set `spring.datasource.generate-unique-name` to `true`. | For example, the typical POM dependencies would be as follows: ``` <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-data-jpa</artifactId> </dependency> <dependency> <groupId>org.hsqldb</groupId> <artifactId>hsqldb</artifactId> <scope>runtime</scope> </dependency> ``` | | | | --- | --- | | | You need a dependency on `spring-jdbc` for an embedded database to be auto-configured. In this example, it is pulled in transitively through `spring-boot-starter-data-jpa`. | | | | | --- | --- | | | If, for whatever reason, you do configure the connection URL for an embedded database, take care to ensure that the database’s automatic shutdown is disabled. If you use H2, you should use `DB_CLOSE_ON_EXIT=FALSE` to do so. If you use HSQLDB, you should ensure that `shutdown=true` is not used. Disabling the database’s automatic shutdown lets Spring Boot control when the database is closed, thereby ensuring that it happens once access to the database is no longer needed. | #### 1.1.2. 
Connection to a Production Database Production database connections can also be auto-configured by using a pooling `DataSource`. #### 1.1.3. DataSource Configuration DataSource configuration is controlled by external configuration properties in `spring.datasource.*`. For example, you might declare the following section in `application.properties`: Properties ``` spring.datasource.url=jdbc:mysql://localhost/test spring.datasource.username=dbuser spring.datasource.password=dbpass ``` Yaml ``` spring: datasource: url: "jdbc:mysql://localhost/test" username: "dbuser" password: "dbpass" ``` | | | | --- | --- | | | You should at least specify the URL by setting the `spring.datasource.url` property. Otherwise, Spring Boot tries to auto-configure an embedded database. | | | | | --- | --- | | | Spring Boot can deduce the JDBC driver class for most databases from the URL. If you need to specify a specific class, you can use the `spring.datasource.driver-class-name` property. | | | | | --- | --- | | | For a pooling `DataSource` to be created, we need to be able to verify that a valid `Driver` class is available, so we check for that before doing anything. In other words, if you set `spring.datasource.driver-class-name=com.mysql.jdbc.Driver`, then that class has to be loadable. | See [`DataSourceProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/DataSourceProperties.java) for more of the supported options. These are the standard options that work regardless of [the actual implementation](features#data.sql.datasource.connection-pool). It is also possible to fine-tune implementation-specific settings by using their respective prefix (`spring.datasource.hikari.*`, `spring.datasource.tomcat.*`, `spring.datasource.dbcp2.*`, and `spring.datasource.oracleucp.*`). See the documentation of the connection pool implementation you are using for more details. For instance, if you use the [Tomcat connection pool](https://tomcat.apache.org/tomcat-9.0-doc/jdbc-pool.html#Common_Attributes), you could customize many additional settings, as shown in the following example: Properties ``` spring.datasource.tomcat.max-wait=10000 spring.datasource.tomcat.max-active=50 spring.datasource.tomcat.test-on-borrow=true ``` Yaml ``` spring: datasource: tomcat: max-wait: 10000 max-active: 50 test-on-borrow: true ``` This will set the pool to wait 10000ms before throwing an exception if no connection is available, limit the maximum number of connections to 50 and validate the connection before borrowing it from the pool. #### 1.1.4. Supported Connection Pools Spring Boot uses the following algorithm for choosing a specific implementation: 1. We prefer [HikariCP](https://github.com/brettwooldridge/HikariCP) for its performance and concurrency. If HikariCP is available, we always choose it. 2. Otherwise, if the Tomcat pooling `DataSource` is available, we use it. 3. Otherwise, if [Commons DBCP2](https://commons.apache.org/proper/commons-dbcp/) is available, we use it. 4. If none of HikariCP, Tomcat, and DBCP2 are available and if Oracle UCP is available, we use it. | | | | --- | --- | | | If you use the `spring-boot-starter-jdbc` or `spring-boot-starter-data-jpa` “starters”, you automatically get a dependency to `HikariCP`. | You can bypass that algorithm completely and specify the connection pool to use by setting the `spring.datasource.type` property. 
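For example, to pick the Tomcat pooling `DataSource` explicitly, regardless of which implementations are on the classpath, you could set the property to the pool’s `DataSource` class (a sketch; any of the supported implementations listed below can be named instead):

Properties

```
spring.datasource.type=org.apache.tomcat.jdbc.pool.DataSource
```

Yaml

```
spring:
  datasource:
    type: "org.apache.tomcat.jdbc.pool.DataSource"
```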
This is especially important if you run your application in a Tomcat container, as `tomcat-jdbc` is provided by default. Additional connection pools can always be configured manually, using `DataSourceBuilder`. If you define your own `DataSource` bean, auto-configuration does not occur. The following connection pools are supported by `DataSourceBuilder`: * HikariCP * Tomcat pooling `Datasource` * Commons DBCP2 * Oracle UCP & `OracleDataSource` * Spring Framework’s `SimpleDriverDataSource` * H2 `JdbcDataSource` * PostgreSQL `PGSimpleDataSource` #### 1.1.5. Connection to a JNDI DataSource If you deploy your Spring Boot application to an Application Server, you might want to configure and manage your DataSource by using your Application Server’s built-in features and access it by using JNDI. The `spring.datasource.jndi-name` property can be used as an alternative to the `spring.datasource.url`, `spring.datasource.username`, and `spring.datasource.password` properties to access the `DataSource` from a specific JNDI location. For example, the following section in `application.properties` shows how you can access a JBoss AS defined `DataSource`: Properties ``` spring.datasource.jndi-name=java:jboss/datasources/customers ``` Yaml ``` spring: datasource: jndi-name: "java:jboss/datasources/customers" ``` ### 1.2. Using JdbcTemplate Spring’s `JdbcTemplate` and `NamedParameterJdbcTemplate` classes are auto-configured, and you can `@Autowire` them directly into your own beans, as shown in the following example: Java ``` import org.springframework.jdbc.core.JdbcTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final JdbcTemplate jdbcTemplate; public MyBean(JdbcTemplate jdbcTemplate) { this.jdbcTemplate = jdbcTemplate; } public void doSomething() { this.jdbcTemplate ... } } ``` Kotlin ``` import org.springframework.jdbc.core.JdbcTemplate import org.springframework.stereotype.Component @Component class MyBean(private val jdbcTemplate: JdbcTemplate) { fun doSomething() { jdbcTemplate.execute("delete from customer") } } ``` You can customize some properties of the template by using the `spring.jdbc.template.*` properties, as shown in the following example: Properties ``` spring.jdbc.template.max-rows=500 ``` Yaml ``` spring: jdbc: template: max-rows: 500 ``` | | | | --- | --- | | | The `NamedParameterJdbcTemplate` reuses the same `JdbcTemplate` instance behind the scenes. If more than one `JdbcTemplate` is defined and no primary candidate exists, the `NamedParameterJdbcTemplate` is not auto-configured. | ### 1.3. JPA and Spring Data JPA The Java Persistence API is a standard technology that lets you “map” objects to relational databases. The `spring-boot-starter-data-jpa` POM provides a quick way to get started. It provides the following key dependencies: * Hibernate: One of the most popular JPA implementations. * Spring Data JPA: Helps you to implement JPA-based repositories. * Spring ORM: Core ORM support from the Spring Framework. | | | | --- | --- | | | We do not go into too many details of JPA or [Spring Data](https://spring.io/projects/spring-data) here. You can follow the [“Accessing Data with JPA”](https://spring.io/guides/gs/accessing-data-jpa/) guide from [spring.io](https://spring.io) and read the [Spring Data JPA](https://spring.io/projects/spring-data-jpa) and [Hibernate](https://hibernate.org/orm/documentation/) reference documentation. | #### 1.3.1. Entity Classes Traditionally, JPA “Entity” classes are specified in a `persistence.xml` file. 
With Spring Boot, this file is not necessary and “Entity Scanning” is used instead. By default, all packages below your main configuration class (the one annotated with `@EnableAutoConfiguration` or `@SpringBootApplication`) are searched. Any classes annotated with `@Entity`, `@Embeddable`, or `@MappedSuperclass` are considered. A typical entity class resembles the following example: Java ``` import java.io.Serializable; import javax.persistence.Column; import javax.persistence.Entity; import javax.persistence.GeneratedValue; import javax.persistence.Id; @Entity public class City implements Serializable { @Id @GeneratedValue private Long id; @Column(nullable = false) private String name; @Column(nullable = false) private String state; // ... additional members, often include @OneToMany mappings protected City() { // no-args constructor required by JPA spec // this one is protected since it should not be used directly } public City(String name, String state) { this.name = name; this.state = state; } public String getName() { return this.name; } public String getState() { return this.state; } // ... etc } ``` Kotlin ``` import java.io.Serializable import javax.persistence.Column import javax.persistence.Entity import javax.persistence.GeneratedValue import javax.persistence.Id @Entity class City : Serializable { @Id @GeneratedValue private val id: Long? = null @Column(nullable = false) var name: String? = null private set // ... etc @Column(nullable = false) var state: String? = null private set // ... additional members, often include @OneToMany mappings protected constructor() { // no-args constructor required by JPA spec // this one is protected since it should not be used directly } constructor(name: String?, state: String?) { this.name = name this.state = state } } ``` | | | | --- | --- | | | You can customize entity scanning locations by using the `@EntityScan` annotation. See the “[howto.html](howto#howto.data-access.separate-entity-definitions-from-spring-configuration)” how-to. | #### 1.3.2. Spring Data JPA Repositories [Spring Data JPA](https://spring.io/projects/spring-data-jpa) repositories are interfaces that you can define to access data. JPA queries are created automatically from your method names. For example, a `CityRepository` interface might declare a `findAllByState(String state)` method to find all the cities in a given state. For more complex queries, you can annotate your method with Spring Data’s [`Query`](https://docs.spring.io/spring-data/jpa/docs/2.7.0/api/org/springframework/data/jpa/repository/Query.html) annotation. Spring Data repositories usually extend from the [`Repository`](https://docs.spring.io/spring-data/commons/docs/2.7.0/api/org/springframework/data/repository/Repository.html) or [`CrudRepository`](https://docs.spring.io/spring-data/commons/docs/2.7.0/api/org/springframework/data/repository/CrudRepository.html) interfaces. If you use auto-configuration, repositories are searched from the package containing your main configuration class (the one annotated with `@EnableAutoConfiguration` or `@SpringBootApplication`) down. 
The following example shows a typical Spring Data repository interface definition: Java ``` import org.springframework.boot.docs.data.sql.jpaandspringdata.entityclasses.City; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; import org.springframework.data.repository.Repository; public interface CityRepository extends Repository<City, Long> { Page<City> findAll(Pageable pageable); City findByNameAndStateAllIgnoringCase(String name, String state); } ``` Kotlin ``` import org.springframework.boot.docs.data.sql.jpaandspringdata.entityclasses.City import org.springframework.data.domain.Page import org.springframework.data.domain.Pageable import org.springframework.data.repository.Repository interface CityRepository : Repository<City?, Long?> { fun findAll(pageable: Pageable?): Page<City?>? fun findByNameAndStateAllIgnoringCase(name: String?, state: String?): City? } ``` Spring Data JPA repositories support three different modes of bootstrapping: default, deferred, and lazy. To enable deferred or lazy bootstrapping, set the `spring.data.jpa.repositories.bootstrap-mode` property to `deferred` or `lazy` respectively. When using deferred or lazy bootstrapping, the auto-configured `EntityManagerFactoryBuilder` will use the context’s `AsyncTaskExecutor`, if any, as the bootstrap executor. If more than one exists, the one named `applicationTaskExecutor` will be used. | | | | --- | --- | | | When using deferred or lazy bootstrapping, make sure to defer any access to the JPA infrastructure after the application context bootstrap phase. You can use `SmartInitializingSingleton` to invoke any initialization that requires the JPA infrastructure. For JPA components (such as converters) that are created as Spring beans, use `ObjectProvider` to delay the resolution of dependencies, if any. | | | | | --- | --- | | | We have barely scratched the surface of Spring Data JPA. For complete details, see the [Spring Data JPA reference documentation](https://docs.spring.io/spring-data/jpa/docs/2.7.0/reference/html). | #### 1.3.3. Spring Data Envers Repositories If [Spring Data Envers](https://spring.io/projects/spring-data-envers) is available, JPA repositories are auto-configured to support typical Envers queries. To use Spring Data Envers, make sure your repository extends from `RevisionRepository` as show in the following example: Java ``` import org.springframework.boot.docs.data.sql.jpaandspringdata.entityclasses.Country; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; import org.springframework.data.repository.Repository; import org.springframework.data.repository.history.RevisionRepository; public interface CountryRepository extends RevisionRepository<Country, Long, Integer>, Repository<Country, Long> { Page<Country> findAll(Pageable pageable); } ``` Kotlin ``` import org.springframework.boot.docs.data.sql.jpaandspringdata.entityclasses.Country import org.springframework.data.domain.Page import org.springframework.data.domain.Pageable import org.springframework.data.repository.Repository import org.springframework.data.repository.history.RevisionRepository interface CountryRepository : RevisionRepository<Country?, Long?, Int>, Repository<Country?, Long?> { fun findAll(pageable: Pageable?): Page<Country?>? } ``` | | | | --- | --- | | | For more details, check the [Spring Data Envers reference documentation](https://docs.spring.io/spring-data/envers/docs/2.7.0/reference/html/). | #### 1.3.4. 
Creating and Dropping JPA Databases By default, JPA databases are automatically created **only** if you use an embedded database (H2, HSQL, or Derby). You can explicitly configure JPA settings by using `spring.jpa.*` properties. For example, to create and drop tables you can add the following line to your `application.properties`: Properties ``` spring.jpa.hibernate.ddl-auto=create-drop ``` Yaml ``` spring: jpa: hibernate.ddl-auto: "create-drop" ``` | | | | --- | --- | | | Hibernate’s own internal property name for this (if you happen to remember it better) is `hibernate.hbm2ddl.auto`. You can set it, along with other Hibernate native properties, by using `spring.jpa.properties.*` (the prefix is stripped before adding them to the entity manager). The following line shows an example of setting JPA properties for Hibernate: | Properties ``` spring.jpa.properties.hibernate[globally_quoted_identifiers]=true ``` Yaml ``` spring: jpa: properties: hibernate: "globally_quoted_identifiers": "true" ``` The line in the preceding example passes a value of `true` for the `hibernate.globally_quoted_identifiers` property to the Hibernate entity manager. By default, the DDL execution (or validation) is deferred until the `ApplicationContext` has started. There is also a `spring.jpa.generate-ddl` flag, but it is not used if Hibernate auto-configuration is active, because the `ddl-auto` settings are more fine-grained. #### 1.3.5. Open EntityManager in View If you are running a web application, Spring Boot by default registers [`OpenEntityManagerInViewInterceptor`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/orm/jpa/support/OpenEntityManagerInViewInterceptor.html) to apply the “Open EntityManager in View” pattern, to allow for lazy loading in web views. If you do not want this behavior, you should set `spring.jpa.open-in-view` to `false` in your `application.properties`. ### 1.4. Spring Data JDBC Spring Data includes repository support for JDBC and will automatically generate SQL for the methods on `CrudRepository`. For more advanced queries, a `@Query` annotation is provided. Spring Boot will auto-configure Spring Data’s JDBC repositories when the necessary dependencies are on the classpath. They can be added to your project with a single dependency on `spring-boot-starter-data-jdbc`. If necessary, you can take control of Spring Data JDBC’s configuration by adding the `@EnableJdbcRepositories` annotation or a `JdbcConfiguration` subclass to your application. | | | | --- | --- | | | For complete details of Spring Data JDBC, see the [reference documentation](https://docs.spring.io/spring-data/jdbc/docs/2.4.0/reference/html/). | ### 1.5. Using H2’s Web Console The [H2 database](https://www.h2database.com) provides a [browser-based console](https://www.h2database.com/html/quickstart.html#h2_console) that Spring Boot can auto-configure for you. The console is auto-configured when the following conditions are met: * You are developing a servlet-based web application. * `com.h2database:h2` is on the classpath. * You are using [Spring Boot’s developer tools](using#using.devtools). | | | | --- | --- | | | If you are not using Spring Boot’s developer tools but would still like to make use of H2’s console, you can configure the `spring.h2.console.enabled` property with a value of `true`. | | | | | --- | --- | | | The H2 console is only intended for use during development, so you should take care to ensure that `spring.h2.console.enabled` is not set to `true` in production. 
| #### 1.5.1. Changing the H2 Console’s Path By default, the console is available at `/h2-console`. You can customize the console’s path by using the `spring.h2.console.path` property. #### 1.5.2. Accessing the H2 Console in a Secured Application H2 Console uses frames and, as it is intended for development only, does not implement CSRF protection measures. If your application uses Spring Security, you need to configure it to * disable CSRF protection for requests against the console, * set the header `X-Frame-Options` to `SAMEORIGIN` on responses from the console. More information on [CSRF](https://docs.spring.io/spring-security/reference/5.7.1/features/exploits/csrf.html) and the header [X-Frame-Options](https://docs.spring.io/spring-security/reference/5.7.1/features/exploits/headers.html#headers-frame-options) can be found in the Spring Security Reference Guide. In simple setups, a `SecurityFilterChain` like the following can be used: Java ``` import org.springframework.boot.autoconfigure.security.servlet.PathRequest; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Profile; import org.springframework.core.Ordered; import org.springframework.core.annotation.Order; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Profile("dev") @Configuration(proxyBeanMethods = false) public class DevProfileSecurityConfiguration { @Bean @Order(Ordered.HIGHEST_PRECEDENCE) SecurityFilterChain h2ConsoleSecurityFilterChain(HttpSecurity http) throws Exception { return http.requestMatcher(PathRequest.toH2Console()) // ... configuration for authorization .csrf().disable() .headers().frameOptions().sameOrigin().and() .build(); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.security.servlet.PathRequest import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.context.annotation.Profile import org.springframework.core.Ordered import org.springframework.core.annotation.Order import org.springframework.security.config.annotation.web.builders.HttpSecurity import org.springframework.security.web.SecurityFilterChain @Profile("dev") @Configuration(proxyBeanMethods = false) class DevProfileSecurityConfiguration { @Bean @Order(Ordered.HIGHEST_PRECEDENCE) fun h2ConsoleSecurityFilterChain(http: HttpSecurity): SecurityFilterChain { return http.requestMatcher(PathRequest.toH2Console()) .csrf().disable() .headers().frameOptions().sameOrigin().and() .build() } } ``` | | | | --- | --- | | | The H2 console is only intended for use during development. In production, disabling CSRF protection or allowing frames for a website may create severe security risks. | | | | | --- | --- | | | `PathRequest.toH2Console()` returns the correct request matcher also when the console’s path has been customized. | ### 1.6. Using jOOQ jOOQ Object Oriented Querying ([jOOQ](https://www.jooq.org/)) is a popular product from [Data Geekery](https://www.datageekery.com/) which generates Java code from your database and lets you build type-safe SQL queries through its fluent API. Both the commercial and open source editions can be used with Spring Boot. #### 1.6.1. Code Generation In order to use jOOQ type-safe queries, you need to generate Java classes from your database schema.
You can follow the instructions in the [jOOQ user manual](https://www.jooq.org/doc/3.14.15/manual-single-page/#jooq-in-7-steps-step3). If you use the `jooq-codegen-maven` plugin and you also use the `spring-boot-starter-parent` “parent POM”, you can safely omit the plugin’s `<version>` tag. You can also use Spring Boot-defined version variables (such as `h2.version`) to declare the plugin’s database dependency. The following listing shows an example: ``` <plugin> <groupId>org.jooq</groupId> <artifactId>jooq-codegen-maven</artifactId> <executions> ... </executions> <dependencies> <dependency> <groupId>com.h2database</groupId> <artifactId>h2</artifactId> <version>${h2.version}</version> </dependency> </dependencies> <configuration> <jdbc> <driver>org.h2.Driver</driver> <url>jdbc:h2:~/yourdatabase</url> </jdbc> <generator> ... </generator> </configuration> </plugin> ``` #### 1.6.2. Using DSLContext The fluent API offered by jOOQ is initiated through the `org.jooq.DSLContext` interface. Spring Boot auto-configures a `DSLContext` as a Spring Bean and connects it to your application `DataSource`. To use the `DSLContext`, you can inject it, as shown in the following example: Java ``` import java.util.GregorianCalendar; import java.util.List; import org.jooq.DSLContext; import org.springframework.stereotype.Component; import static org.springframework.boot.docs.data.sql.jooq.dslcontext.Tables.AUTHOR; @Component public class MyBean { private final DSLContext create; public MyBean(DSLContext dslContext) { this.create = dslContext; } } ``` Kotlin ``` import org.jooq.DSLContext import org.springframework.stereotype.Component import java.util.GregorianCalendar @Component class MyBean(private val create: DSLContext) { } ``` | | | | --- | --- | | | The jOOQ manual tends to use a variable named `create` to hold the `DSLContext`. | You can then use the `DSLContext` to construct your queries, as shown in the following example: Java ``` public List<GregorianCalendar> authorsBornAfter1980() { return this.create.selectFrom(AUTHOR) .where(AUTHOR.DATE_OF_BIRTH.greaterThan(new GregorianCalendar(1980, 0, 1))) .fetch(AUTHOR.DATE_OF_BIRTH); } ``` Kotlin ``` fun authorsBornAfter1980(): List<GregorianCalendar> { return create.selectFrom<Tables.TAuthorRecord>(Tables.AUTHOR) .where(Tables.AUTHOR?.DATE_OF_BIRTH?.greaterThan(GregorianCalendar(1980, 0, 1))) .fetch(Tables.AUTHOR?.DATE_OF_BIRTH) } ``` #### 1.6.3. jOOQ SQL Dialect Unless the `spring.jooq.sql-dialect` property has been configured, Spring Boot determines the SQL dialect to use for your datasource. If Spring Boot could not detect the dialect, it uses `DEFAULT`. | | | | --- | --- | | | Spring Boot can only auto-configure dialects supported by the open source version of jOOQ. | #### 1.6.4. Customizing jOOQ More advanced customizations can be achieved by defining your own `DefaultConfigurationCustomizer` bean that will be invoked prior to creating the `org.jooq.Configuration` `@Bean`. This takes precedence over anything that is applied by the auto-configuration. You can also create your own `org.jooq.Configuration` `@Bean` if you want to take complete control of the jOOQ configuration. ### 1.7. Using R2DBC The Reactive Relational Database Connectivity ([R2DBC](https://r2dbc.io)) project brings reactive programming APIs to relational databases. R2DBC’s `io.r2dbc.spi.Connection` provides a standard method of working with non-blocking database connections. Connections are provided by using a `ConnectionFactory`, similar to a `DataSource` with JDBC.
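Before looking at configuration, it may help to see the R2DBC SPI itself in action. The following sketch is an illustration only and is not taken from the reference documentation: the `greeting` table, the `message` column, and the bean name are assumptions, and in most applications you would rather use the higher-level `DatabaseClient` described later. It shows how a reactive pipeline could borrow a connection from the auto-configured `ConnectionFactory` and release it when the stream completes: Java

```java
import io.r2dbc.spi.Connection;
import io.r2dbc.spi.ConnectionFactory;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

import org.springframework.stereotype.Component;

@Component
public class MyConnectionFactoryBean {

	private final ConnectionFactory connectionFactory;

	public MyConnectionFactoryBean(ConnectionFactory connectionFactory) {
		this.connectionFactory = connectionFactory;
	}

	// Opens a connection, runs a statement, and closes the connection when the stream terminates
	public Flux<String> allGreetings() {
		return Flux.usingWhen(Mono.from(this.connectionFactory.create()),
				(connection) -> Flux.from(connection.createStatement("SELECT message FROM greeting").execute())
						.flatMap((result) -> result.map((row, metadata) -> row.get("message", String.class))),
				Connection::close);
	}

}
```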
`ConnectionFactory` configuration is controlled by external configuration properties in `spring.r2dbc.*`. For example, you might declare the following section in `application.properties`: Properties ``` spring.r2dbc.url=r2dbc:postgresql://localhost/test spring.r2dbc.username=dbuser spring.r2dbc.password=dbpass ``` Yaml ``` spring: r2dbc: url: "r2dbc:postgresql://localhost/test" username: "dbuser" password: "dbpass" ``` | | | | --- | --- | | | You do not need to specify a driver class name, since Spring Boot obtains the driver from R2DBC’s Connection Factory discovery. | | | | | --- | --- | | | At least the url should be provided. Information specified in the URL takes precedence over individual properties, that is `name`, `username`, `password` and pooling options. | | | | | --- | --- | | | The “How-to” section includes a [section on how to initialize a database](howto#howto.data-initialization.using-basic-sql-scripts). | To customize the connections created by a `ConnectionFactory`, that is, set specific parameters that you do not want (or cannot) configure in your central database configuration, you can use a `ConnectionFactoryOptionsBuilderCustomizer` `@Bean`. The following example shows how to manually override the database port while the rest of the options is taken from the application configuration: Java ``` import io.r2dbc.spi.ConnectionFactoryOptions; import org.springframework.boot.autoconfigure.r2dbc.ConnectionFactoryOptionsBuilderCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyR2dbcConfiguration { @Bean public ConnectionFactoryOptionsBuilderCustomizer connectionFactoryPortCustomizer() { return (builder) -> builder.option(ConnectionFactoryOptions.PORT, 5432); } } ``` Kotlin ``` import io.r2dbc.spi.ConnectionFactoryOptions import org.springframework.boot.autoconfigure.r2dbc.ConnectionFactoryOptionsBuilderCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyR2dbcConfiguration { @Bean fun connectionFactoryPortCustomizer(): ConnectionFactoryOptionsBuilderCustomizer { return ConnectionFactoryOptionsBuilderCustomizer { builder -> builder.option(ConnectionFactoryOptions.PORT, 5432) } } } ``` The following examples show how to set some PostgreSQL connection options: Java ``` import java.util.HashMap; import java.util.Map; import io.r2dbc.postgresql.PostgresqlConnectionFactoryProvider; import org.springframework.boot.autoconfigure.r2dbc.ConnectionFactoryOptionsBuilderCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyPostgresR2dbcConfiguration { @Bean public ConnectionFactoryOptionsBuilderCustomizer postgresCustomizer() { Map<String, String> options = new HashMap<>(); options.put("lock_timeout", "30s"); options.put("statement_timeout", "60s"); return (builder) -> builder.option(PostgresqlConnectionFactoryProvider.OPTIONS, options); } } ``` Kotlin ``` import io.r2dbc.postgresql.PostgresqlConnectionFactoryProvider import org.springframework.boot.autoconfigure.r2dbc.ConnectionFactoryOptionsBuilderCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyPostgresR2dbcConfiguration { @Bean fun
postgresCustomizer(): ConnectionFactoryOptionsBuilderCustomizer { val options: MutableMap<String, String> = HashMap() options["lock_timeout"] = "30s" options["statement_timeout"] = "60s" return ConnectionFactoryOptionsBuilderCustomizer { builder -> builder.option(PostgresqlConnectionFactoryProvider.OPTIONS, options) } } } ``` When a `ConnectionFactory` bean is available, the regular JDBC `DataSource` auto-configuration backs off. If you want to retain the JDBC `DataSource` auto-configuration, and are comfortable with the risk of using the blocking JDBC API in a reactive application, add `@Import(DataSourceAutoConfiguration.class)` on a `@Configuration` class in your application to re-enable it. #### 1.7.1. Embedded Database Support Similarly to [the JDBC support](features#data.sql.datasource.embedded), Spring Boot can automatically configure an embedded database for reactive usage. You need not provide any connection URLs. You need only include a build dependency to the embedded database that you want to use, as shown in the following example: ``` <dependency> <groupId>io.r2dbc</groupId> <artifactId>r2dbc-h2</artifactId> <scope>runtime</scope> </dependency> ``` | | | | --- | --- | | | If you are using this feature in your tests, you may notice that the same database is reused by your whole test suite regardless of the number of application contexts that you use. If you want to make sure that each context has a separate embedded database, you should set `spring.r2dbc.generate-unique-name` to `true`. | #### 1.7.2. Using DatabaseClient A `DatabaseClient` bean is auto-configured, and you can `@Autowire` it directly into your own beans, as shown in the following example: Java ``` import java.util.Map; import reactor.core.publisher.Flux; import org.springframework.r2dbc.core.DatabaseClient; import org.springframework.stereotype.Component; @Component public class MyBean { private final DatabaseClient databaseClient; public MyBean(DatabaseClient databaseClient) { this.databaseClient = databaseClient; } // ... public Flux<Map<String, Object>> someMethod() { return this.databaseClient.sql("select * from user").fetch().all(); } } ``` Kotlin ``` import org.springframework.r2dbc.core.DatabaseClient import org.springframework.stereotype.Component import reactor.core.publisher.Flux @Component class MyBean(private val databaseClient: DatabaseClient) { // ... fun someMethod(): Flux<Map<String, Any>> { return databaseClient.sql("select * from user").fetch().all() } } ``` #### 1.7.3. Spring Data R2DBC Repositories [Spring Data R2DBC](https://spring.io/projects/spring-data-r2dbc) repositories are interfaces that you can define to access data. Queries are created automatically from your method names. For example, a `CityRepository` interface might declare a `findAllByState(String state)` method to find all the cities in a given state. For more complex queries, you can annotate your method with Spring Data’s [`Query`](https://docs.spring.io/spring-data/r2dbc/docs/1.5.0/api/org/springframework/data/r2dbc/repository/Query.html) annotation. Spring Data repositories usually extend from the [`Repository`](https://docs.spring.io/spring-data/commons/docs/2.7.0/api/org/springframework/data/repository/Repository.html) or [`CrudRepository`](https://docs.spring.io/spring-data/commons/docs/2.7.0/api/org/springframework/data/repository/CrudRepository.html) interfaces.
If you use auto-configuration, repositories are searched from the package containing your main configuration class (the one annotated with `@EnableAutoConfiguration` or `@SpringBootApplication`) down. The following example shows a typical Spring Data repository interface definition: Java ``` import reactor.core.publisher.Mono; import org.springframework.data.repository.Repository; public interface CityRepository extends Repository<City, Long> { Mono<City> findByNameAndStateAllIgnoringCase(String name, String state); } ``` Kotlin ``` import org.springframework.data.repository.Repository import reactor.core.publisher.Mono interface CityRepository : Repository<City?, Long?> { fun findByNameAndStateAllIgnoringCase(name: String?, state: String?): Mono<City?>? } ``` | | | | --- | --- | | | We have barely scratched the surface of Spring Data R2DBC. For complete details, see the [Spring Data R2DBC reference documentation](https://docs.spring.io/spring-data/r2dbc/docs/1.5.0/reference/html/). | 2. Working with NoSQL Technologies ----------------------------------- Spring Data provides additional projects that help you access a variety of NoSQL technologies, including: * [MongoDB](https://spring.io/projects/spring-data-mongodb) * [Neo4J](https://spring.io/projects/spring-data-neo4j) * [Elasticsearch](https://spring.io/projects/spring-data-elasticsearch) * [Redis](https://spring.io/projects/spring-data-redis) * [GemFire](https://spring.io/projects/spring-data-gemfire) or [Geode](https://spring.io/projects/spring-data-geode) * [Cassandra](https://spring.io/projects/spring-data-cassandra) * [Couchbase](https://spring.io/projects/spring-data-couchbase) * [LDAP](https://spring.io/projects/spring-data-ldap) Spring Boot provides auto-configuration for Redis, MongoDB, Neo4j, Solr, Elasticsearch, Cassandra, Couchbase, LDAP and InfluxDB. Additionally, [Spring Boot for Apache Geode](https://github.com/spring-projects/spring-boot-data-geode) provides [auto-configuration for Apache Geode](https://docs.spring.io/spring-boot-data-geode-build/1.7.x/reference/html5/#geode-repositories). You can make use of the other projects, but you must configure them yourself. See the appropriate reference documentation at [spring.io/projects/spring-data](https://spring.io/projects/spring-data). ### 2.1. Redis [Redis](https://redis.io/) is a cache, message broker, and richly-featured key-value store. Spring Boot offers basic auto-configuration for the [Lettuce](https://github.com/lettuce-io/lettuce-core/) and [Jedis](https://github.com/xetorthio/jedis/) client libraries and the abstractions on top of them provided by [Spring Data Redis](https://github.com/spring-projects/spring-data-redis). There is a `spring-boot-starter-data-redis` “Starter” for collecting the dependencies in a convenient way. By default, it uses [Lettuce](https://github.com/lettuce-io/lettuce-core/). That starter handles both traditional and reactive applications. | | | | --- | --- | | | We also provide a `spring-boot-starter-data-redis-reactive` “Starter” for consistency with the other stores with reactive support. | #### 2.1.1. Connecting to Redis You can inject an auto-configured `RedisConnectionFactory`, `StringRedisTemplate`, or vanilla `RedisTemplate` instance as you would any other Spring Bean. By default, the instance tries to connect to a Redis server at `localhost:6379`. 
The following listing shows an example of such a bean: Java ``` import org.springframework.data.redis.core.StringRedisTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final StringRedisTemplate template; public MyBean(StringRedisTemplate template) { this.template = template; } // ... public Boolean someMethod() { return this.template.hasKey("spring"); } } ``` Kotlin ``` import org.springframework.data.redis.core.StringRedisTemplate import org.springframework.stereotype.Component @Component class MyBean(private val template: StringRedisTemplate) { // ... fun someMethod(): Boolean { return template.hasKey("spring") } } ``` | | | | --- | --- | | | You can also register an arbitrary number of beans that implement `LettuceClientConfigurationBuilderCustomizer` for more advanced customizations. `ClientResources` can also be customized using `ClientResourcesBuilderCustomizer`. If you use Jedis, `JedisClientConfigurationBuilderCustomizer` is also available. Alternatively, you can register a bean of type `RedisStandaloneConfiguration`, `RedisSentinelConfiguration`, or `RedisClusterConfiguration` to take full control over the configuration. | If you add your own `@Bean` of any of the auto-configured types, it replaces the default (except in the case of `RedisTemplate`, when the exclusion is based on the bean name, `redisTemplate`, not its type). By default, a pooled connection factory is auto-configured if `commons-pool2` is on the classpath. ### 2.2. MongoDB [MongoDB](https://www.mongodb.com/) is an open-source NoSQL document database that uses a JSON-like schema instead of traditional table-based relational data. Spring Boot offers several conveniences for working with MongoDB, including the `spring-boot-starter-data-mongodb` and `spring-boot-starter-data-mongodb-reactive` “Starters”. #### 2.2.1. Connecting to a MongoDB Database To access MongoDB databases, you can inject an auto-configured `org.springframework.data.mongodb.MongoDatabaseFactory`. By default, the instance tries to connect to a MongoDB server at `mongodb://localhost/test`. The following example shows how to connect to a MongoDB database: Java ``` import com.mongodb.client.MongoCollection; import com.mongodb.client.MongoDatabase; import org.bson.Document; import org.springframework.data.mongodb.MongoDatabaseFactory; import org.springframework.stereotype.Component; @Component public class MyBean { private final MongoDatabaseFactory mongo; public MyBean(MongoDatabaseFactory mongo) { this.mongo = mongo; } // ... public MongoCollection<Document> someMethod() { MongoDatabase db = this.mongo.getMongoDatabase(); return db.getCollection("users"); } } ``` Kotlin ``` import com.mongodb.client.MongoCollection import org.bson.Document import org.springframework.data.mongodb.MongoDatabaseFactory import org.springframework.stereotype.Component @Component class MyBean(private val mongo: MongoDatabaseFactory) { // ... fun someMethod(): MongoCollection<Document> { val db = mongo.mongoDatabase return db.getCollection("users") } } ``` If you have defined your own `MongoClient`, it will be used to auto-configure a suitable `MongoDatabaseFactory`. The auto-configured `MongoClient` is created using a `MongoClientSettings` bean. If you have defined your own `MongoClientSettings`, it will be used without modification and the `spring.data.mongodb` properties will be ignored. Otherwise a `MongoClientSettings` will be auto-configured and will have the `spring.data.mongodb` properties applied to it. 
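As a purely illustrative sketch (the connection string and class name are made up for this example, not taken from the reference documentation), declaring your own `MongoClientSettings` bean might look as follows; when such a bean is present, the `spring.data.mongodb` properties are ignored, as described above: Java

```java
import com.mongodb.ConnectionString;
import com.mongodb.MongoClientSettings;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration(proxyBeanMethods = false)
public class MyMongoClientSettingsConfiguration {

	@Bean
	public MongoClientSettings mongoClientSettings() {
		// Because this bean exists, the spring.data.mongodb.* properties are not applied to it
		return MongoClientSettings.builder()
				.applyConnectionString(new ConnectionString("mongodb://mongo.example.com:27017/test"))
				.build();
	}

}
```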
In either case, you can declare one or more `MongoClientSettingsBuilderCustomizer` beans to fine-tune the `MongoClientSettings` configuration. Each will be called in order with the `MongoClientSettings.Builder` that is used to build the `MongoClientSettings`. You can set the `spring.data.mongodb.uri` property to change the URL and configure additional settings such as the *replica set*, as shown in the following example: Properties ``` spring.data.mongodb.uri=mongodb://user:secret@mongo1.example.com:12345,mongo2.example.com:23456/test ``` Yaml ``` spring: data: mongodb: uri: "mongodb://user:secret@mongo1.example.com:12345,mongo2.example.com:23456/test" ``` Alternatively, you can specify connection details using discrete properties. For example, you might declare the following settings in your `application.properties`: Properties ``` spring.data.mongodb.host=mongoserver.example.com spring.data.mongodb.port=27017 spring.data.mongodb.database=test spring.data.mongodb.username=user spring.data.mongodb.password=secret ``` Yaml ``` spring: data: mongodb: host: "mongoserver.example.com" port: 27017 database: "test" username: "user" password: "secret" ``` | | | | --- | --- | | | If `spring.data.mongodb.port` is not specified, the default of `27017` is used. You could delete this line from the example shown earlier. | | | | | --- | --- | | | If you do not use Spring Data MongoDB, you can inject a `MongoClient` bean instead of using `MongoDatabaseFactory`. If you want to take complete control of establishing the MongoDB connection, you can also declare your own `MongoDatabaseFactory` or `MongoClient` bean. | | | | | --- | --- | | | If you are using the reactive driver, Netty is required for SSL. The auto-configuration configures this factory automatically if Netty is available and the factory to use has not been customized already. | #### 2.2.2. MongoTemplate [Spring Data MongoDB](https://spring.io/projects/spring-data-mongodb) provides a [`MongoTemplate`](https://docs.spring.io/spring-data/mongodb/docs/3.4.0/api/org/springframework/data/mongodb/core/MongoTemplate.html) class that is very similar in its design to Spring’s `JdbcTemplate`. As with `JdbcTemplate`, Spring Boot auto-configures a bean for you to inject the template, as follows: Java ``` import com.mongodb.client.MongoCollection; import org.bson.Document; import org.springframework.data.mongodb.core.MongoTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final MongoTemplate mongoTemplate; public MyBean(MongoTemplate mongoTemplate) { this.mongoTemplate = mongoTemplate; } // ... public MongoCollection<Document> someMethod() { return this.mongoTemplate.getCollection("users"); } } ``` Kotlin ``` import com.mongodb.client.MongoCollection import org.bson.Document import org.springframework.data.mongodb.core.MongoTemplate import org.springframework.stereotype.Component @Component class MyBean(private val mongoTemplate: MongoTemplate) { // ... fun someMethod(): MongoCollection<Document> { return mongoTemplate.getCollection("users") } } ``` See the [`MongoOperations` Javadoc](https://docs.spring.io/spring-data/mongodb/docs/3.4.0/api/org/springframework/data/mongodb/core/MongoOperations.html) for complete details. #### 2.2.3. Spring Data MongoDB Repositories Spring Data includes repository support for MongoDB. As with the JPA repositories discussed earlier, the basic principle is that queries are constructed automatically, based on method names.
In fact, both Spring Data JPA and Spring Data MongoDB share the same common infrastructure. You could take the JPA example from earlier and, assuming that `City` is now a MongoDB data class rather than a JPA `@Entity`, it works in the same way, as shown in the following example: Java ``` import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; import org.springframework.data.repository.Repository; public interface CityRepository extends Repository<City, Long> { Page<City> findAll(Pageable pageable); City findByNameAndStateAllIgnoringCase(String name, String state); } ``` Kotlin ``` import org.springframework.data.domain.Page import org.springframework.data.domain.Pageable import org.springframework.data.repository.Repository interface CityRepository : Repository<City?, Long?> { fun findAll(pageable: Pageable?): Page<City?>? fun findByNameAndStateAllIgnoringCase(name: String?, state: String?): City? } ``` | | | | --- | --- | | | You can customize document scanning locations by using the `@EntityScan` annotation. | | | | | --- | --- | | | For complete details of Spring Data MongoDB, including its rich object mapping technologies, see its [reference documentation](https://spring.io/projects/spring-data-mongodb). | #### 2.2.4. Embedded Mongo Spring Boot offers auto-configuration for [Embedded Mongo](https://github.com/flapdoodle-oss/de.flapdoodle.embed.mongo). To use it in your Spring Boot application, add a dependency on `de.flapdoodle.embed:de.flapdoodle.embed.mongo` and set the `spring.mongodb.embedded.version` property to match the version of MongoDB that your application will use in production. | | | | --- | --- | | | The default download configuration allows access to most of the versions listed in [Embedded Mongo’s `Version` class](https://github.com/flapdoodle-oss/de.flapdoodle.embed.mongo/blob/de.flapdoodle.embed.mongo-3.4.5/src/main/java/de/flapdoodle/embed/mongo/distribution/Version.java) as well as some others. Configuring an inaccessible version will result in an error when attempting to download the server. Such an error can be corrected by defining an appropriately configured `DownloadConfigBuilderCustomizer` bean. | The port that Mongo listens on can be configured by setting the `spring.data.mongodb.port` property. To use a randomly allocated free port, use a value of 0. The `MongoClient` created by `MongoAutoConfiguration` is automatically configured to use the randomly allocated port. | | | | --- | --- | | | If you do not configure a custom port, the embedded support uses a random port (rather than 27017) by default. | If you have SLF4J on the classpath, the output produced by Mongo is automatically routed to a logger named `org.springframework.boot.autoconfigure.mongo.embedded.EmbeddedMongo`. You can declare your own `IMongodConfig` and `IRuntimeConfig` beans to take control of the Mongo instance’s configuration and logging routing. The download configuration can be customized by declaring a `DownloadConfigBuilderCustomizer` bean. ### 2.3. Neo4j [Neo4j](https://neo4j.com/) is an open-source NoSQL graph database that uses a rich data model of nodes connected by first class relationships, which is better suited for connected big data than traditional RDBMS approaches. Spring Boot offers several conveniences for working with Neo4j, including the `spring-boot-starter-data-neo4j` “Starter”. #### 2.3.1. Connecting to a Neo4j Database To access a Neo4j server, you can inject an auto-configured `org.neo4j.driver.Driver`. 
By default, the instance tries to connect to a Neo4j server at `localhost:7687` using the Bolt protocol. The following example shows how to inject a Neo4j `Driver` that gives you access, amongst other things, to a `Session`: Java ``` import org.neo4j.driver.Driver; import org.neo4j.driver.Session; import org.neo4j.driver.Values; import org.springframework.stereotype.Component; @Component public class MyBean { private final Driver driver; public MyBean(Driver driver) { this.driver = driver; } // ... public String someMethod(String message) { try (Session session = this.driver.session()) { return session.writeTransaction((transaction) -> transaction .run("CREATE (a:Greeting) SET a.message = $message RETURN a.message + ', from node ' + id(a)", Values.parameters("message", message)) .single().get(0).asString()); } } } ``` Kotlin ``` import org.neo4j.driver.Driver import org.neo4j.driver.Transaction import org.neo4j.driver.Values import org.springframework.stereotype.Component @Component class MyBean(private val driver: Driver) { // ... fun someMethod(message: String?): String { driver.session().use { session -> return@someMethod session.writeTransaction { transaction: Transaction -> transaction.run( "CREATE (a:Greeting) SET a.message = \$message RETURN a.message + ', from node ' + id(a)", Values.parameters("message", message) ).single()[0].asString() } } } } ``` You can configure various aspects of the driver using `spring.neo4j.*` properties. The following example shows how to configure the uri and credentials to use: Properties ``` spring.neo4j.uri=bolt://my-server:7687 spring.neo4j.authentication.username=neo4j spring.neo4j.authentication.password=secret ``` Yaml ``` spring: neo4j: uri: "bolt://my-server:7687" authentication: username: "neo4j" password: "secret" ``` The auto-configured `Driver` is created using `ConfigBuilder`. To fine-tune its configuration, declare one or more `ConfigBuilderCustomizer` beans. Each will be called in order with the `ConfigBuilder` that is used to build the `Driver`. #### 2.3.2. Spring Data Neo4j Repositories Spring Data includes repository support for Neo4j. For complete details of Spring Data Neo4j, see the [reference documentation](https://docs.spring.io/spring-data/neo4j/docs/6.3.0/reference/html/). Spring Data Neo4j shares the common infrastructure with Spring Data JPA as many other Spring Data modules do. You could take the JPA example from earlier and define `City` as Spring Data Neo4j `@Node` rather than JPA `@Entity` and the repository abstraction works in the same way, as shown in the following example: Java ``` import java.util.Optional; import org.springframework.data.neo4j.repository.Neo4jRepository; public interface CityRepository extends Neo4jRepository<City, Long> { Optional<City> findOneByNameAndState(String name, String state); } ``` Kotlin ``` import org.springframework.data.neo4j.repository.Neo4jRepository import java.util.Optional interface CityRepository : Neo4jRepository<City?, Long?> { fun findOneByNameAndState(name: String?, state: String?): Optional<City?>? } ``` The `spring-boot-starter-data-neo4j` “Starter” enables the repository support as well as transaction management. Spring Boot supports both classic and reactive Neo4j repositories, using the `Neo4jTemplate` or `ReactiveNeo4jTemplate` beans. When Project Reactor is available on the classpath, the reactive style is also auto-configured. 
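A reactive repository could look like the following sketch; it is an assumption modelled on the classic example above rather than a listing from the reference documentation: Java

```java
import reactor.core.publisher.Mono;

import org.springframework.data.neo4j.repository.ReactiveNeo4jRepository;

public interface ReactiveCityRepository extends ReactiveNeo4jRepository<City, Long> {

	// Derived query, resolved reactively from the method name
	Mono<City> findOneByNameAndState(String name, String state);

}
```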
You can customize the locations to look for repositories and entities by using `@EnableNeo4jRepositories` and `@EntityScan` respectively on a `@Configuration`-bean. | | | | --- | --- | | | In an application using the reactive style, a `ReactiveTransactionManager` is not auto-configured. To enable transaction management, the following bean must be defined in your configuration: Java ``` import org.neo4j.driver.Driver; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.data.neo4j.core.ReactiveDatabaseSelectionProvider; import org.springframework.data.neo4j.core.transaction.ReactiveNeo4jTransactionManager; @Configuration(proxyBeanMethods = false) public class MyNeo4jConfiguration { @Bean public ReactiveNeo4jTransactionManager reactiveTransactionManager(Driver driver, ReactiveDatabaseSelectionProvider databaseNameProvider) { return new ReactiveNeo4jTransactionManager(driver, databaseNameProvider); } } ``` Kotlin ``` import org.neo4j.driver.Driver import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.data.neo4j.core.ReactiveDatabaseSelectionProvider import org.springframework.data.neo4j.core.transaction.ReactiveNeo4jTransactionManager @Configuration(proxyBeanMethods = false) class MyNeo4jConfiguration { @Bean fun reactiveTransactionManager(driver: Driver, databaseNameProvider: ReactiveDatabaseSelectionProvider): ReactiveNeo4jTransactionManager { return ReactiveNeo4jTransactionManager(driver, databaseNameProvider) } } ``` | ### 2.4. Solr [Apache Solr](https://lucene.apache.org/solr/) is a search engine. Spring Boot offers basic auto-configuration for the Solr 5 client library. #### 2.4.1. Connecting to Solr You can inject an auto-configured `SolrClient` instance as you would any other Spring bean. By default, the instance tries to connect to a server at `[localhost:8983/solr](http://localhost:8983/solr)`. The following example shows how to inject a Solr bean: Java ``` import java.io.IOException; import org.apache.solr.client.solrj.SolrClient; import org.apache.solr.client.solrj.SolrServerException; import org.apache.solr.client.solrj.response.SolrPingResponse; import org.springframework.stereotype.Component; @Component public class MyBean { private final SolrClient solr; public MyBean(SolrClient solr) { this.solr = solr; } // ... public SolrPingResponse someMethod() throws SolrServerException, IOException { return this.solr.ping("users"); } } ``` Kotlin ``` import org.apache.solr.client.solrj.SolrClient import org.apache.solr.client.solrj.response.SolrPingResponse import org.springframework.stereotype.Component @Component class MyBean(private val solr: SolrClient) { // ... fun someMethod(): SolrPingResponse { return solr.ping("users") } } ``` If you add your own `@Bean` of type `SolrClient`, it replaces the default. ### 2.5. Elasticsearch [Elasticsearch](https://www.elastic.co/products/elasticsearch) is an open source, distributed, RESTful search and analytics engine. Spring Boot offers basic auto-configuration for Elasticsearch clients. Spring Boot supports several clients: * The official Java "Low Level" and "High Level" REST clients * The `ReactiveElasticsearchClient` provided by Spring Data Elasticsearch Spring Boot provides a dedicated “Starter”, `spring-boot-starter-data-elasticsearch`. #### 2.5.1. 
Connecting to Elasticsearch using REST clients Elasticsearch ships [two different REST clients](https://www.elastic.co/guide/en/elasticsearch/client/java-rest/current/index.html) that you can use to query a cluster: the low-level client from the `org.elasticsearch.client:elasticsearch-rest-client` module and the high-level client from the `org.elasticsearch.client:elasticsearch-rest-high-level-client` module. Additionally, Spring Boot provides support for a reactive client, based on Spring Framework’s `WebClient`, from the `org.springframework.data:spring-data-elasticsearch` module. By default, the clients will target `[localhost:9200](http://localhost:9200)`. You can use `spring.elasticsearch.*` properties to further tune how the clients are configured, as shown in the following example: Properties ``` spring.elasticsearch.uris=https://search.example.com:9200 spring.elasticsearch.socket-timeout=10s spring.elasticsearch.username=user spring.elasticsearch.password=secret ``` Yaml ``` spring: elasticsearch: uris: "https://search.example.com:9200" socket-timeout: "10s" username: "user" password: "secret" ``` ##### Connecting to Elasticsearch using RestClient If you have `elasticsearch-rest-client` on the classpath, Spring Boot will auto-configure and register a `RestClient` bean. If you have `elasticsearch-rest-high-level-client` on the classpath, a `RestHighLevelClient` bean will be auto-configured as well. Following Elasticsearch’s deprecation of `RestHighLevelClient`, its auto-configuration is deprecated and will be removed in a future release. In addition to the properties described previously, to fine-tune the `RestClient` and `RestHighLevelClient`, you can register an arbitrary number of beans that implement `RestClientBuilderCustomizer` for more advanced customizations. To take full control over the clients' configuration, define a `RestClientBuilder` bean. Additionally, if `elasticsearch-rest-client-sniffer` is on the classpath, a `Sniffer` is auto-configured to automatically discover nodes from a running Elasticsearch cluster and set them on the `RestClient` bean. You can further tune how `Sniffer` is configured, as shown in the following example: Properties ``` spring.elasticsearch.restclient.sniffer.interval=10m spring.elasticsearch.restclient.sniffer.delay-after-failure=30s ``` Yaml ``` spring: elasticsearch: restclient: sniffer: interval: "10m" delay-after-failure: "30s" ``` ##### Connecting to Elasticsearch using ReactiveElasticsearchClient [Spring Data Elasticsearch](https://spring.io/projects/spring-data-elasticsearch) ships `ReactiveElasticsearchClient` for querying Elasticsearch instances in a reactive fashion. It is built on top of WebFlux’s `WebClient`, so both `spring-boot-starter-data-elasticsearch` and `spring-boot-starter-webflux` dependencies are useful to enable this support. By default, Spring Boot will auto-configure and register a `ReactiveElasticsearchClient`. In addition to the properties described previously, the `spring.elasticsearch.webclient.*` properties can be used to configure reactive-specific settings, as shown in the following example: Properties ``` spring.elasticsearch.webclient.max-in-memory-size=1MB ``` Yaml ``` spring: elasticsearch: webclient: max-in-memory-size: "1MB" ``` If the `spring.elasticsearch.` and `spring.elasticsearch.webclient.` configuration properties are not enough and you’d like to fully control the client configuration, you can register a custom `ClientConfiguration` bean. #### 2.5.2.
Connecting to Elasticsearch by Using Spring Data To connect to Elasticsearch, a `RestHighLevelClient` bean must be defined, auto-configured by Spring Boot or manually provided by the application (see previous sections). With this configuration in place, an `ElasticsearchRestTemplate` can be injected like any other Spring bean, as shown in the following example: Java ``` import org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final ElasticsearchRestTemplate template; public MyBean(ElasticsearchRestTemplate template) { this.template = template; } // ... public boolean someMethod(String id) { return this.template.exists(id, User.class); } } ``` Kotlin ``` import org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate import org.springframework.stereotype.Component @Component class MyBean(private val template: ElasticsearchRestTemplate) { // ... fun someMethod(id: String): Boolean { return template.exists(id, User::class.java) } } ``` In the presence of `spring-data-elasticsearch` and the required dependencies for using a `WebClient` (typically `spring-boot-starter-webflux`), Spring Boot can also auto-configure a [ReactiveElasticsearchClient](features#data.nosql.elasticsearch.connecting-using-rest.webclient) and a `ReactiveElasticsearchTemplate` as beans. They are the reactive equivalent of the other REST clients. #### 2.5.3. Spring Data Elasticsearch Repositories Spring Data includes repository support for Elasticsearch. As with the JPA repositories discussed earlier, the basic principle is that queries are constructed for you automatically based on method names. In fact, both Spring Data JPA and Spring Data Elasticsearch share the same common infrastructure. You could take the JPA example from earlier and, assuming that `City` is now an Elasticsearch `@Document` class rather than a JPA `@Entity`, it works in the same way. | | | | --- | --- | | | For complete details of Spring Data Elasticsearch, see the [reference documentation](https://docs.spring.io/spring-data/elasticsearch/docs/current/reference/html/). | Spring Boot supports both classic and reactive Elasticsearch repositories, using the `ElasticsearchRestTemplate` or `ReactiveElasticsearchTemplate` beans. Most likely those beans are auto-configured by Spring Boot given the required dependencies are present. If you wish to use your own template for backing the Elasticsearch repositories, you can add your own `ElasticsearchRestTemplate` or `ElasticsearchOperations` `@Bean`, as long as it is named `"elasticsearchTemplate"`. Same applies to `ReactiveElasticsearchTemplate` and `ReactiveElasticsearchOperations`, with the bean name `"reactiveElasticsearchTemplate"`. You can choose to disable the repositories support with the following property: Properties ``` spring.data.elasticsearch.repositories.enabled=false ``` Yaml ``` spring: data: elasticsearch: repositories: enabled: false ``` ### 2.6. Cassandra [Cassandra](https://cassandra.apache.org/) is an open source, distributed database management system designed to handle large amounts of data across many commodity servers. Spring Boot offers auto-configuration for Cassandra and the abstractions on top of it provided by [Spring Data Cassandra](https://github.com/spring-projects/spring-data-cassandra). There is a `spring-boot-starter-data-cassandra` “Starter” for collecting the dependencies in a convenient way. #### 2.6.1. 
Connecting to Cassandra You can inject an auto-configured `CassandraTemplate` or a Cassandra `CqlSession` instance as you would with any other Spring Bean. The `spring.data.cassandra.*` properties can be used to customize the connection. Generally, you provide `keyspace-name` and `contact-points` as well as the local datacenter name, as shown in the following example: Properties ``` spring.data.cassandra.keyspace-name=mykeyspace spring.data.cassandra.contact-points=cassandrahost1:9042,cassandrahost2:9042 spring.data.cassandra.local-datacenter=datacenter1 ``` Yaml ``` spring: data: cassandra: keyspace-name: "mykeyspace" contact-points: "cassandrahost1:9042,cassandrahost2:9042" local-datacenter: "datacenter1" ``` If the port is the same for all your contact points, you can use a shortcut and only specify the host names, as shown in the following example: Properties ``` spring.data.cassandra.keyspace-name=mykeyspace spring.data.cassandra.contact-points=cassandrahost1,cassandrahost2 spring.data.cassandra.local-datacenter=datacenter1 ``` Yaml ``` spring: data: cassandra: keyspace-name: "mykeyspace" contact-points: "cassandrahost1,cassandrahost2" local-datacenter: "datacenter1" ``` | | | | --- | --- | | | Those two examples are identical as the port defaults to `9042`. If you need to configure the port, use `spring.data.cassandra.port`. | | | | | --- | --- | | | The Cassandra driver has its own configuration infrastructure that loads an `application.conf` at the root of the classpath. Spring Boot does not look for such a file by default but can load one using `spring.data.cassandra.config`. If a property is both present in `spring.data.cassandra.*` and the configuration file, the value in `spring.data.cassandra.*` takes precedence. For more advanced driver customizations, you can register an arbitrary number of beans that implement `DriverConfigLoaderBuilderCustomizer`. The `CqlSession` can be customized with a bean of type `CqlSessionBuilderCustomizer`. | | | | | --- | --- | | | If you use `CqlSessionBuilder` to create multiple `CqlSession` beans, keep in mind the builder is mutable so make sure to inject a fresh copy for each session. | The following code listing shows how to inject a Cassandra bean: Java ``` import org.springframework.data.cassandra.core.CassandraTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final CassandraTemplate template; public MyBean(CassandraTemplate template) { this.template = template; } // ... public long someMethod() { return this.template.count(User.class); } } ``` Kotlin ``` import org.springframework.data.cassandra.core.CassandraTemplate import org.springframework.stereotype.Component @Component class MyBean(private val template: CassandraTemplate) { // ... fun someMethod(): Long { return template.count(User::class.java) } } ``` If you add your own `@Bean` of type `CassandraTemplate`, it replaces the default. #### 2.6.2. Spring Data Cassandra Repositories Spring Data includes basic repository support for Cassandra. Currently, this is more limited than the JPA repositories discussed earlier, and you need to annotate finder methods with `@Query`. | | | | --- | --- | | | For complete details of Spring Data Cassandra, see the [reference documentation](https://docs.spring.io/spring-data/cassandra/docs/). | ### 2.7. Couchbase [Couchbase](https://www.couchbase.com/) is an open-source, distributed, multi-model NoSQL document-oriented database that is optimized for interactive applications.
Spring Boot offers auto-configuration for Couchbase and the abstractions on top of it provided by [Spring Data Couchbase](https://github.com/spring-projects/spring-data-couchbase). There are `spring-boot-starter-data-couchbase` and `spring-boot-starter-data-couchbase-reactive` “Starters” for collecting the dependencies in a convenient way. #### 2.7.1. Connecting to Couchbase You can get a `Cluster` by adding the Couchbase SDK and some configuration. The `spring.couchbase.*` properties can be used to customize the connection. Generally, you provide the [connection string](https://github.com/couchbaselabs/sdk-rfcs/blob/master/rfc/0011-connection-string.md), username, and password, as shown in the following example: Properties ``` spring.couchbase.connection-string=couchbase://192.168.1.123 spring.couchbase.username=user spring.couchbase.password=secret ``` Yaml ``` spring: couchbase: connection-string: "couchbase://192.168.1.123" username: "user" password: "secret" ``` It is also possible to customize some of the `ClusterEnvironment` settings. For instance, the following configuration changes the timeout used to open a new `Bucket` and enables SSL support: Properties ``` spring.couchbase.env.timeouts.connect=3s spring.couchbase.env.ssl.key-store=/location/of/keystore.jks spring.couchbase.env.ssl.key-store-password=secret ``` Yaml ``` spring: couchbase: env: timeouts: connect: "3s" ssl: key-store: "/location/of/keystore.jks" key-store-password: "secret" ``` | | | | --- | --- | | | Check the `spring.couchbase.env.*` properties for more details. To take more control, one or more `ClusterEnvironmentBuilderCustomizer` beans can be used. | #### 2.7.2. Spring Data Couchbase Repositories Spring Data includes repository support for Couchbase. For complete details of Spring Data Couchbase, see the [reference documentation](https://docs.spring.io/spring-data/couchbase/docs/4.4.0/reference/html/). You can inject an auto-configured `CouchbaseTemplate` instance as you would with any other Spring Bean, provided a `CouchbaseClientFactory` bean is available. This happens when a `Cluster` is available, as described above, and a bucket name has been specified: Properties ``` spring.data.couchbase.bucket-name=my-bucket ``` Yaml ``` spring: data: couchbase: bucket-name: "my-bucket" ``` The following example shows how to inject a `CouchbaseTemplate` bean: Java ``` import org.springframework.data.couchbase.core.CouchbaseTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final CouchbaseTemplate template; public MyBean(CouchbaseTemplate template) { this.template = template; } // ... public String someMethod() { return this.template.getBucketName(); } } ``` Kotlin ``` import org.springframework.data.couchbase.core.CouchbaseTemplate import org.springframework.stereotype.Component @Component class MyBean(private val template: CouchbaseTemplate) { // ... fun someMethod(): String { return template.bucketName } } ``` There are a few beans that you can define in your own configuration to override those provided by the auto-configuration: * A `CouchbaseMappingContext` `@Bean` with a name of `couchbaseMappingContext`. * A `CustomConversions` `@Bean` with a name of `couchbaseCustomConversions`. * A `CouchbaseTemplate` `@Bean` with a name of `couchbaseTemplate`. To avoid hard-coding those names in your own config, you can reuse `BeanNames` provided by Spring Data Couchbase.
For instance, you can customize the converters to use, as follows: Java ``` import org.assertj.core.util.Arrays; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.data.couchbase.config.BeanNames; import org.springframework.data.couchbase.core.convert.CouchbaseCustomConversions; @Configuration(proxyBeanMethods = false) public class MyCouchbaseConfiguration { @Bean(BeanNames.COUCHBASE_CUSTOM_CONVERSIONS) public CouchbaseCustomConversions myCustomConversions() { return new CouchbaseCustomConversions(Arrays.asList(new MyConverter())); } } ``` Kotlin ``` import org.assertj.core.util.Arrays import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.data.couchbase.config.BeanNames import org.springframework.data.couchbase.core.convert.CouchbaseCustomConversions @Configuration(proxyBeanMethods = false) class MyCouchbaseConfiguration { @Bean(BeanNames.COUCHBASE_CUSTOM_CONVERSIONS) fun myCustomConversions(): CouchbaseCustomConversions { return CouchbaseCustomConversions(Arrays.asList(MyConverter())) } } ``` ### 2.8. LDAP [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol) (Lightweight Directory Access Protocol) is an open, vendor-neutral, industry standard application protocol for accessing and maintaining distributed directory information services over an IP network. Spring Boot offers auto-configuration for any compliant LDAP server as well as support for the embedded in-memory LDAP server from [UnboundID](https://ldap.com/unboundid-ldap-sdk-for-java/). LDAP abstractions are provided by [Spring Data LDAP](https://github.com/spring-projects/spring-data-ldap). There is a `spring-boot-starter-data-ldap` “Starter” for collecting the dependencies in a convenient way. #### 2.8.1. Connecting to an LDAP Server To connect to an LDAP server, make sure you declare a dependency on the `spring-boot-starter-data-ldap` “Starter” or `spring-ldap-core` and then declare the URLs of your server in your application.properties, as shown in the following example: Properties ``` spring.ldap.urls=ldap://myserver:1235 spring.ldap.username=admin spring.ldap.password=secret ``` Yaml ``` spring: ldap: urls: "ldap://myserver:1235" username: "admin" password: "secret" ``` If you need to customize connection settings, you can use the `spring.ldap.base` and `spring.ldap.base-environment` properties. An `LdapContextSource` is auto-configured based on these settings. If a `DirContextAuthenticationStrategy` bean is available, it is associated to the auto-configured `LdapContextSource`. If you need to customize it, for instance to use a `PooledContextSource`, you can still inject the auto-configured `LdapContextSource`. Make sure to flag your customized `ContextSource` as `@Primary` so that the auto-configured `LdapTemplate` uses it. #### 2.8.2. Spring Data LDAP Repositories Spring Data includes repository support for LDAP. For complete details of Spring Data LDAP, see the [reference documentation](https://docs.spring.io/spring-data/ldap/docs/1.0.x/reference/html/).
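To sketch what such a repository might look like (this is an illustration only; the `User` class is assumed to be an `@Entry`-mapped type with a `cn` attribute, in line with the `LdapTemplate` example that follows): Java

```java
import java.util.List;

import org.springframework.data.ldap.repository.LdapRepository;

public interface UserRepository extends LdapRepository<User> {

	// Derived query: matches entries whose cn attribute equals the given value
	List<User> findByCn(String cn);

}
```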
You can also inject an auto-configured `LdapTemplate` instance as you would with any other Spring Bean, as shown in the following example: Java ``` import java.util.List; import org.springframework.ldap.core.LdapTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final LdapTemplate template; public MyBean(LdapTemplate template) { this.template = template; } // ... public List<User> someMethod() { return this.template.findAll(User.class); } } ``` Kotlin ``` import org.springframework.ldap.core.LdapTemplate import org.springframework.stereotype.Component @Component class MyBean(private val template: LdapTemplate) { // ... fun someMethod(): List<User> { return template.findAll(User::class.java) } } ``` #### 2.8.3. Embedded In-memory LDAP Server For testing purposes, Spring Boot supports auto-configuration of an in-memory LDAP server from [UnboundID](https://ldap.com/unboundid-ldap-sdk-for-java/). To configure the server, add a dependency to `com.unboundid:unboundid-ldapsdk` and declare a `spring.ldap.embedded.base-dn` property, as follows: Properties ``` spring.ldap.embedded.base-dn=dc=spring,dc=io ``` Yaml ``` spring: ldap: embedded: base-dn: "dc=spring,dc=io" ``` | | | | --- | --- | | | It is possible to define multiple base-dn values, however, since distinguished names usually contain commas, they must be defined using the correct notation. In yaml files, you can use the yaml list notation. In properties files, you must include the index as part of the property name: Properties ``` spring.ldap.embedded.base-dn[0]=dc=spring,dc=io spring.ldap.embedded.base-dn[1]=dc=pivotal,dc=io ``` Yaml ``` spring.ldap.embedded.base-dn: - "dc=spring,dc=io" - "dc=pivotal,dc=io" ``` | By default, the server starts on a random port and triggers the regular LDAP support. There is no need to specify a `spring.ldap.urls` property. If there is a `schema.ldif` file on your classpath, it is used to initialize the server. If you want to load the initialization script from a different resource, you can also use the `spring.ldap.embedded.ldif` property. By default, a standard schema is used to validate `LDIF` files. You can turn off validation altogether by setting the `spring.ldap.embedded.validation.enabled` property. If you have custom attributes, you can use `spring.ldap.embedded.validation.schema` to define your custom attribute types or object classes. ### 2.9. InfluxDB [InfluxDB](https://www.influxdata.com/) is an open-source time series database optimized for fast, high-availability storage and retrieval of time series data in fields such as operations monitoring, application metrics, Internet-of-Things sensor data, and real-time analytics. #### 2.9.1. Connecting to InfluxDB Spring Boot auto-configures an `InfluxDB` instance, provided the `influxdb-java` client is on the classpath and the URL of the database is set, as shown in the following example: Properties ``` spring.influx.url=https://172.0.0.1:8086 ``` Yaml ``` spring: influx: url: "https://172.0.0.1:8086" ``` If the connection to InfluxDB requires a user and password, you can set the `spring.influx.user` and `spring.influx.password` properties accordingly. InfluxDB relies on OkHttp. If you need to tune the http client `InfluxDB` uses behind the scenes, you can register an `InfluxDbOkHttpClientBuilderProvider` bean. If you need more control over the configuration, consider registering an `InfluxDbCustomizer` bean. 3. 
What to Read Next --------------------- You should now have a feeling for how to use Spring Boot with various data technologies. From here, you can read about Spring Boot’s support for various [messaging technologies](messaging#messaging) and how to enable them in your application.
spring_boot Getting Help Getting Help ============ If you have trouble with Spring Boot, we would like to help. * Try the [How-to documents](howto#howto). They provide solutions to the most common questions. * Learn the Spring basics. Spring Boot builds on many other Spring projects. Check the [spring.io](https://spring.io) web-site for a wealth of reference documentation. If you are starting out with Spring, try one of the [guides](https://spring.io/guides). * Ask a question. We monitor [stackoverflow.com](https://stackoverflow.com) for questions tagged with [`spring-boot`](https://stackoverflow.com/tags/spring-boot). * Report bugs with Spring Boot at [github.com/spring-projects/spring-boot/issues](https://github.com/spring-projects/spring-boot/issues). | | | | --- | --- | | | All of Spring Boot is open source, including the documentation. If you find problems with the docs or if you want to improve them, please [get involved](https://github.com/spring-projects/spring-boot/tree/v2.7.0). | spring_boot Production-ready Features Production-ready Features ========================= Spring Boot includes a number of additional features to help you monitor and manage your application when you push it to production. You can choose to manage and monitor your application by using HTTP endpoints or with JMX. Auditing, health, and metrics gathering can also be automatically applied to your application. 1. Enabling Production-ready Features -------------------------------------- The [`spring-boot-actuator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator) module provides all of Spring Boot’s production-ready features. The recommended way to enable the features is to add a dependency on the `spring-boot-starter-actuator` “Starter”. Definition of Actuator An actuator is a manufacturing term that refers to a mechanical device for moving or controlling something. Actuators can generate a large amount of motion from a small change. To add the actuator to a Maven-based project, add the following ‘Starter’ dependency: ``` <dependencies> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-actuator</artifactId> </dependency> </dependencies> ``` For Gradle, use the following declaration: ``` dependencies { implementation 'org.springframework.boot:spring-boot-starter-actuator' } ``` 2. Endpoints ------------- Actuator endpoints let you monitor and interact with your application. Spring Boot includes a number of built-in endpoints and lets you add your own. For example, the `health` endpoint provides basic application health information. You can [enable or disable](#actuator.endpoints.enabling) each individual endpoint and [expose them (make them remotely accessible) over HTTP or JMX](#actuator.endpoints.exposing). An endpoint is considered to be available when it is both enabled and exposed. The built-in endpoints are auto-configured only when they are available. Most applications choose exposure over HTTP, where the ID of the endpoint and a prefix of `/actuator` is mapped to a URL. For example, by default, the `health` endpoint is mapped to `/actuator/health`. | | | | --- | --- | | | To learn more about the Actuator’s endpoints and their request and response formats, see the separate API documentation ([HTML](https://docs.spring.io/spring-boot/docs/2.7.0/actuator-api/htmlsingle) or [PDF](https://docs.spring.io/spring-boot/docs/2.7.0/actuator-api/pdf/spring-boot-actuator-web-api.pdf)). 
| The following technology-agnostic endpoints are available: | ID | Description | | --- | --- | | `auditevents` | Exposes audit events information for the current application. Requires an `AuditEventRepository` bean. | | `beans` | Displays a complete list of all the Spring beans in your application. | | `caches` | Exposes available caches. | | `conditions` | Shows the conditions that were evaluated on configuration and auto-configuration classes and the reasons why they did or did not match. | | `configprops` | Displays a collated list of all `@ConfigurationProperties`. | | `env` | Exposes properties from Spring’s `ConfigurableEnvironment`. | | `flyway` | Shows any Flyway database migrations that have been applied. Requires one or more `Flyway` beans. | | `health` | Shows application health information. | | `httptrace` | Displays HTTP trace information (by default, the last 100 HTTP request-response exchanges). Requires an `HttpTraceRepository` bean. | | `info` | Displays arbitrary application info. | | `integrationgraph` | Shows the Spring Integration graph. Requires a dependency on `spring-integration-core`. | | `loggers` | Shows and modifies the configuration of loggers in the application. | | `liquibase` | Shows any Liquibase database migrations that have been applied. Requires one or more `Liquibase` beans. | | `metrics` | Shows “metrics” information for the current application. | | `mappings` | Displays a collated list of all `@RequestMapping` paths. | | `quartz` | Shows information about Quartz Scheduler jobs. | | `scheduledtasks` | Displays the scheduled tasks in your application. | | `sessions` | Allows retrieval and deletion of user sessions from a Spring Session-backed session store. Requires a servlet-based web application that uses Spring Session. | | `shutdown` | Lets the application be gracefully shutdown. Disabled by default. | | `startup` | Shows the [startup steps data](features#features.spring-application.startup-tracking) collected by the `ApplicationStartup`. Requires the `SpringApplication` to be configured with a `BufferingApplicationStartup`. | | `threaddump` | Performs a thread dump. | If your application is a web application (Spring MVC, Spring WebFlux, or Jersey), you can use the following additional endpoints: | ID | Description | | --- | --- | | `heapdump` | Returns a heap dump file. On a HotSpot JVM, an `HPROF`-format file is returned. On an OpenJ9 JVM, a `PHD`-format file is returned. | | `jolokia` | Exposes JMX beans over HTTP when Jolokia is on the classpath (not available for WebFlux). Requires a dependency on `jolokia-core`. | | `logfile` | Returns the contents of the logfile (if the `logging.file.name` or the `logging.file.path` property has been set). Supports the use of the HTTP `Range` header to retrieve part of the log file’s content. | | `prometheus` | Exposes metrics in a format that can be scraped by a Prometheus server. Requires a dependency on `micrometer-registry-prometheus`. | ### 2.1. Enabling Endpoints By default, all endpoints except for `shutdown` are enabled. To configure the enablement of an endpoint, use its `management.endpoint.<id>.enabled` property. The following example enables the `shutdown` endpoint: Properties ``` management.endpoint.shutdown.enabled=true ``` Yaml ``` management: endpoint: shutdown: enabled: true ``` If you prefer endpoint enablement to be opt-in rather than opt-out, set the `management.endpoints.enabled-by-default` property to `false` and use individual endpoint `enabled` properties to opt back in. 
The following example enables the `info` endpoint and disables all other endpoints: Properties ``` management.endpoints.enabled-by-default=false management.endpoint.info.enabled=true ``` Yaml ``` management: endpoints: enabled-by-default: false endpoint: info: enabled: true ``` | | | | --- | --- | | | Disabled endpoints are removed entirely from the application context. If you want to change only the technologies over which an endpoint is exposed, use the [`include` and `exclude` properties](#actuator.endpoints.exposing) instead. | ### 2.2. Exposing Endpoints Since Endpoints may contain sensitive information, you should carefully consider when to expose them. The following table shows the default exposure for the built-in endpoints: | ID | JMX | Web | | --- | --- | --- | | `auditevents` | Yes | No | | `beans` | Yes | No | | `caches` | Yes | No | | `conditions` | Yes | No | | `configprops` | Yes | No | | `env` | Yes | No | | `flyway` | Yes | No | | `health` | Yes | Yes | | `heapdump` | N/A | No | | `httptrace` | Yes | No | | `info` | Yes | No | | `integrationgraph` | Yes | No | | `jolokia` | N/A | No | | `logfile` | N/A | No | | `loggers` | Yes | No | | `liquibase` | Yes | No | | `metrics` | Yes | No | | `mappings` | Yes | No | | `prometheus` | N/A | No | | `quartz` | Yes | No | | `scheduledtasks` | Yes | No | | `sessions` | Yes | No | | `shutdown` | Yes | No | | `startup` | Yes | No | | `threaddump` | Yes | No | To change which endpoints are exposed, use the following technology-specific `include` and `exclude` properties: | Property | Default | | --- | --- | | `management.endpoints.jmx.exposure.exclude` | | | `management.endpoints.jmx.exposure.include` | `*` | | `management.endpoints.web.exposure.exclude` | | | `management.endpoints.web.exposure.include` | `health` | The `include` property lists the IDs of the endpoints that are exposed. The `exclude` property lists the IDs of the endpoints that should not be exposed. The `exclude` property takes precedence over the `include` property. You can configure both the `include` and the `exclude` properties with a list of endpoint IDs. For example, to stop exposing all endpoints over JMX and only expose the `health` and `info` endpoints, use the following property: Properties ``` management.endpoints.jmx.exposure.include=health,info ``` Yaml ``` management: endpoints: jmx: exposure: include: "health,info" ``` `*` can be used to select all endpoints. For example, to expose everything over HTTP except the `env` and `beans` endpoints, use the following properties: Properties ``` management.endpoints.web.exposure.include=* management.endpoints.web.exposure.exclude=env,beans ``` Yaml ``` management: endpoints: web: exposure: include: "*" exclude: "env,beans" ``` | | | | --- | --- | | | `*` has a special meaning in YAML, so be sure to add quotation marks if you want to include (or exclude) all endpoints. | | | | | --- | --- | | | If your application is exposed publicly, we strongly recommend that you also [secure your endpoints](#actuator.endpoints.security). | | | | | --- | --- | | | If you want to implement your own strategy for when endpoints are exposed, you can register an `EndpointFilter` bean. | ### 2.3. Security For security purposes, only the `/health` endpoint is exposed over HTTP by default. You can use the `management.endpoints.web.exposure.include` property to configure the endpoints that are exposed. 
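For example, to also expose the `info` and `metrics` endpoints over HTTP, you could use the following configuration (a minimal illustration; expose only the endpoints that you actually need):

Properties

```
management.endpoints.web.exposure.include=health,info,metrics
```

Yaml

```
management:
  endpoints:
    web:
      exposure:
        include: "health,info,metrics"
```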
| | | | --- | --- | | | Before setting the `management.endpoints.web.exposure.include`, ensure that the exposed actuators do not contain sensitive information, are secured by placing them behind a firewall, or are secured by something like Spring Security. | If Spring Security is on the classpath and no other `WebSecurityConfigurerAdapter` or `SecurityFilterChain` bean is present, all actuators other than `/health` are secured by Spring Boot auto-configuration. If you define a custom `WebSecurityConfigurerAdapter` or `SecurityFilterChain` bean, Spring Boot auto-configuration backs off and lets you fully control the actuator access rules. If you wish to configure custom security for HTTP endpoints (for example, to allow only users with a certain role to access them), Spring Boot provides some convenient `RequestMatcher` objects that you can use in combination with Spring Security. A typical Spring Security configuration might look something like the following example: Java ``` import org.springframework.boot.actuate.autoconfigure.security.servlet.EndpointRequest; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Configuration(proxyBeanMethods = false) public class MySecurityConfiguration { @Bean public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception { http.requestMatcher(EndpointRequest.toAnyEndpoint()) .authorizeRequests((requests) -> requests.anyRequest().hasRole("ENDPOINT\_ADMIN")); http.httpBasic(); return http.build(); } } ``` Kotlin ``` import org.springframework.boot.actuate.autoconfigure.security.servlet.EndpointRequest import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.security.config.annotation.web.builders.HttpSecurity import org.springframework.security.web.SecurityFilterChain @Configuration(proxyBeanMethods = false) class MySecurityConfiguration { @Bean fun securityFilterChain(http: HttpSecurity): SecurityFilterChain { http.requestMatcher(EndpointRequest.toAnyEndpoint()).authorizeRequests { requests -> requests.anyRequest().hasRole("ENDPOINT\_ADMIN") } http.httpBasic() return http.build() } } ``` The preceding example uses `EndpointRequest.toAnyEndpoint()` to match a request to any endpoint and then ensures that all have the `ENDPOINT_ADMIN` role. Several other matcher methods are also available on `EndpointRequest`. See the API documentation ([HTML](https://docs.spring.io/spring-boot/docs/2.7.0/actuator-api/htmlsingle) or [PDF](https://docs.spring.io/spring-boot/docs/2.7.0/actuator-api/pdf/spring-boot-actuator-web-api.pdf)) for details. If you deploy applications behind a firewall, you may prefer that all your actuator endpoints can be accessed without requiring authentication. 
You can do so by changing the `management.endpoints.web.exposure.include` property, as follows: Properties ``` management.endpoints.web.exposure.include=* ``` Yaml ``` management: endpoints: web: exposure: include: "*" ``` Additionally, if Spring Security is present, you would need to add custom security configuration that allows unauthenticated access to the endpoints, as the following example shows: Java ``` import org.springframework.boot.actuate.autoconfigure.security.servlet.EndpointRequest; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Configuration(proxyBeanMethods = false) public class MySecurityConfiguration { @Bean public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception { http.requestMatcher(EndpointRequest.toAnyEndpoint()) .authorizeRequests((requests) -> requests.anyRequest().permitAll()); return http.build(); } } ``` Kotlin ``` import org.springframework.boot.actuate.autoconfigure.security.servlet.EndpointRequest import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.security.config.annotation.web.builders.HttpSecurity import org.springframework.security.web.SecurityFilterChain @Configuration(proxyBeanMethods = false) class MySecurityConfiguration { @Bean fun securityFilterChain(http: HttpSecurity): SecurityFilterChain { http.requestMatcher(EndpointRequest.toAnyEndpoint()).authorizeRequests { requests -> requests.anyRequest().permitAll() } return http.build() } } ``` | | | | --- | --- | | | In both of the preceding examples, the configuration applies only to the actuator endpoints. Since Spring Boot’s security configuration backs off completely in the presence of any `SecurityFilterChain` bean, you need to configure an additional `SecurityFilterChain` bean with rules that apply to the rest of the application. | #### 2.3.1. Cross Site Request Forgery Protection Since Spring Boot relies on Spring Security’s defaults, CSRF protection is turned on by default. This means that the actuator endpoints that require a `POST` (shutdown and loggers endpoints), a `PUT`, or a `DELETE` get a 403 (forbidden) error when the default security configuration is in use. | | | | --- | --- | | | We recommend disabling CSRF protection completely only if you are creating a service that is used by non-browser clients. | You can find additional information about CSRF protection in the [Spring Security Reference Guide](https://docs.spring.io/spring-security/reference/5.7.1/features/exploits/csrf.html). ### 2.4. Configuring Endpoints Endpoints automatically cache responses to read operations that do not take any parameters. To configure the amount of time for which an endpoint caches a response, use its `cache.time-to-live` property. The following example sets the time-to-live of the `beans` endpoint’s cache to 10 seconds: Properties ``` management.endpoint.beans.cache.time-to-live=10s ``` Yaml ``` management: endpoint: beans: cache: time-to-live: "10s" ``` | | | | --- | --- | | | The `management.endpoint.<name>` prefix uniquely identifies the endpoint that is being configured. | ### 2.5. Hypermedia for Actuator Web Endpoints A “discovery page” is added with links to all the endpoints. The “discovery page” is available on `/actuator` by default. 
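As a rough illustration only (the exact links depend on which endpoints are exposed in your application), a `GET` request to `/actuator` returns a HAL-style document along the following lines:

```
{
  "_links": {
    "self": {
      "href": "http://localhost:8080/actuator",
      "templated": false
    },
    "health": {
      "href": "http://localhost:8080/actuator/health",
      "templated": false
    },
    "health-path": {
      "href": "http://localhost:8080/actuator/health/{*path}",
      "templated": true
    }
  }
}
```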
To disable the “discovery page”, add the following property to your application properties: Properties ``` management.endpoints.web.discovery.enabled=false ``` Yaml ``` management: endpoints: web: discovery: enabled: false ``` When a custom management context path is configured, the “discovery page” automatically moves from `/actuator` to the root of the management context. For example, if the management context path is `/management`, the discovery page is available from `/management`. When the management context path is set to `/`, the discovery page is disabled to prevent the possibility of a clash with other mappings. ### 2.6. CORS Support [Cross-origin resource sharing](https://en.wikipedia.org/wiki/Cross-origin_resource_sharing) (CORS) is a [W3C specification](https://www.w3.org/TR/cors/) that lets you specify in a flexible way what kind of cross-domain requests are authorized. If you use Spring MVC or Spring WebFlux, you can configure Actuator’s web endpoints to support such scenarios. CORS support is disabled by default and is only enabled once you have set the `management.endpoints.web.cors.allowed-origins` property. The following configuration permits `GET` and `POST` calls from the `example.com` domain: Properties ``` management.endpoints.web.cors.allowed-origins=https://example.com management.endpoints.web.cors.allowed-methods=GET,POST ``` Yaml ``` management: endpoints: web: cors: allowed-origins: "https://example.com" allowed-methods: "GET,POST" ``` | | | | --- | --- | | | See [`CorsEndpointProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/endpoint/web/CorsEndpointProperties.java) for a complete list of options. | ### 2.7. Implementing Custom Endpoints If you add a `@Bean` annotated with `@Endpoint`, any methods annotated with `@ReadOperation`, `@WriteOperation`, or `@DeleteOperation` are automatically exposed over JMX and, in a web application, over HTTP as well. Endpoints can be exposed over HTTP by using Jersey, Spring MVC, or Spring WebFlux. If both Jersey and Spring MVC are available, Spring MVC is used. The following example exposes a read operation that returns a custom object: Java ``` @ReadOperation public CustomData getData() { return new CustomData("test", 5); } ``` Kotlin ``` @ReadOperation fun getData(): CustomData { return CustomData("test", 5) } ``` You can also write technology-specific endpoints by using `@JmxEndpoint` or `@WebEndpoint`. These endpoints are restricted to their respective technologies. For example, `@WebEndpoint` is exposed only over HTTP and not over JMX. You can write technology-specific extensions by using `@EndpointWebExtension` and `@EndpointJmxExtension`. These annotations let you provide technology-specific operations to augment an existing endpoint. Finally, if you need access to web-framework-specific functionality, you can implement servlet or Spring `@Controller` and `@RestController` endpoints at the cost of them not being available over JMX or when using a different web framework. #### 2.7.1. Receiving Input Operations on an endpoint receive input through their parameters. When exposed over the web, the values for these parameters are taken from the URL’s query parameters and from the JSON request body. When exposed over JMX, the parameters are mapped to the parameters of the MBean’s operations. Parameters are required by default. 
They can be made optional by annotating them with either `@javax.annotation.Nullable` or `@org.springframework.lang.Nullable`. You can map each root property in the JSON request body to a parameter of the endpoint. Consider the following JSON request body: ``` { "name": "test", "counter": 42 } ``` You can use this to invoke a write operation that takes `String name` and `int counter` parameters, as the following example shows: Java ``` @WriteOperation public void updateData(String name, int counter) { // injects "test" and 42 } ``` Kotlin ``` @WriteOperation fun updateData(name: String?, counter: Int) { // injects "test" and 42 } ``` | | | | --- | --- | | | Because endpoints are technology agnostic, only simple types can be specified in the method signature. In particular, declaring a single parameter with a `CustomData` type that defines a `name` and `counter` properties is not supported. | | | | | --- | --- | | | To let the input be mapped to the operation method’s parameters, Java code that implements an endpoint should be compiled with `-parameters`, and Kotlin code that implements an endpoint should be compiled with `-java-parameters`. This will happen automatically if you use Spring Boot’s Gradle plugin or if you use Maven and `spring-boot-starter-parent`. | ##### Input Type Conversion The parameters passed to endpoint operation methods are, if necessary, automatically converted to the required type. Before calling an operation method, the input received over JMX or HTTP is converted to the required types by using an instance of `ApplicationConversionService` as well as any `Converter` or `GenericConverter` beans qualified with `@EndpointConverter`. #### 2.7.2. Custom Web Endpoints Operations on an `@Endpoint`, `@WebEndpoint`, or `@EndpointWebExtension` are automatically exposed over HTTP using Jersey, Spring MVC, or Spring WebFlux. If both Jersey and Spring MVC are available, Spring MVC is used. ##### Web Endpoint Request Predicates A request predicate is automatically generated for each operation on a web-exposed endpoint. ##### Path The path of the predicate is determined by the ID of the endpoint and the base path of the web-exposed endpoints. The default base path is `/actuator`. For example, an endpoint with an ID of `sessions` uses `/actuator/sessions` as its path in the predicate. You can further customize the path by annotating one or more parameters of the operation method with `@Selector`. Such a parameter is added to the path predicate as a path variable. The variable’s value is passed into the operation method when the endpoint operation is invoked. If you want to capture all remaining path elements, you can add `@Selector(Match=ALL_REMAINING)` to the last parameter and make it a type that is conversion-compatible with a `String[]`. ##### HTTP method The HTTP method of the predicate is determined by the operation type, as shown in the following table: | Operation | HTTP method | | --- | --- | | `@ReadOperation` | `GET` | | `@WriteOperation` | `POST` | | `@DeleteOperation` | `DELETE` | ##### Consumes For a `@WriteOperation` (HTTP `POST`) that uses the request body, the `consumes` clause of the predicate is `application/vnd.spring-boot.actuator.v2+json, application/json`. For all other operations, the `consumes` clause is empty. ##### Produces The `produces` clause of the predicate can be determined by the `produces` attribute of the `@DeleteOperation`, `@ReadOperation`, and `@WriteOperation` annotations. The attribute is optional. 
If it is not used, the `produces` clause is determined automatically. If the operation method returns `void` or `Void`, the `produces` clause is empty. If the operation method returns a `org.springframework.core.io.Resource`, the `produces` clause is `application/octet-stream`. For all other operations, the `produces` clause is `application/vnd.spring-boot.actuator.v2+json, application/json`. ##### Web Endpoint Response Status The default response status for an endpoint operation depends on the operation type (read, write, or delete) and what, if anything, the operation returns. If a `@ReadOperation` returns a value, the response status will be 200 (OK). If it does not return a value, the response status will be 404 (Not Found). If a `@WriteOperation` or `@DeleteOperation` returns a value, the response status will be 200 (OK). If it does not return a value, the response status will be 204 (No Content). If an operation is invoked without a required parameter or with a parameter that cannot be converted to the required type, the operation method is not called, and the response status will be 400 (Bad Request). ##### Web Endpoint Range Requests You can use an HTTP range request to request part of an HTTP resource. When using Spring MVC or Spring Web Flux, operations that return a `org.springframework.core.io.Resource` automatically support range requests. | | | | --- | --- | | | Range requests are not supported when using Jersey. | ##### Web Endpoint Security An operation on a web endpoint or a web-specific endpoint extension can receive the current `java.security.Principal` or `org.springframework.boot.actuate.endpoint.SecurityContext` as a method parameter. The former is typically used in conjunction with `@Nullable` to provide different behavior for authenticated and unauthenticated users. The latter is typically used to perform authorization checks by using its `isUserInRole(String)` method. #### 2.7.3. Servlet Endpoints A servlet can be exposed as an endpoint by implementing a class annotated with `@ServletEndpoint` that also implements `Supplier<EndpointServlet>`. Servlet endpoints provide deeper integration with the servlet container but at the expense of portability. They are intended to be used to expose an existing servlet as an endpoint. For new endpoints, the `@Endpoint` and `@WebEndpoint` annotations should be preferred whenever possible. #### 2.7.4. Controller Endpoints You can use `@ControllerEndpoint` and `@RestControllerEndpoint` to implement an endpoint that is exposed only by Spring MVC or Spring WebFlux. Methods are mapped by using the standard annotations for Spring MVC and Spring WebFlux, such as `@RequestMapping` and `@GetMapping`, with the endpoint’s ID being used as a prefix for the path. Controller endpoints provide deeper integration with Spring’s web frameworks but at the expense of portability. The `@Endpoint` and `@WebEndpoint` annotations should be preferred whenever possible. ### 2.8. Health Information You can use health information to check the status of your running application. It is often used by monitoring software to alert someone when a production system goes down. The information exposed by the `health` endpoint depends on the `management.endpoint.health.show-details` and `management.endpoint.health.show-components` properties, which can be configured with one of the following values: | Name | Description | | --- | --- | | `never` | Details are never shown. | | `when-authorized` | Details are shown only to authorized users. 
Authorized roles can be configured by using `management.endpoint.health.roles`. | | `always` | Details are shown to all users. | The default value is `never`. A user is considered to be authorized when they are in one or more of the endpoint’s roles. If the endpoint has no configured roles (the default), all authenticated users are considered to be authorized. You can configure the roles by using the `management.endpoint.health.roles` property. | | | | --- | --- | | | If you have secured your application and wish to use `always`, your security configuration must permit access to the health endpoint for both authenticated and unauthenticated users. | Health information is collected from the content of a [`HealthContributorRegistry`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/HealthContributorRegistry.java) (by default, all [`HealthContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/HealthContributor.java) instances defined in your `ApplicationContext`). Spring Boot includes a number of auto-configured `HealthContributors`, and you can also write your own. A `HealthContributor` can be either a `HealthIndicator` or a `CompositeHealthContributor`. A `HealthIndicator` provides actual health information, including a `Status`. A `CompositeHealthContributor` provides a composite of other `HealthContributors`. Taken together, contributors form a tree structure to represent the overall system health. By default, the final system health is derived by a `StatusAggregator`, which sorts the statuses from each `HealthIndicator` based on an ordered list of statuses. The first status in the sorted list is used as the overall health status. If no `HealthIndicator` returns a status that is known to the `StatusAggregator`, an `UNKNOWN` status is used. | | | | --- | --- | | | You can use the `HealthContributorRegistry` to register and unregister health indicators at runtime. | #### 2.8.1. Auto-configured HealthIndicators When appropriate, Spring Boot auto-configures the `HealthIndicators` listed in the following table. You can also enable or disable selected indicators by configuring `management.health.key.enabled`, with the `key` listed in the following table: | Key | Name | Description | | --- | --- | --- | | `cassandra` | [`CassandraDriverHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/cassandra/CassandraDriverHealthIndicator.java) | Checks that a Cassandra database is up. | | `couchbase` | [`CouchbaseHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/couchbase/CouchbaseHealthIndicator.java) | Checks that a Couchbase cluster is up. | | `db` | [`DataSourceHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/jdbc/DataSourceHealthIndicator.java) | Checks that a connection to `DataSource` can be obtained. 
| | `diskspace` | [`DiskSpaceHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/system/DiskSpaceHealthIndicator.java) | Checks for low disk space. | | `elasticsearch` | [`ElasticsearchRestHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/elasticsearch/ElasticsearchRestHealthIndicator.java) | Checks that an Elasticsearch cluster is up. | | `hazelcast` | [`HazelcastHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/hazelcast/HazelcastHealthIndicator.java) | Checks that a Hazelcast server is up. | | `influxdb` | [`InfluxDbHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/influx/InfluxDbHealthIndicator.java) | Checks that an InfluxDB server is up. | | `jms` | [`JmsHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/jms/JmsHealthIndicator.java) | Checks that a JMS broker is up. | | `ldap` | [`LdapHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/ldap/LdapHealthIndicator.java) | Checks that an LDAP server is up. | | `mail` | [`MailHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/mail/MailHealthIndicator.java) | Checks that a mail server is up. | | `mongo` | [`MongoHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/mongo/MongoHealthIndicator.java) | Checks that a Mongo database is up. | | `neo4j` | [`Neo4jHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/neo4j/Neo4jHealthIndicator.java) | Checks that a Neo4j database is up. | | `ping` | [`PingHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/PingHealthIndicator.java) | Always responds with `UP`. | | `rabbit` | [`RabbitHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/amqp/RabbitHealthIndicator.java) | Checks that a Rabbit server is up. | | `redis` | [`RedisHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/redis/RedisHealthIndicator.java) | Checks that a Redis server is up. | | `solr` | [`SolrHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/solr/SolrHealthIndicator.java) | Checks that a Solr server is up. | | | | | --- | --- | | | You can disable them all by setting the `management.health.defaults.enabled` property. 
| Additional `HealthIndicators` are available but are not enabled by default: | Key | Name | Description | | --- | --- | --- | | `livenessstate` | [`LivenessStateHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/availability/LivenessStateHealthIndicator.java) | Exposes the “Liveness” application availability state. | | `readinessstate` | [`ReadinessStateHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/availability/ReadinessStateHealthIndicator.java) | Exposes the “Readiness” application availability state. | #### 2.8.2. Writing Custom HealthIndicators To provide custom health information, you can register Spring beans that implement the [`HealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/HealthIndicator.java) interface. You need to provide an implementation of the `health()` method and return a `Health` response. The `Health` response should include a status and can optionally include additional details to be displayed. The following code shows a sample `HealthIndicator` implementation: Java ``` import org.springframework.boot.actuate.health.Health; import org.springframework.boot.actuate.health.HealthIndicator; import org.springframework.stereotype.Component; @Component public class MyHealthIndicator implements HealthIndicator { @Override public Health health() { int errorCode = check(); if (errorCode != 0) { return Health.down().withDetail("Error Code", errorCode).build(); } return Health.up().build(); } private int check() { // perform some specific health check return ... } } ``` Kotlin ``` import org.springframework.boot.actuate.health.Health import org.springframework.boot.actuate.health.HealthIndicator import org.springframework.stereotype.Component @Component class MyHealthIndicator : HealthIndicator { override fun health(): Health { val errorCode = check() if (errorCode != 0) { return Health.down().withDetail("Error Code", errorCode).build() } return Health.up().build() } private fun check(): Int { // perform some specific health check return ... } } ``` | | | | --- | --- | | | The identifier for a given `HealthIndicator` is the name of the bean without the `HealthIndicator` suffix, if it exists. In the preceding example, the health information is available in an entry named `my`. | In addition to Spring Boot’s predefined [`Status`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/Status.java) types, `Health` can return a custom `Status` that represents a new system state. In such cases, you also need to provide a custom implementation of the [`StatusAggregator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/StatusAggregator.java) interface, or you must configure the default implementation by using the `management.endpoint.health.status.order` configuration property. For example, assume a new `Status` with a code of `FATAL` is being used in one of your `HealthIndicator` implementations. 
To configure the severity order, add the following property to your application properties: Properties ``` management.endpoint.health.status.order=fatal,down,out-of-service,unknown,up ``` Yaml ``` management: endpoint: health: status: order: "fatal,down,out-of-service,unknown,up" ``` The HTTP status code in the response reflects the overall health status. By default, `OUT_OF_SERVICE` and `DOWN` map to 503. Any unmapped health statuses, including `UP`, map to 200. You might also want to register custom status mappings if you access the health endpoint over HTTP. Configuring a custom mapping disables the defaults mappings for `DOWN` and `OUT_OF_SERVICE`. If you want to retain the default mappings, you must explicitly configure them, alongside any custom mappings. For example, the following property maps `FATAL` to 503 (service unavailable) and retains the default mappings for `DOWN` and `OUT_OF_SERVICE`: Properties ``` management.endpoint.health.status.http-mapping.down=503 management.endpoint.health.status.http-mapping.fatal=503 management.endpoint.health.status.http-mapping.out-of-service=503 ``` Yaml ``` management: endpoint: health: status: http-mapping: down: 503 fatal: 503 out-of-service: 503 ``` | | | | --- | --- | | | If you need more control, you can define your own `HttpCodeStatusMapper` bean. | The following table shows the default status mappings for the built-in statuses: | Status | Mapping | | --- | --- | | `DOWN` | `SERVICE_UNAVAILABLE` (`503`) | | `OUT_OF_SERVICE` | `SERVICE_UNAVAILABLE` (`503`) | | `UP` | No mapping by default, so HTTP status is `200` | | `UNKNOWN` | No mapping by default, so HTTP status is `200` | #### 2.8.3. Reactive Health Indicators For reactive applications, such as those that use Spring WebFlux, `ReactiveHealthContributor` provides a non-blocking contract for getting application health. Similar to a traditional `HealthContributor`, health information is collected from the content of a [`ReactiveHealthContributorRegistry`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/ReactiveHealthContributorRegistry.java) (by default, all [`HealthContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/HealthContributor.java) and [`ReactiveHealthContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/ReactiveHealthContributor.java) instances defined in your `ApplicationContext`). Regular `HealthContributors` that do not check against a reactive API are executed on the elastic scheduler. | | | | --- | --- | | | In a reactive application, you should use the `ReactiveHealthContributorRegistry` to register and unregister health indicators at runtime. If you need to register a regular `HealthContributor`, you should wrap it with `ReactiveHealthContributor#adapt`. | To provide custom health information from a reactive API, you can register Spring beans that implement the [`ReactiveHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/health/ReactiveHealthIndicator.java) interface. 
The following code shows a sample `ReactiveHealthIndicator` implementation: Java ``` import reactor.core.publisher.Mono; import org.springframework.boot.actuate.health.Health; import org.springframework.boot.actuate.health.ReactiveHealthIndicator; import org.springframework.stereotype.Component; @Component public class MyReactiveHealthIndicator implements ReactiveHealthIndicator { @Override public Mono<Health> health() { return doHealthCheck().onErrorResume((exception) -> Mono.just(new Health.Builder().down(exception).build())); } private Mono<Health> doHealthCheck() { // perform some specific health check return ... } } ``` Kotlin ``` import org.springframework.boot.actuate.health.Health import org.springframework.boot.actuate.health.ReactiveHealthIndicator import org.springframework.stereotype.Component import reactor.core.publisher.Mono @Component class MyReactiveHealthIndicator : ReactiveHealthIndicator { override fun health(): Mono<Health> { return doHealthCheck()!!.onErrorResume { exception: Throwable? -> Mono.just(Health.Builder().down(exception).build()) } } private fun doHealthCheck(): Mono<Health>? { // perform some specific health check return ... } } ``` | | | | --- | --- | | | To handle the error automatically, consider extending from `AbstractReactiveHealthIndicator`. | #### 2.8.4. Auto-configured ReactiveHealthIndicators When appropriate, Spring Boot auto-configures the following `ReactiveHealthIndicators`: | Key | Name | Description | | --- | --- | --- | | `cassandra` | [`CassandraDriverReactiveHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/cassandra/CassandraDriverReactiveHealthIndicator.java) | Checks that a Cassandra database is up. | | `couchbase` | [`CouchbaseReactiveHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/couchbase/CouchbaseReactiveHealthIndicator.java) | Checks that a Couchbase cluster is up. | | `elasticsearch` | [`ElasticsearchReactiveHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/elasticsearch/ElasticsearchReactiveHealthIndicator.java) | Checks that an Elasticsearch cluster is up. | | `mongo` | [`MongoReactiveHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/mongo/MongoReactiveHealthIndicator.java) | Checks that a Mongo database is up. | | `neo4j` | [`Neo4jReactiveHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/neo4j/Neo4jReactiveHealthIndicator.java) | Checks that a Neo4j database is up. | | `redis` | [`RedisReactiveHealthIndicator`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/redis/RedisReactiveHealthIndicator.java) | Checks that a Redis server is up. | | | | | --- | --- | | | If necessary, reactive indicators replace the regular ones. Also, any `HealthIndicator` that is not handled explicitly is wrapped automatically. | #### 2.8.5. Health Groups It is sometimes useful to organize health indicators into groups that you can use for different purposes. 
To create a health indicator group, you can use the `management.endpoint.health.group.<name>` property and specify a list of health indicator IDs to `include` or `exclude`. For example, to create a group that includes only database indicators you can define the following: Properties ``` management.endpoint.health.group.custom.include=db ``` Yaml ``` management: endpoint: health: group: custom: include: "db" ``` You can then check the result by hitting `[localhost:8080/actuator/health/custom](http://localhost:8080/actuator/health/custom)`. Similarly, to create a group that excludes the database indicators from the group and includes all the other indicators, you can define the following: Properties ``` management.endpoint.health.group.custom.exclude=db ``` Yaml ``` management: endpoint: health: group: custom: exclude: "db" ``` By default, groups inherit the same `StatusAggregator` and `HttpCodeStatusMapper` settings as the system health. However, you can also define these on a per-group basis. You can also override the `show-details` and `roles` properties if required: Properties ``` management.endpoint.health.group.custom.show-details=when-authorized management.endpoint.health.group.custom.roles=admin management.endpoint.health.group.custom.status.order=fatal,up management.endpoint.health.group.custom.status.http-mapping.fatal=500 management.endpoint.health.group.custom.status.http-mapping.out-of-service=500 ``` Yaml ``` management: endpoint: health: group: custom: show-details: "when-authorized" roles: "admin" status: order: "fatal,up" http-mapping: fatal: 500 out-of-service: 500 ``` | | | | --- | --- | | | You can use `@Qualifier("groupname")` if you need to register custom `StatusAggregator` or `HttpCodeStatusMapper` beans for use with the group. | A health group can also include/exclude a `CompositeHealthContributor`. You can also include/exclude only a certain component of a `CompositeHealthContributor`. This can be done using the fully qualified name of the component as follows: ``` management.endpoint.health.group.custom.include="test/primary" management.endpoint.health.group.custom.exclude="test/primary/b" ``` In the example above, the `custom` group will include the `HealthContributor` with the name `primary` which is a component of the composite `test`. Here, `primary` itself is a composite and the `HealthContributor` with the name `b` will be excluded from the `custom` group. Health groups can be made available at an additional path on either the main or management port. This is useful in cloud environments such as Kubernetes, where it is quite common to use a separate management port for the actuator endpoints for security purposes. Having a separate port could lead to unreliable health checks because the main application might not work properly even if the health check is successful. The health group can be configured with an additional path as follows: ``` management.endpoint.health.group.live.additional-path="server:/healthz" ``` This would make the `live` health group available on the main server port at `/healthz`. The prefix is mandatory and must be either `server:` (represents the main server port) or `management:` (represents the management port, if configured.) The path must be a single path segment. #### 2.8.6. DataSource Health The `DataSource` health indicator shows the health of both standard data sources and routing data source beans. The health of a routing data source includes the health of each of its target data sources. 
In the health endpoint’s response, each of a routing data source’s targets is named by using its routing key. If you prefer not to include routing data sources in the indicator’s output, set `management.health.db.ignore-routing-data-sources` to `true`. ### 2.9. Kubernetes Probes Applications deployed on Kubernetes can provide information about their internal state with [Container Probes](https://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/#container-probes). Depending on [your Kubernetes configuration](https://kubernetes.io/docs/tasks/configure-pod-container/configure-liveness-readiness-startup-probes/), the kubelet calls those probes and reacts to the result. By default, Spring Boot manages your [Application Availability State](features#features.spring-application.application-availability). If deployed in a Kubernetes environment, actuator gathers the “Liveness” and “Readiness” information from the `ApplicationAvailability` interface and uses that information in dedicated [health indicators](#actuator.endpoints.health.auto-configured-health-indicators): `LivenessStateHealthIndicator` and `ReadinessStateHealthIndicator`. These indicators are shown on the global health endpoint (`"/actuator/health"`). They are also exposed as separate HTTP Probes by using [health groups](#actuator.endpoints.health.groups): `"/actuator/health/liveness"` and `"/actuator/health/readiness"`. You can then configure your Kubernetes infrastructure with the following endpoint information: ``` livenessProbe: httpGet: path: "/actuator/health/liveness" port: <actuator-port> failureThreshold: ... periodSeconds: ... readinessProbe: httpGet: path: "/actuator/health/readiness" port: <actuator-port> failureThreshold: ... periodSeconds: ... ``` | | | | --- | --- | | | `<actuator-port>` should be set to the port that the actuator endpoints are available on. It could be the main web server port or a separate management port if the `"management.server.port"` property has been set. | These health groups are automatically enabled only if the application [runs in a Kubernetes environment](deployment#deployment.cloud.kubernetes). You can enable them in any environment by using the `management.endpoint.health.probes.enabled` configuration property. | | | | --- | --- | | | If an application takes longer to start than the configured liveness period, Kubernetes mentions the `"startupProbe"` as a possible solution. The `"startupProbe"` is not necessarily needed here, as the `"readinessProbe"` fails until all startup tasks are done. See the section that describes [how probes behave during the application lifecycle](#actuator.endpoints.kubernetes-probes.lifecycle). | If your Actuator endpoints are deployed on a separate management context, the endpoints do not use the same web infrastructure (port, connection pools, framework components) as the main application. In this case, a probe check could be successful even if the main application does not work properly (for example, it cannot accept new connections). For this reason, it is a good idea to make the `liveness` and `readiness` health groups available on the main server port. This can be done by setting the following property: ``` management.endpoint.health.probes.add-additional-paths=true ``` This would make `liveness` available at `/livez` and `readiness` at `/readyz` on the main server port. #### 2.9.1. Checking External State with Kubernetes Probes Actuator configures the “liveness” and “readiness” probes as Health Groups. 
This means that all the [health groups features](#actuator.endpoints.health.groups) are available for them. You can, for example, configure additional Health Indicators: Properties ``` management.endpoint.health.group.readiness.include=readinessState,customCheck ``` Yaml ``` management: endpoint: health: group: readiness: include: "readinessState,customCheck" ``` By default, Spring Boot does not add other health indicators to these groups. The “liveness” probe should not depend on health checks for external systems. If the [liveness state of an application](features#features.spring-application.application-availability.liveness) is broken, Kubernetes tries to solve that problem by restarting the application instance. This means that if an external system (such as a database, a Web API, or an external cache) fails, Kubernetes might restart all application instances and create cascading failures. As for the “readiness” probe, the choice of checking external systems must be made carefully by the application developers. For this reason, Spring Boot does not include any additional health checks in the readiness probe. If the [readiness state of an application instance](features#features.spring-application.application-availability.readiness) is unready, Kubernetes does not route traffic to that instance. Some external systems might not be shared by application instances, in which case they could be included in a readiness probe. Other external systems might not be essential to the application (the application could have circuit breakers and fallbacks), in which case they definitely should not be included. Unfortunately, an external system that is shared by all application instances is common, and you have to make a judgement call: Include it in the readiness probe and expect that the application is taken out of service when the external service is down or leave it out and deal with failures higher up the stack, perhaps by using a circuit breaker in the caller. | | | | --- | --- | | | If all instances of an application are unready, a Kubernetes Service with `type=ClusterIP` or `NodePort` does not accept any incoming connections. There is no HTTP error response (503 and so on), since there is no connection. A service with `type=LoadBalancer` might or might not accept connections, depending on the provider. A service that has an explicit [ingress](https://kubernetes.io/docs/concepts/services-networking/ingress/) also responds in a way that depends on the implementation — the ingress service itself has to decide how to handle the “connection refused” from downstream. HTTP 503 is quite likely in the case of both load balancer and ingress. | Also, if an application uses Kubernetes [autoscaling](https://kubernetes.io/docs/tasks/run-application/horizontal-pod-autoscale/), it may react differently to applications being taken out of the load-balancer, depending on its autoscaler configuration. #### 2.9.2. Application Lifecycle and Probe States An important aspect of the Kubernetes Probes support is its consistency with the application lifecycle. There is a significant difference between the `AvailabilityState` (which is the in-memory, internal state of the application) and the actual probe (which exposes that state). Depending on the phase of application lifecycle, the probe might not be available. 
Spring Boot publishes [application events during startup and shutdown](features#features.spring-application.application-events-and-listeners), and probes can listen to such events and expose the `AvailabilityState` information. The following tables show the `AvailabilityState` and the state of HTTP connectors at different stages. When a Spring Boot application starts: | Startup phase | LivenessState | ReadinessState | HTTP server | Notes | | --- | --- | --- | --- | --- | | Starting | `BROKEN` | `REFUSING_TRAFFIC` | Not started | Kubernetes checks the "liveness" Probe and restarts the application if it takes too long. | | Started | `CORRECT` | `REFUSING_TRAFFIC` | Refuses requests | The application context is refreshed. The application performs startup tasks and does not receive traffic yet. | | Ready | `CORRECT` | `ACCEPTING_TRAFFIC` | Accepts requests | Startup tasks are finished. The application is receiving traffic. | When a Spring Boot application shuts down: | Shutdown phase | Liveness State | Readiness State | HTTP server | Notes | | --- | --- | --- | --- | --- | | Running | `CORRECT` | `ACCEPTING_TRAFFIC` | Accepts requests | Shutdown has been requested. | | Graceful shutdown | `CORRECT` | `REFUSING_TRAFFIC` | New requests are rejected | If enabled, [graceful shutdown processes in-flight requests](web#web.graceful-shutdown). | | Shutdown complete | N/A | N/A | Server is shut down | The application context is closed and the application is shut down. | | | | | --- | --- | | | See [Kubernetes container lifecycle section](deployment#deployment.cloud.kubernetes.container-lifecycle) for more information about Kubernetes deployment. | ### 2.10. Application Information Application information exposes various information collected from all [`InfoContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/info/InfoContributor.java) beans defined in your `ApplicationContext`. Spring Boot includes a number of auto-configured `InfoContributor` beans, and you can write your own. #### 2.10.1. Auto-configured InfoContributors When appropriate, Spring auto-configures the following `InfoContributor` beans: | ID | Name | Description | Prerequisites | | --- | --- | --- | --- | | `build` | [`BuildInfoContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/info/BuildInfoContributor.java) | Exposes build information. | A `META-INF/build-info.properties` resource. | | `env` | [`EnvironmentInfoContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/info/EnvironmentInfoContributor.java) | Exposes any property from the `Environment` whose name starts with `info.`. | None. | | `git` | [`GitInfoContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/info/GitInfoContributor.java) | Exposes git information. | A `git.properties` resource. | | `java` | [`JavaInfoContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/info/JavaInfoContributor.java) | Exposes Java runtime information. | None. 
| | `os` | [`OsInfoContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/info/OsInfoContributor.java) | Exposes Operating System information. | None. | Whether an individual contributor is enabled is controlled by its `management.info.<id>.enabled` property. Different contributors have different defaults for this property, depending on their prerequisites and the nature of the information that they expose. With no prerequisites to indicate that they should be enabled, the `env`, `java`, and `os` contributors are disabled by default. Each can be enabled by setting its `management.info.<id>.enabled` property to `true`. The `build` and `git` info contributors are enabled by default. Each can be disabled by setting its `management.info.<id>.enabled` property to `false`. Alternatively, to disable every contributor that is usually enabled by default, set the `management.info.defaults.enabled` property to `false`. #### 2.10.2. Custom Application Information When the `env` contributor is enabled, you can customize the data exposed by the `info` endpoint by setting `info.*` Spring properties. All `Environment` properties under the `info` key are automatically exposed. For example, you could add the following settings to your `application.properties` file: Properties ``` info.app.encoding=UTF-8 info.app.java.source=11 info.app.java.target=11 ``` Yaml ``` info: app: encoding: "UTF-8" java: source: "11" target: "11" ``` | | | | --- | --- | | | Rather than hardcoding those values, you could also [expand info properties at build time](howto#howto.properties-and-configuration.expand-properties). Assuming you use Maven, you could rewrite the preceding example as follows: Properties ``` info.app.encoding=@project.build.sourceEncoding@ info.app.java.source=@java.version@ info.app.java.target=@java.version@ ``` Yaml ``` info: app: encoding: "@project.build.sourceEncoding@" java: source: "@java.version@" target: "@java.version@" ``` | #### 2.10.3. Git Commit Information Another useful feature of the `info` endpoint is its ability to publish information about the state of your `git` source code repository when the project was built. If a `GitProperties` bean is available, you can use the `info` endpoint to expose these properties. | | | | --- | --- | | | A `GitProperties` bean is auto-configured if a `git.properties` file is available at the root of the classpath. See "[how to generate git information](howto#howto.build.generate-git-info)" for more detail. | By default, the endpoint exposes `git.branch`, `git.commit.id`, and `git.commit.time` properties, if present. If you do not want any of these properties in the endpoint response, they need to be excluded from the `git.properties` file. If you want to display the full git information (that is, the full content of `git.properties`), use the `management.info.git.mode` property, as follows: Properties ``` management.info.git.mode=full ``` Yaml ``` management: info: git: mode: "full" ``` To disable the git commit information from the `info` endpoint completely, set the `management.info.git.enabled` property to `false`, as follows: Properties ``` management.info.git.enabled=false ``` Yaml ``` management: info: git: enabled: false ``` #### 2.10.4. Build Information If a `BuildProperties` bean is available, the `info` endpoint can also publish information about your build. This happens if a `META-INF/build-info.properties` file is available in the classpath.
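If you need the same build information programmatically, you can inject the auto-configured `BuildProperties` bean into your own components instead of calling the `info` endpoint. The following is a minimal sketch (the component and method names are illustrative):

Java

```
import org.springframework.boot.info.BuildProperties;
import org.springframework.stereotype.Component;

@Component
public class MyBuildInfoReporter {

    private final BuildProperties buildProperties;

    public MyBuildInfoReporter(BuildProperties buildProperties) {
        this.buildProperties = buildProperties;
    }

    public String describeBuild() {
        // BuildProperties exposes the entries of META-INF/build-info.properties
        return this.buildProperties.getName() + " " + this.buildProperties.getVersion();
    }
}
```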
| | | | --- | --- | | | The Maven and Gradle plugins can both generate that file. See "[how to generate build information](howto#howto.build.generate-info)" for more details. | #### 2.10.5. Java Information The `info` endpoint publishes information about your Java runtime environment, see [`JavaInfo`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/info/JavaInfo.html) for more details. #### 2.10.6. OS Information The `info` endpoint publishes information about your Operating System, see [`OsInfo`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/info/OsInfo.html) for more details. #### 2.10.7. Writing Custom InfoContributors To provide custom application information, you can register Spring beans that implement the [`InfoContributor`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator/src/main/java/org/springframework/boot/actuate/info/InfoContributor.java) interface. The following example contributes an `example` entry with a single value: Java ``` import java.util.Collections; import org.springframework.boot.actuate.info.Info; import org.springframework.boot.actuate.info.InfoContributor; import org.springframework.stereotype.Component; @Component public class MyInfoContributor implements InfoContributor { @Override public void contribute(Info.Builder builder) { builder.withDetail("example", Collections.singletonMap("key", "value")); } } ``` Kotlin ``` import org.springframework.boot.actuate.info.Info import org.springframework.boot.actuate.info.InfoContributor import org.springframework.stereotype.Component import java.util.Collections @Component class MyInfoContributor : InfoContributor { override fun contribute(builder: Info.Builder) { builder.withDetail("example", Collections.singletonMap("key", "value")) } } ``` If you reach the `info` endpoint, you should see a response that contains the following additional entry: ``` { "example": { "key" : "value" } } ``` 3. Monitoring and Management over HTTP --------------------------------------- If you are developing a web application, Spring Boot Actuator auto-configures all enabled endpoints to be exposed over HTTP. The default convention is to use the `id` of the endpoint with a prefix of `/actuator` as the URL path. For example, `health` is exposed as `/actuator/health`. | | | | --- | --- | | | Actuator is supported natively with Spring MVC, Spring WebFlux, and Jersey. If both Jersey and Spring MVC are available, Spring MVC is used. | | | | | --- | --- | | | Jackson is a required dependency in order to get the correct JSON responses as documented in the API documentation ([HTML](https://docs.spring.io/spring-boot/docs/2.7.0/actuator-api/htmlsingle) or [PDF](https://docs.spring.io/spring-boot/docs/2.7.0/actuator-api/pdf/spring-boot-actuator-web-api.pdf)). | ### 3.1. Customizing the Management Endpoint Paths Sometimes, it is useful to customize the prefix for the management endpoints. For example, your application might already use `/actuator` for another purpose. You can use the `management.endpoints.web.base-path` property to change the prefix for your management endpoint, as the following example shows: Properties ``` management.endpoints.web.base-path=/manage ``` Yaml ``` management: endpoints: web: base-path: "/manage" ``` The preceding `application.properties` example changes the endpoint from `/actuator/{id}` to `/manage/{id}` (for example, `/manage/info`). 
| | | | --- | --- | | | Unless the management port has been configured to [expose endpoints by using a different HTTP port](#actuator.monitoring.customizing-management-server-port), `management.endpoints.web.base-path` is relative to `server.servlet.context-path` (for servlet web applications) or `spring.webflux.base-path` (for reactive web applications). If `management.server.port` is configured, `management.endpoints.web.base-path` is relative to `management.server.base-path`. | If you want to map endpoints to a different path, you can use the `management.endpoints.web.path-mapping` property. The following example remaps `/actuator/health` to `/healthcheck`: Properties ``` management.endpoints.web.base-path=/ management.endpoints.web.path-mapping.health=healthcheck ``` Yaml ``` management: endpoints: web: base-path: "/" path-mapping: health: "healthcheck" ``` ### 3.2. Customizing the Management Server Port Exposing management endpoints by using the default HTTP port is a sensible choice for cloud-based deployments. If, however, your application runs inside your own data center, you may prefer to expose endpoints by using a different HTTP port. You can set the `management.server.port` property to change the HTTP port, as the following example shows: Properties ``` management.server.port=8081 ``` Yaml ``` management: server: port: 8081 ``` | | | | --- | --- | | | On Cloud Foundry, by default, applications receive requests only on port 8080 for both HTTP and TCP routing. If you want to use a custom management port on Cloud Foundry, you need to explicitly set up the application’s routes to forward traffic to the custom port. | ### 3.3. Configuring Management-specific SSL When configured to use a custom port, you can also configure the management server with its own SSL by using the various `management.server.ssl.*` properties. For example, doing so lets a management server be available over HTTP while the main application uses HTTPS, as the following property settings show: Properties ``` server.port=8443 server.ssl.enabled=true server.ssl.key-store=classpath:store.jks server.ssl.key-password=secret management.server.port=8080 management.server.ssl.enabled=false ``` Yaml ``` server: port: 8443 ssl: enabled: true key-store: "classpath:store.jks" key-password: "secret" management: server: port: 8080 ssl: enabled: false ``` Alternatively, both the main server and the management server can use SSL but with different key stores, as follows: Properties ``` server.port=8443 server.ssl.enabled=true server.ssl.key-store=classpath:main.jks server.ssl.key-password=secret management.server.port=8080 management.server.ssl.enabled=true management.server.ssl.key-store=classpath:management.jks management.server.ssl.key-password=secret ``` Yaml ``` server: port: 8443 ssl: enabled: true key-store: "classpath:main.jks" key-password: "secret" management: server: port: 8080 ssl: enabled: true key-store: "classpath:management.jks" key-password: "secret" ``` ### 3.4. Customizing the Management Server Address You can customize the address on which the management endpoints are available by setting the `management.server.address` property. Doing so can be useful if you want to listen only on an internal or ops-facing network or to listen only for connections from `localhost`. | | | | --- | --- | | | You can listen on a different address only when the port differs from the main server port. 
| The following example `application.properties` does not allow remote management connections: Properties ``` management.server.port=8081 management.server.address=127.0.0.1 ``` Yaml ``` management: server: port: 8081 address: "127.0.0.1" ``` ### 3.5. Disabling HTTP Endpoints If you do not want to expose endpoints over HTTP, you can set the management port to `-1`, as the following example shows: Properties ``` management.server.port=-1 ``` Yaml ``` management: server: port: -1 ``` You can also achieve this by using the `management.endpoints.web.exposure.exclude` property, as the following example shows: Properties ``` management.endpoints.web.exposure.exclude=* ``` Yaml ``` management: endpoints: web: exposure: exclude: "*" ``` 4. Monitoring and Management over JMX -------------------------------------- Java Management Extensions (JMX) provide a standard mechanism to monitor and manage applications. By default, this feature is not enabled. You can turn it on by setting the `spring.jmx.enabled` configuration property to `true`. Spring Boot exposes the most suitable `MBeanServer` as a bean with an ID of `mbeanServer`. Any of your beans that are annotated with Spring JMX annotations (`@ManagedResource`, `@ManagedAttribute`, or `@ManagedOperation`) are exposed to it. If your platform provides a standard `MBeanServer`, Spring Boot uses that and defaults to the VM `MBeanServer`, if necessary. If all that fails, a new `MBeanServer` is created. See the [`JmxAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jmx/JmxAutoConfiguration.java) class for more details. By default, Spring Boot also exposes management endpoints as JMX MBeans under the `org.springframework.boot` domain. To take full control over endpoint registration in the JMX domain, consider registering your own `EndpointObjectNameFactory` implementation. ### 4.1. Customizing MBean Names The name of the MBean is usually generated from the `id` of the endpoint. For example, the `health` endpoint is exposed as `org.springframework.boot:type=Endpoint,name=Health`. If your application contains more than one Spring `ApplicationContext`, you may find that names clash. To solve this problem, you can set the `spring.jmx.unique-names` property to `true` so that MBean names are always unique. You can also customize the JMX domain under which endpoints are exposed. The following settings show an example of doing so in `application.properties`: Properties ``` spring.jmx.unique-names=true management.endpoints.jmx.domain=com.example.myapp ``` Yaml ``` spring: jmx: unique-names: true management: endpoints: jmx: domain: "com.example.myapp" ``` ### 4.2. Disabling JMX Endpoints If you do not want to expose endpoints over JMX, you can set the `management.endpoints.jmx.exposure.exclude` property to `*`, as the following example shows: Properties ``` management.endpoints.jmx.exposure.exclude=* ``` Yaml ``` management: endpoints: jmx: exposure: exclude: "*" ``` ### 4.3. Using Jolokia for JMX over HTTP Jolokia is a JMX-HTTP bridge that provides an alternative method of accessing JMX beans. To use Jolokia, include a dependency to `org.jolokia:jolokia-core`. 
For example, with Maven, you would add the following dependency: ``` <dependency> <groupId>org.jolokia</groupId> <artifactId>jolokia-core</artifactId> </dependency> ``` You can then expose the Jolokia endpoint by adding `jolokia` or `*` to the `management.endpoints.web.exposure.include` property. You can then access it by using `/actuator/jolokia` on your management HTTP server. | | | | --- | --- | | | The Jolokia endpoint exposes Jolokia’s servlet as an actuator endpoint. As a result, it is specific to servlet environments, such as Spring MVC and Jersey. The endpoint is not available in a WebFlux application. | #### 4.3.1. Customizing Jolokia Jolokia has a number of settings that you would traditionally configure by setting servlet parameters. With Spring Boot, you can use your `application.properties` file. To do so, prefix the parameter with `management.endpoint.jolokia.config.`, as the following example shows: Properties ``` management.endpoint.jolokia.config.debug=true ``` Yaml ``` management: endpoint: jolokia: config: debug: true ``` #### 4.3.2. Disabling Jolokia If you use Jolokia but do not want Spring Boot to configure it, set the `management.endpoint.jolokia.enabled` property to `false`, as follows: Properties ``` management.endpoint.jolokia.enabled=false ``` Yaml ``` management: endpoint: jolokia: enabled: false ``` 5. Loggers ----------- Spring Boot Actuator includes the ability to view and configure the log levels of your application at runtime. You can view either the entire list or an individual logger’s configuration, which is made up of both the explicitly configured logging level as well as the effective logging level given to it by the logging framework. These levels can be one of: * `TRACE` * `DEBUG` * `INFO` * `WARN` * `ERROR` * `FATAL` * `OFF` * `null` `null` indicates that there is no explicit configuration. ### 5.1. Configure a Logger To configure a given logger, `POST` a partial entity to the resource’s URI, as the following example shows: ``` { "configuredLevel": "DEBUG" } ``` | | | | --- | --- | | | To “reset” the specific level of the logger (and use the default configuration instead), you can pass a value of `null` as the `configuredLevel`. | 6. Metrics ----------- Spring Boot Actuator provides dependency management and auto-configuration for [Micrometer](https://micrometer.io), an application metrics facade that supports [numerous monitoring systems](https://micrometer.io/docs), including: * [AppOptics](#actuator.metrics.export.appoptics) * [Atlas](#actuator.metrics.export.atlas) * [Datadog](#actuator.metrics.export.datadog) * [Dynatrace](#actuator.metrics.export.dynatrace) * [Elastic](#actuator.metrics.export.elastic) * [Ganglia](#actuator.metrics.export.ganglia) * [Graphite](#actuator.metrics.export.graphite) * [Humio](#actuator.metrics.export.humio) * [Influx](#actuator.metrics.export.influx) * [JMX](#actuator.metrics.export.jmx) * [KairosDB](#actuator.metrics.export.kairos) * [New Relic](#actuator.metrics.export.newrelic) * [Prometheus](#actuator.metrics.export.prometheus) * [SignalFx](#actuator.metrics.export.signalfx) * [Simple (in-memory)](#actuator.metrics.export.simple) * [Stackdriver](#actuator.metrics.export.stackdriver) * [StatsD](#actuator.metrics.export.statsd) * [Wavefront](#actuator.metrics.export.wavefront) | | | | --- | --- | | | To learn more about Micrometer’s capabilities, see its [reference documentation](https://micrometer.io/docs), in particular the [concepts section](https://micrometer.io/docs/concepts). | ### 6.1. 
Getting started Spring Boot auto-configures a composite `MeterRegistry` and adds a registry to the composite for each of the supported implementations that it finds on the classpath. Having a dependency on `micrometer-registry-{system}` in your runtime classpath is enough for Spring Boot to configure the registry. Most registries share common features. For instance, you can disable a particular registry even if the Micrometer registry implementation is on the classpath. The following example disables Datadog: Properties ``` management.metrics.export.datadog.enabled=false ``` Yaml ``` management: metrics: export: datadog: enabled: false ``` You can also disable all registries unless stated otherwise by the registry-specific property, as the following example shows: Properties ``` management.metrics.export.defaults.enabled=false ``` Yaml ``` management: metrics: export: defaults: enabled: false ``` Spring Boot also adds any auto-configured registries to the global static composite registry on the `Metrics` class, unless you explicitly tell it not to: Properties ``` management.metrics.use-global-registry=false ``` Yaml ``` management: metrics: use-global-registry: false ``` You can register any number of `MeterRegistryCustomizer` beans to further configure the registry, such as applying common tags, before any meters are registered with the registry: Java ``` import io.micrometer.core.instrument.MeterRegistry; import org.springframework.boot.actuate.autoconfigure.metrics.MeterRegistryCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyMeterRegistryConfiguration { @Bean public MeterRegistryCustomizer<MeterRegistry> metricsCommonTags() { return (registry) -> registry.config().commonTags("region", "us-east-1"); } } ``` Kotlin ``` import io.micrometer.core.instrument.MeterRegistry import org.springframework.boot.actuate.autoconfigure.metrics.MeterRegistryCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyMeterRegistryConfiguration { @Bean fun metricsCommonTags(): MeterRegistryCustomizer<MeterRegistry> { return MeterRegistryCustomizer { registry -> registry.config().commonTags("region", "us-east-1") } } } ``` You can apply customizations to particular registry implementations by being more specific about the generic type: Java ``` import io.micrometer.core.instrument.Meter; import io.micrometer.core.instrument.config.NamingConvention; import io.micrometer.graphite.GraphiteMeterRegistry; import org.springframework.boot.actuate.autoconfigure.metrics.MeterRegistryCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyMeterRegistryConfiguration { @Bean public MeterRegistryCustomizer<GraphiteMeterRegistry> graphiteMetricsNamingConvention() { return (registry) -> registry.config().namingConvention(this::name); } private String name(String name, Meter.Type type, String baseUnit) { return ... 
} } ``` Kotlin ``` import io.micrometer.core.instrument.Meter import io.micrometer.core.instrument.config.NamingConvention import io.micrometer.graphite.GraphiteMeterRegistry import org.springframework.boot.actuate.autoconfigure.metrics.MeterRegistryCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyMeterRegistryConfiguration { @Bean fun graphiteMetricsNamingConvention(): MeterRegistryCustomizer<GraphiteMeterRegistry> { return MeterRegistryCustomizer { registry: GraphiteMeterRegistry -> registry.config().namingConvention(this::name) } } private fun name(name: String, type: Meter.Type, baseUnit: String?): String { return ... } } ``` Spring Boot also [configures built-in instrumentation](#actuator.metrics.supported) that you can control through configuration or dedicated annotation markers. ### 6.2. Supported Monitoring Systems This section briefly describes each of the supported monitoring systems. #### 6.2.1. AppOptics By default, the AppOptics registry periodically pushes metrics to `[api.appoptics.com/v1/measurements](https://api.appoptics.com/v1/measurements)`. To export metrics to SaaS [AppOptics](https://micrometer.io/docs/registry/appOptics), your API token must be provided: Properties ``` management.metrics.export.appoptics.api-token=YOUR_TOKEN ``` Yaml ``` management: metrics: export: appoptics: api-token: "YOUR_TOKEN" ``` #### 6.2.2. Atlas By default, metrics are exported to [Atlas](https://micrometer.io/docs/registry/atlas) running on your local machine. You can provide the location of the [Atlas server](https://github.com/Netflix/atlas): Properties ``` management.metrics.export.atlas.uri=https://atlas.example.com:7101/api/v1/publish ``` Yaml ``` management: metrics: export: atlas: uri: "https://atlas.example.com:7101/api/v1/publish" ``` #### 6.2.3. Datadog A Datadog registry periodically pushes metrics to [datadoghq](https://www.datadoghq.com). To export metrics to [Datadog](https://micrometer.io/docs/registry/datadog), you must provide your API key: Properties ``` management.metrics.export.datadog.api-key=YOUR_KEY ``` Yaml ``` management: metrics: export: datadog: api-key: "YOUR_KEY" ``` If you additionally provide an application key (optional), then metadata such as meter descriptions, types, and base units will also be exported: Properties ``` management.metrics.export.datadog.api-key=YOUR_API_KEY management.metrics.export.datadog.application-key=YOUR_APPLICATION_KEY ``` Yaml ``` management: metrics: export: datadog: api-key: "YOUR_API_KEY" application-key: "YOUR_APPLICATION_KEY" ``` By default, metrics are sent to the Datadog US [site](https://docs.datadoghq.com/getting_started/site) (`[api.datadoghq.com](https://api.datadoghq.com)`). If your Datadog project is hosted on one of the other sites, or you need to send metrics through a proxy, configure the URI accordingly: Properties ``` management.metrics.export.datadog.uri=https://api.datadoghq.eu ``` Yaml ``` management: metrics: export: datadog: uri: "https://api.datadoghq.eu" ``` You can also change the interval at which metrics are sent to Datadog: Properties ``` management.metrics.export.datadog.step=30s ``` Yaml ``` management: metrics: export: datadog: step: "30s" ``` #### 6.2.4. Dynatrace Dynatrace offers two metrics ingest APIs, both of which are implemented for [Micrometer](https://micrometer.io/docs/registry/dynatrace). 
Configuration properties in the `v1` namespace apply only when exporting to the [Timeseries v1 API](https://www.dynatrace.com/support/help/dynatrace-api/environment-api/metric-v1/). Configuration properties in the `v2` namespace apply only when exporting to the [Metrics v2 API](https://www.dynatrace.com/support/help/dynatrace-api/environment-api/metric-v2/post-ingest-metrics/). Note that this integration can export only to either the `v1` or `v2` version of the API at a time. If the `device-id` (required for v1 but not used in v2) is set in the `v1` namespace, metrics are exported to the `v1` endpoint. Otherwise, `v2` is assumed. ##### v2 API You can use the v2 API in two ways. If a local OneAgent is running on the host, metrics are automatically exported to the [local OneAgent ingest endpoint](https://www.dynatrace.com/support/help/how-to-use-dynatrace/metrics/metric-ingestion/ingestion-methods/local-api/). The ingest endpoint forwards the metrics to the Dynatrace backend. This is the default behavior and requires no special setup beyond a dependency on `io.micrometer:micrometer-registry-dynatrace`. If no local OneAgent is running, the endpoint of the [Metrics v2 API](https://www.dynatrace.com/support/help/dynatrace-api/environment-api/metric-v2/post-ingest-metrics/) and an API token are required. The [API token](https://www.dynatrace.com/support/help/dynatrace-api/basics/dynatrace-api-authentication/) must have the “Ingest metrics” (`metrics.ingest`) permission set. We recommend limiting the scope of the token to this one permission. You must ensure that the endpoint URI contains the path (for example, `/api/v2/metrics/ingest`): The URL of the Metrics API v2 ingest endpoint is different according to your deployment option: * SaaS: `https://{your-environment-id}.live.dynatrace.com/api/v2/metrics/ingest` * Managed deployments: `https://{your-domain}/e/{your-environment-id}/api/v2/metrics/ingest` The example below configures metrics export using the `example` environment id: Properties ``` management.metrics.export.dynatrace.uri=https://example.live.dynatrace.com/api/v2/metrics/ingest management.metrics.export.dynatrace.api-token=YOUR_TOKEN ``` Yaml ``` management: metrics: export: dynatrace: uri: "https://example.live.dynatrace.com/api/v2/metrics/ingest" api-token: "YOUR_TOKEN" ``` When using the Dynatrace v2 API, the following optional features are available: * Metric key prefix: Sets a prefix that is prepended to all exported metric keys. * Enrich with Dynatrace metadata: If a OneAgent or Dynatrace operator is running, enrich metrics with additional metadata (for example, about the host, process, or pod). * Default dimensions: Specify key-value pairs that are added to all exported metrics. If tags with the same key are specified with Micrometer, they overwrite the default dimensions. It is possible to not specify a URI and API token, as shown in the following example. In this scenario, the local OneAgent endpoint is used: Properties ``` management.metrics.export.dynatrace.v2.metric-key-prefix=your.key.prefix management.metrics.export.dynatrace.v2.enrich-with-dynatrace-metadata=true management.metrics.export.dynatrace.v2.default-dimensions.key1=value1 management.metrics.export.dynatrace.v2.default-dimensions.key2=value2 ``` Yaml ``` management: metrics: export: dynatrace: # Specify uri and api-token here if not using the local OneAgent endpoint. 
v2: metric-key-prefix: "your.key.prefix" enrich-with-dynatrace-metadata: true default-dimensions: key1: "value1" key2: "value2" ``` ##### v1 API (Legacy) The Dynatrace v1 API metrics registry pushes metrics to the configured URI periodically by using the [Timeseries v1 API](https://www.dynatrace.com/support/help/dynatrace-api/environment-api/metric-v1/). For backwards-compatibility with existing setups, when `device-id` is set (required for v1, but not used in v2), metrics are exported to the Timeseries v1 endpoint. To export metrics to [Dynatrace](https://micrometer.io/docs/registry/dynatrace), your API token, device ID, and URI must be provided: Properties ``` management.metrics.export.dynatrace.uri=https://{your-environment-id}.live.dynatrace.com management.metrics.export.dynatrace.api-token=YOUR_TOKEN management.metrics.export.dynatrace.v1.device-id=YOUR_DEVICE_ID ``` Yaml ``` management: metrics: export: dynatrace: uri: "https://{your-environment-id}.live.dynatrace.com" api-token: "YOUR_TOKEN" v1: device-id: "YOUR_DEVICE_ID" ``` For the v1 API, you must specify the base environment URI without a path, as the v1 endpoint path is added automatically. ##### Version-independent Settings In addition to the API endpoint and token, you can also change the interval at which metrics are sent to Dynatrace. The default export interval is `60s`. The following example sets the export interval to 30 seconds: Properties ``` management.metrics.export.dynatrace.step=30s ``` Yaml ``` management: metrics: export: dynatrace: step: "30s" ``` You can find more information on how to set up the Dynatrace exporter for Micrometer in [the Micrometer documentation](https://micrometer.io/docs/registry/dynatrace). #### 6.2.5. Elastic By default, metrics are exported to [Elastic](https://micrometer.io/docs/registry/elastic) running on your local machine. You can provide the location of the Elastic server to use by using the following property: Properties ``` management.metrics.export.elastic.host=https://elastic.example.com:8086 ``` Yaml ``` management: metrics: export: elastic: host: "https://elastic.example.com:8086" ``` #### 6.2.6. Ganglia By default, metrics are exported to [Ganglia](https://micrometer.io/docs/registry/ganglia) running on your local machine. You can provide the [Ganglia server](http://ganglia.sourceforge.net) host and port, as the following example shows: Properties ``` management.metrics.export.ganglia.host=ganglia.example.com management.metrics.export.ganglia.port=9649 ``` Yaml ``` management: metrics: export: ganglia: host: "ganglia.example.com" port: 9649 ``` #### 6.2.7. Graphite By default, metrics are exported to [Graphite](https://micrometer.io/docs/registry/graphite) running on your local machine. You can provide the [Graphite server](https://graphiteapp.org) host and port, as the following example shows: Properties ``` management.metrics.export.graphite.host=graphite.example.com management.metrics.export.graphite.port=9004 ``` Yaml ``` management: metrics: export: graphite: host: "graphite.example.com" port: 9004 ``` Micrometer provides a default `HierarchicalNameMapper` that governs how a dimensional meter ID is [mapped to flat hierarchical names](https://micrometer.io/docs/registry/graphite#_hierarchical_name_mapping). | | | | --- | --- | | | To take control over this behavior, define your `GraphiteMeterRegistry` and supply your own `HierarchicalNameMapper`. 
Auto-configured `GraphiteConfig` and `Clock` beans are provided unless you define your own: Java ``` import io.micrometer.core.instrument.Clock; import io.micrometer.core.instrument.Meter; import io.micrometer.core.instrument.config.NamingConvention; import io.micrometer.core.instrument.util.HierarchicalNameMapper; import io.micrometer.graphite.GraphiteConfig; import io.micrometer.graphite.GraphiteMeterRegistry; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyGraphiteConfiguration { @Bean public GraphiteMeterRegistry graphiteMeterRegistry(GraphiteConfig config, Clock clock) { return new GraphiteMeterRegistry(config, clock, this::toHierarchicalName); } private String toHierarchicalName(Meter.Id id, NamingConvention convention) { return ... } } ``` Kotlin ``` import io.micrometer.core.instrument.Clock import io.micrometer.core.instrument.Meter import io.micrometer.core.instrument.config.NamingConvention import io.micrometer.core.instrument.util.HierarchicalNameMapper import io.micrometer.graphite.GraphiteConfig import io.micrometer.graphite.GraphiteMeterRegistry import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyGraphiteConfiguration { @Bean fun graphiteMeterRegistry(config: GraphiteConfig, clock: Clock): GraphiteMeterRegistry { return GraphiteMeterRegistry(config, clock, this::toHierarchicalName) } private fun toHierarchicalName(id: Meter.Id, convention: NamingConvention): String { return ... } } ``` | #### 6.2.8. Humio By default, the Humio registry periodically pushes metrics to [cloud.humio.com](https://cloud.humio.com). To export metrics to SaaS [Humio](https://micrometer.io/docs/registry/humio), you must provide your API token: Properties ``` management.metrics.export.humio.api-token=YOUR_TOKEN ``` Yaml ``` management: metrics: export: humio: api-token: "YOUR_TOKEN" ``` You should also configure one or more tags to identify the data source to which metrics are pushed: Properties ``` management.metrics.export.humio.tags.alpha=a management.metrics.export.humio.tags.bravo=b ``` Yaml ``` management: metrics: export: humio: tags: alpha: "a" bravo: "b" ``` #### 6.2.9. Influx By default, metrics are exported to an [Influx](https://micrometer.io/docs/registry/influx) v1 instance running on your local machine with the default configuration. To export metrics to InfluxDB v2, configure the `org`, `bucket`, and authentication `token` for writing metrics. You can provide the location of the [Influx server](https://www.influxdata.com) to use by using: Properties ``` management.metrics.export.influx.uri=https://influx.example.com:8086 ``` Yaml ``` management: metrics: export: influx: uri: "https://influx.example.com:8086" ``` #### 6.2.10. JMX Micrometer provides a hierarchical mapping to [JMX](https://micrometer.io/docs/registry/jmx), primarily as a cheap and portable way to view metrics locally. By default, metrics are exported to the `metrics` JMX domain. You can provide the domain to use by using: Properties ``` management.metrics.export.jmx.domain=com.example.app.metrics ``` Yaml ``` management: metrics: export: jmx: domain: "com.example.app.metrics" ``` Micrometer provides a default `HierarchicalNameMapper` that governs how a dimensional meter ID is [mapped to flat hierarchical names](https://micrometer.io/docs/registry/jmx#_hierarchical_name_mapping).
| | | | --- | --- | | | To take control over this behavior, define your `JmxMeterRegistry` and supply your own `HierarchicalNameMapper`. Auto-configured `JmxConfig` and `Clock` beans are provided unless you define your own: Java ``` import io.micrometer.core.instrument.Clock; import io.micrometer.core.instrument.Meter; import io.micrometer.core.instrument.config.NamingConvention; import io.micrometer.core.instrument.util.HierarchicalNameMapper; import io.micrometer.jmx.JmxConfig; import io.micrometer.jmx.JmxMeterRegistry; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyJmxConfiguration { @Bean public JmxMeterRegistry jmxMeterRegistry(JmxConfig config, Clock clock) { return new JmxMeterRegistry(config, clock, this::toHierarchicalName); } private String toHierarchicalName(Meter.Id id, NamingConvention convention) { return ... } } ``` Kotlin ``` import io.micrometer.core.instrument.Clock import io.micrometer.core.instrument.Meter import io.micrometer.core.instrument.config.NamingConvention import io.micrometer.core.instrument.util.HierarchicalNameMapper import io.micrometer.jmx.JmxConfig import io.micrometer.jmx.JmxMeterRegistry import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyJmxConfiguration { @Bean fun jmxMeterRegistry(config: JmxConfig, clock: Clock): JmxMeterRegistry { return JmxMeterRegistry(config, clock, this::toHierarchicalName) } private fun toHierarchicalName(id: Meter.Id, convention: NamingConvention): String { return ... } } ``` | #### 6.2.11. KairosDB By default, metrics are exported to [KairosDB](https://micrometer.io/docs/registry/kairos) running on your local machine. You can provide the location of the [KairosDB server](https://kairosdb.github.io/) to use by using: Properties ``` management.metrics.export.kairos.uri=https://kairosdb.example.com:8080/api/v1/datapoints ``` Yaml ``` management: metrics: export: kairos: uri: "https://kairosdb.example.com:8080/api/v1/datapoints" ``` #### 6.2.12. New Relic A New Relic registry periodically pushes metrics to [New Relic](https://micrometer.io/docs/registry/new-relic). To export metrics to [New Relic](https://newrelic.com), you must provide your API key and account ID: Properties ``` management.metrics.export.newrelic.api-key=YOUR_KEY management.metrics.export.newrelic.account-id=YOUR_ACCOUNT_ID ``` Yaml ``` management: metrics: export: newrelic: api-key: "YOUR_KEY" account-id: "YOUR_ACCOUNT_ID" ``` You can also change the interval at which metrics are sent to New Relic: Properties ``` management.metrics.export.newrelic.step=30s ``` Yaml ``` management: metrics: export: newrelic: step: "30s" ``` By default, metrics are published through REST calls, but you can also use the Java Agent API if you have it on the classpath: Properties ``` management.metrics.export.newrelic.client-provider-type=insights-agent ``` Yaml ``` management: metrics: export: newrelic: client-provider-type: "insights-agent" ``` Finally, you can take full control by defining your own `NewRelicClientProvider` bean. #### 6.2.13. Prometheus [Prometheus](https://micrometer.io/docs/registry/prometheus) expects to scrape or poll individual application instances for metrics. Spring Boot provides an actuator endpoint at `/actuator/prometheus` to present a [Prometheus scrape](https://prometheus.io) with the appropriate format.
| | | | --- | --- | | | By default, the endpoint is not available and must be exposed. See [exposing endpoints](#actuator.endpoints.exposing) for more details. | The following example shows a `scrape_config` to add to `prometheus.yml`: ``` scrape_configs: - job_name: "spring" metrics_path: "/actuator/prometheus" static_configs: - targets: ["HOST:PORT"] ``` [Prometheus Exemplars](https://prometheus.io/docs/prometheus/latest/feature_flags/#exemplars-storage) are also supported. To enable this feature, a `SpanContextSupplier` bean should be present. If you use [Spring Cloud Sleuth](https://spring.io/projects/spring-cloud-sleuth), this will be auto-configured for you, but you can always create your own if you want. Please check the [Prometheus Docs](https://prometheus.io/docs/prometheus/latest/feature_flags/#exemplars-storage), since this feature needs to be explicitly enabled on Prometheus' side, and it is only supported using the [OpenMetrics](https://github.com/OpenObservability/OpenMetrics/blob/v1.0.0/specification/OpenMetrics.md#exemplars) format. For ephemeral or batch jobs that may not exist long enough to be scraped, you can use [Prometheus Pushgateway](https://github.com/prometheus/pushgateway) support to expose the metrics to Prometheus. To enable Prometheus Pushgateway support, add the following dependency to your project: ``` <dependency> <groupId>io.prometheus</groupId> <artifactId>simpleclient_pushgateway</artifactId> </dependency> ``` When the Prometheus Pushgateway dependency is present on the classpath and the `management.metrics.export.prometheus.pushgateway.enabled` property is set to `true`, a `PrometheusPushGatewayManager` bean is auto-configured. This manages the pushing of metrics to a Prometheus Pushgateway. You can tune the `PrometheusPushGatewayManager` by using properties under `management.metrics.export.prometheus.pushgateway`. For advanced configuration, you can also provide your own `PrometheusPushGatewayManager` bean. #### 6.2.14. SignalFx The SignalFx registry periodically pushes metrics to [SignalFx](https://micrometer.io/docs/registry/signalFx). To export metrics to [SignalFx](https://www.signalfx.com), you must provide your access token: Properties ``` management.metrics.export.signalfx.access-token=YOUR_ACCESS_TOKEN ``` Yaml ``` management: metrics: export: signalfx: access-token: "YOUR_ACCESS_TOKEN" ``` You can also change the interval at which metrics are sent to SignalFx: Properties ``` management.metrics.export.signalfx.step=30s ``` Yaml ``` management: metrics: export: signalfx: step: "30s" ``` #### 6.2.15. Simple Micrometer ships with a simple, in-memory backend that is automatically used as a fallback if no other registry is configured. This lets you see what metrics are collected in the [metrics endpoint](#actuator.metrics.endpoint). The in-memory backend disables itself as soon as you use any other available backend. You can also disable it explicitly: Properties ``` management.metrics.export.simple.enabled=false ``` Yaml ``` management: metrics: export: simple: enabled: false ``` #### 6.2.16. Stackdriver The Stackdriver registry periodically pushes metrics to [Stackdriver](https://cloud.google.com/stackdriver/).
To export metrics to SaaS [Stackdriver](https://micrometer.io/docs/registry/stackdriver), you must provide your Google Cloud project ID: Properties ``` management.metrics.export.stackdriver.project-id=my-project ``` Yaml ``` management: metrics: export: stackdriver: project-id: "my-project" ``` You can also change the interval at which metrics are sent to Stackdriver: Properties ``` management.metrics.export.stackdriver.step=30s ``` Yaml ``` management: metrics: export: stackdriver: step: "30s" ``` #### 6.2.17. StatsD The StatsD registry eagerly pushes metrics over UDP to a StatsD agent. By default, metrics are exported to a [StatsD](https://micrometer.io/docs/registry/statsD) agent running on your local machine. You can provide the StatsD agent host, port, and protocol to use by using: Properties ``` management.metrics.export.statsd.host=statsd.example.com management.metrics.export.statsd.port=9125 management.metrics.export.statsd.protocol=udp ``` Yaml ``` management: metrics: export: statsd: host: "statsd.example.com" port: 9125 protocol: "udp" ``` You can also change the StatsD line protocol to use (it defaults to Datadog): Properties ``` management.metrics.export.statsd.flavor=etsy ``` Yaml ``` management: metrics: export: statsd: flavor: "etsy" ``` #### 6.2.18. Wavefront The Wavefront registry periodically pushes metrics to [Wavefront](https://micrometer.io/docs/registry/wavefront). If you are exporting metrics to [Wavefront](https://www.wavefront.com/) directly, you must provide your API token: Properties ``` management.metrics.export.wavefront.api-token=YOUR_API_TOKEN ``` Yaml ``` management: metrics: export: wavefront: api-token: "YOUR_API_TOKEN" ``` Alternatively, you can use a Wavefront sidecar or an internal proxy in your environment to forward metrics data to the Wavefront API host: Properties ``` management.metrics.export.wavefront.uri=proxy://localhost:2878 ``` Yaml ``` management: metrics: export: wavefront: uri: "proxy://localhost:2878" ``` | | | | --- | --- | | | If you publish metrics to a Wavefront proxy (as described in [the Wavefront documentation](https://docs.wavefront.com/proxies_installing.html)), the host must be in the `proxy://HOST:PORT` format. | You can also change the interval at which metrics are sent to Wavefront: Properties ``` management.metrics.export.wavefront.step=30s ``` Yaml ``` management: metrics: export: wavefront: step: "30s" ``` ### 6.3. Supported Metrics and Meters Spring Boot provides automatic meter registration for a wide variety of technologies. In most situations, the defaults provide sensible metrics that can be published to any of the supported monitoring systems. #### 6.3.1. JVM Metrics Auto-configuration enables JVM Metrics by using core Micrometer classes. JVM metrics are published under the `jvm.` meter name. The following JVM metrics are provided: * Various memory and buffer pool details * Statistics related to garbage collection * Thread utilization * The number of classes loaded and unloaded #### 6.3.2. System Metrics Auto-configuration enables system metrics by using core Micrometer classes. System metrics are published under the `system.`, `process.`, and `disk.` meter names. The following system metrics are provided: * CPU metrics * File descriptor metrics * Uptime metrics (both the amount of time the application has been running and a fixed gauge of the absolute start time) * Disk space available #### 6.3.3. 
Application Startup Metrics Auto-configuration exposes application startup time metrics: * `application.started.time`: time taken to start the application. * `application.ready.time`: time taken for the application to be ready to service requests. Metrics are tagged by the fully qualified name of the application class. #### 6.3.4. Logger Metrics Auto-configuration enables the event metrics for both Logback and Log4J2. The details are published under the `log4j2.events.` or `logback.events.` meter names. #### 6.3.5. Task Execution and Scheduling Metrics Auto-configuration enables the instrumentation of all available `ThreadPoolTaskExecutor` and `ThreadPoolTaskScheduler` beans, as long as the underlying `ThreadPoolExecutor` is available. Metrics are tagged by the name of the executor, which is derived from the bean name. #### 6.3.6. Spring MVC Metrics Auto-configuration enables the instrumentation of all requests handled by Spring MVC controllers and functional handlers. By default, metrics are generated with the name, `http.server.requests`. You can customize the name by setting the `management.metrics.web.server.request.metric-name` property. `@Timed` annotations are supported on `@Controller` classes and `@RequestMapping` methods (see [@Timed Annotation Support](#actuator.metrics.supported.timed-annotation) for details). If you do not want to record metrics for all Spring MVC requests, you can set `management.metrics.web.server.request.autotime.enabled` to `false` and exclusively use `@Timed` annotations instead. By default, Spring MVC related metrics are tagged with the following information: | Tag | Description | | --- | --- | | `exception` | The simple class name of any exception that was thrown while handling the request. | | `method` | The request’s method (for example, `GET` or `POST`) | | `outcome` | The request’s outcome, based on the status code of the response. 1xx is `INFORMATIONAL`, 2xx is `SUCCESS`, 3xx is `REDIRECTION`, 4xx is `CLIENT_ERROR`, and 5xx is `SERVER_ERROR` | | `status` | The response’s HTTP status code (for example, `200` or `500`) | | `uri` | The request’s URI template prior to variable substitution, if possible (for example, `/api/person/{id}`) | To add to the default tags, provide one or more `@Bean`s that implement `WebMvcTagsContributor`. To replace the default tags, provide a `@Bean` that implements `WebMvcTagsProvider`. | | | | --- | --- | | | In some cases, exceptions handled in web controllers are not recorded as request metrics tags. Applications can opt in and record exceptions by [setting handled exceptions as request attributes](web#web.servlet.spring-mvc.error-handling). | By default, all requests are handled. To customize the filter, provide a `@Bean` that implements `FilterRegistrationBean<WebMvcMetricsFilter>`. #### 6.3.7. Spring WebFlux Metrics Auto-configuration enables the instrumentation of all requests handled by Spring WebFlux controllers and functional handlers. By default, metrics are generated with the name, `http.server.requests`. You can customize the name by setting the `management.metrics.web.server.request.metric-name` property. `@Timed` annotations are supported on `@Controller` classes and `@RequestMapping` methods (see [@Timed Annotation Support](#actuator.metrics.supported.timed-annotation) for details). If you do not want to record metrics for all Spring WebFlux requests, you can set `management.metrics.web.server.request.autotime.enabled` to `false` and exclusively use `@Timed` annotations instead.
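For example, the following configuration turns off the automatic timing of server requests, so that only handlers carrying a `@Timed` annotation are recorded (the same property also governs the Spring MVC and Jersey instrumentation described in this section):

Properties

```
management.metrics.web.server.request.autotime.enabled=false
```

Yaml

```
management:
  metrics:
    web:
      server:
        request:
          autotime:
            enabled: false
```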
By default, WebFlux related metrics are tagged with the following information: | Tag | Description | | --- | --- | | `exception` | The simple class name of any exception that was thrown while handling the request. | | `method` | The request’s method (for example, `GET` or `POST`) | | `outcome` | The request’s outcome, based on the status code of the response. 1xx is `INFORMATIONAL`, 2xx is `SUCCESS`, 3xx is `REDIRECTION`, 4xx is `CLIENT_ERROR`, and 5xx is `SERVER_ERROR` | | `status` | The response’s HTTP status code (for example, `200` or `500`) | | `uri` | The request’s URI template prior to variable substitution, if possible (for example, `/api/person/{id}`) | To add to the default tags, provide one or more beans that implement `WebFluxTagsContributor`. To replace the default tags, provide a bean that implements `WebFluxTagsProvider`. | | | | --- | --- | | | In some cases, exceptions handled in controllers and handler functions are not recorded as request metrics tags. Applications can opt in and record exceptions by [setting handled exceptions as request attributes](web#web.reactive.webflux.error-handling). | #### 6.3.8. Jersey Server Metrics Auto-configuration enables the instrumentation of all requests handled by the Jersey JAX-RS implementation. By default, metrics are generated with the name, `http.server.requests`. You can customize the name by setting the `management.metrics.web.server.request.metric-name` property. `@Timed` annotations are supported on request-handling classes and methods (see [@Timed Annotation Support](#actuator.metrics.supported.timed-annotation) for details). If you do not want to record metrics for all Jersey requests, you can set `management.metrics.web.server.request.autotime.enabled` to `false` and exclusively use `@Timed` annotations instead. By default, Jersey server metrics are tagged with the following information: | Tag | Description | | --- | --- | | `exception` | The simple class name of any exception that was thrown while handling the request. | | `method` | The request’s method (for example, `GET` or `POST`) | | `outcome` | The request’s outcome, based on the status code of the response. 1xx is `INFORMATIONAL`, 2xx is `SUCCESS`, 3xx is `REDIRECTION`, 4xx is `CLIENT_ERROR`, and 5xx is `SERVER_ERROR` | | `status` | The response’s HTTP status code (for example, `200` or `500`) | | `uri` | The request’s URI template prior to variable substitution, if possible (for example, `/api/person/{id}`) | To customize the tags, provide a `@Bean` that implements `JerseyTagsProvider`. #### 6.3.9. HTTP Client Metrics Spring Boot Actuator manages the instrumentation of both `RestTemplate` and `WebClient`. For that, you have to inject the auto-configured builder and use it to create instances: * `RestTemplateBuilder` for `RestTemplate` * `WebClient.Builder` for `WebClient` You can also manually apply the customizers responsible for this instrumentation, namely `MetricsRestTemplateCustomizer` and `MetricsWebClientCustomizer`. By default, metrics are generated with the name, `http.client.requests`. You can customize the name by setting the `management.metrics.web.client.request.metric-name` property. By default, metrics generated by an instrumented client are tagged with the following information: | Tag | Description | | --- | --- | | `clientName` | The host portion of the URI | | `method` | The request’s method (for example, `GET` or `POST`) | | `outcome` | The request’s outcome, based on the status code of the response. 
1xx is `INFORMATIONAL`, 2xx is `SUCCESS`, 3xx is `REDIRECTION`, 4xx is `CLIENT_ERROR`, and 5xx is `SERVER_ERROR`. Otherwise, it is `UNKNOWN`. | | `status` | The response’s HTTP status code if available (for example, `200` or `500`) or `IO_ERROR` in case of I/O issues. Otherwise, it is `CLIENT_ERROR`. | | `uri` | The request’s URI template prior to variable substitution, if possible (for example, `/api/person/{id}`) | To customize the tags, and depending on your choice of client, you can provide a `@Bean` that implements `RestTemplateExchangeTagsProvider` or `WebClientExchangeTagsProvider`. There are convenience static functions in `RestTemplateExchangeTags` and `WebClientExchangeTags`. #### 6.3.10. Tomcat Metrics Auto-configuration enables the instrumentation of Tomcat only when an `MBeanRegistry` is enabled. By default, the `MBeanRegistry` is disabled, but you can enable it by setting `server.tomcat.mbeanregistry.enabled` to `true`. Tomcat metrics are published under the `tomcat.` meter name. #### 6.3.11. Cache Metrics Auto-configuration enables the instrumentation of all available `Cache` instances on startup, with metrics prefixed with `cache`. Cache instrumentation is standardized for a basic set of metrics. Additional, cache-specific metrics are also available. The following cache libraries are supported: * Cache2k * Caffeine * EhCache 2 * Hazelcast * Any compliant JCache (JSR-107) implementation * Redis Metrics are tagged by the name of the cache and by the name of the `CacheManager`, which is derived from the bean name. | | | | --- | --- | | | Only caches that are configured on startup are bound to the registry. For caches not defined in the cache’s configuration, such as caches created on the fly or programmatically after the startup phase, an explicit registration is required. A `CacheMetricsRegistrar` bean is made available to make that process easier. | #### 6.3.12. Spring GraphQL Metrics Auto-configuration enables the instrumentation of GraphQL queries, for any supported transport. Spring Boot records a `graphql.request` timer with: | Tag | Description | Sample values | | --- | --- | --- | | outcome | Request outcome | "SUCCESS", "ERROR" | A single GraphQL query can involve many `DataFetcher` calls, so there is a dedicated `graphql.datafetcher` timer: | Tag | Description | Sample values | | --- | --- | --- | | path | data fetcher path | "Query.project" | | outcome | data fetching outcome | "SUCCESS", "ERROR" | The `graphql.request.datafetch.count` [distribution summary](https://micrometer.io/docs/concepts#_distribution_summaries) counts the number of non-trivial `DataFetcher` calls made per request. This metric is useful for detecting "N+1" data fetching issues and considering batch loading; it provides the `"TOTAL"` number of data fetcher calls made over the `"COUNT"` of recorded requests, as well as the `"MAX"` calls made for a single request over the considered period. More options are available for [configuring distributions with application properties](application-properties#application-properties.actuator.management.metrics.distribution.maximum-expected-value). A single response can contain many GraphQL errors, counted by the `graphql.error` counter: | Tag | Description | Sample values | | --- | --- | --- | | errorType | error type | "DataFetchingException" | | errorPath | error JSON Path | "$.project" | #### 6.3.13. DataSource Metrics Auto-configuration enables the instrumentation of all available `DataSource` objects with metrics prefixed with `jdbc.connections`. 
Data source instrumentation results in gauges that represent the currently active, idle, maximum allowed, and minimum allowed connections in the pool. Metrics are also tagged by the name of the `DataSource` computed based on the bean name. | | | | --- | --- | | | By default, Spring Boot provides metadata for all supported data sources. You can add additional `DataSourcePoolMetadataProvider` beans if your favorite data source is not supported. See `DataSourcePoolMetadataProvidersConfiguration` for examples. | Also, Hikari-specific metrics are exposed with a `hikaricp` prefix. Each metric is tagged by the name of the pool (you can control it with `spring.datasource.name`). #### 6.3.14. Hibernate Metrics If `org.hibernate:hibernate-micrometer` is on the classpath, all available Hibernate `EntityManagerFactory` instances that have statistics enabled are instrumented with a metric named `hibernate`. Metrics are also tagged by the name of the `EntityManagerFactory`, which is derived from the bean name. To enable statistics, the standard JPA property `hibernate.generate_statistics` must be set to `true`. You can enable that on the auto-configured `EntityManagerFactory`: Properties ``` spring.jpa.properties[hibernate.generate_statistics]=true ``` Yaml ``` spring: jpa: properties: "[hibernate.generate_statistics]": true ``` #### 6.3.15. Spring Data Repository Metrics Auto-configuration enables the instrumentation of all Spring Data `Repository` method invocations. By default, metrics are generated with the name, `spring.data.repository.invocations`. You can customize the name by setting the `management.metrics.data.repository.metric-name` property. `@Timed` annotations are supported on `Repository` classes and methods (see [@Timed Annotation Support](#actuator.metrics.supported.timed-annotation) for details). If you do not want to record metrics for all `Repository` invocations, you can set `management.metrics.data.repository.autotime.enabled` to `false` and exclusively use `@Timed` annotations instead. By default, repository invocation related metrics are tagged with the following information: | Tag | Description | | --- | --- | | `repository` | The simple class name of the source `Repository`. | | `method` | The name of the `Repository` method that was invoked. | | `state` | The result state (`SUCCESS`, `ERROR`, `CANCELED`, or `RUNNING`). | | `exception` | The simple class name of any exception that was thrown from the invocation. | To replace the default tags, provide a `@Bean` that implements `RepositoryTagsProvider`. #### 6.3.16. RabbitMQ Metrics Auto-configuration enables the instrumentation of all available RabbitMQ connection factories with a metric named `rabbitmq`. #### 6.3.17. Spring Integration Metrics Spring Integration automatically provides [Micrometer support](https://docs.spring.io/spring-integration/docs/5.5.12/reference/html/system-management.html#micrometer-integration) whenever a `MeterRegistry` bean is available. Metrics are published under the `spring.integration.` meter name. #### 6.3.18. Kafka Metrics Auto-configuration registers a `MicrometerConsumerListener` and `MicrometerProducerListener` for the auto-configured consumer factory and producer factory, respectively. It also registers a `KafkaStreamsMicrometerListener` for `StreamsBuilderFactoryBean`. For more detail, see the [Micrometer Native Metrics](https://docs.spring.io/spring-kafka/docs/2.8.6/reference/html/#micrometer-native) section of the Spring Kafka documentation. #### 6.3.19. 
MongoDB Metrics This section briefly describes the available metrics for MongoDB. ##### MongoDB Command Metrics Auto-configuration registers a `MongoMetricsCommandListener` with the auto-configured `MongoClient`. A timer metric named `mongodb.driver.commands` is created for each command issued to the underlying MongoDB driver. Each metric is tagged with the following information by default: | Tag | Description | | --- | --- | | `command` | The name of the command issued. | | `cluster.id` | The identifier of the cluster to which the command was sent. | | `server.address` | The address of the server to which the command was sent. | | `status` | The outcome of the command (`SUCCESS` or `FAILED`). | To replace the default metric tags, define a `MongoCommandTagsProvider` bean, as the following example shows: Java ``` import io.micrometer.core.instrument.binder.mongodb.MongoCommandTagsProvider; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyCommandTagsProviderConfiguration { @Bean public MongoCommandTagsProvider customCommandTagsProvider() { return new CustomCommandTagsProvider(); } } ``` Kotlin ``` import io.micrometer.core.instrument.binder.mongodb.MongoCommandTagsProvider import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyCommandTagsProviderConfiguration { @Bean fun customCommandTagsProvider(): MongoCommandTagsProvider? { return CustomCommandTagsProvider() } } ``` To disable the auto-configured command metrics, set the following property: Properties ``` management.metrics.mongo.command.enabled=false ``` Yaml ``` management: metrics: mongo: command: enabled: false ``` ##### MongoDB Connection Pool Metrics Auto-configuration registers a `MongoMetricsConnectionPoolListener` with the auto-configured `MongoClient`. The following gauge metrics are created for the connection pool: * `mongodb.driver.pool.size` reports the current size of the connection pool, including idle and in-use members. * `mongodb.driver.pool.checkedout` reports the count of connections that are currently in use. * `mongodb.driver.pool.waitqueuesize` reports the current size of the wait queue for a connection from the pool. Each metric is tagged with the following information by default: | Tag | Description | | --- | --- | | `cluster.id` | The identifier of the cluster to which the connection pool corresponds. | | `server.address` | The address of the server to which the connection pool corresponds. 
| To replace the default metric tags, define a `MongoConnectionPoolTagsProvider` bean: Java ``` import io.micrometer.core.instrument.binder.mongodb.MongoConnectionPoolTagsProvider; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyConnectionPoolTagsProviderConfiguration { @Bean public MongoConnectionPoolTagsProvider customConnectionPoolTagsProvider() { return new CustomConnectionPoolTagsProvider(); } } ``` Kotlin ``` import io.micrometer.core.instrument.binder.mongodb.MongoConnectionPoolTagsProvider import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyConnectionPoolTagsProviderConfiguration { @Bean fun customConnectionPoolTagsProvider(): MongoConnectionPoolTagsProvider { return CustomConnectionPoolTagsProvider() } } ``` To disable the auto-configured connection pool metrics, set the following property: Properties ``` management.metrics.mongo.connectionpool.enabled=false ``` Yaml ``` management: metrics: mongo: connectionpool: enabled: false ``` #### 6.3.20. Jetty Metrics Auto-configuration binds metrics for Jetty’s `ThreadPool` by using Micrometer’s `JettyServerThreadPoolMetrics`. Metrics for Jetty’s `Connector` instances are bound by using Micrometer’s `JettyConnectionMetrics` and, when `server.ssl.enabled` is set to `true`, Micrometer’s `JettySslHandshakeMetrics`. #### 6.3.21. @Timed Annotation Support You can use the `@Timed` annotation from the `io.micrometer.core.annotation` package with several of the supported technologies described earlier. If supported, you can use the annotation at either the class level or the method level. For example, the following code shows how you can use the annotation to instrument all request mappings in a `@RestController`: Java ``` import java.util.List; import io.micrometer.core.annotation.Timed; import org.springframework.web.bind.annotation.GetMapping; import org.springframework.web.bind.annotation.RestController; @RestController @Timed public class MyController { @GetMapping("/api/addresses") public List<Address> listAddress() { return ... } @GetMapping("/api/people") public List<Person> listPeople() { return ... } } ``` Kotlin ``` import io.micrometer.core.annotation.Timed import org.springframework.web.bind.annotation.GetMapping import org.springframework.web.bind.annotation.RestController @RestController @Timed class MyController { @GetMapping("/api/addresses") fun listAddress(): List<Address>? { return ... } @GetMapping("/api/people") fun listPeople(): List<Person>? { return ... } } ``` If you want only to instrument a single mapping, you can use the annotation on the method instead of the class: Java ``` import java.util.List; import io.micrometer.core.annotation.Timed; import org.springframework.web.bind.annotation.GetMapping; import org.springframework.web.bind.annotation.RestController; @RestController public class MyController { @GetMapping("/api/addresses") public List<Address> listAddress() { return ... } @GetMapping("/api/people") @Timed public List<Person> listPeople() { return ... } } ``` Kotlin ``` import io.micrometer.core.annotation.Timed import org.springframework.web.bind.annotation.GetMapping import org.springframework.web.bind.annotation.RestController @RestController class MyController { @GetMapping("/api/addresses") fun listAddress(): List<Address>? { return ... 
} @GetMapping("/api/people") @Timed fun listPeople(): List<Person>? { return ... } } ``` You can also combine class-level and method-level annotations if you want to change the timing details for a specific method: Java ``` import java.util.List; import io.micrometer.core.annotation.Timed; import org.springframework.web.bind.annotation.GetMapping; import org.springframework.web.bind.annotation.RestController; @RestController @Timed public class MyController { @GetMapping("/api/addresses") public List<Address> listAddress() { return ... } @GetMapping("/api/people") @Timed(extraTags = { "region", "us-east-1" }) @Timed(value = "all.people", longTask = true) public List<Person> listPeople() { return ... } } ``` Kotlin ``` import io.micrometer.core.annotation.Timed import org.springframework.web.bind.annotation.GetMapping import org.springframework.web.bind.annotation.RestController @RestController @Timed class MyController { @GetMapping("/api/addresses") fun listAddress(): List<Address>? { return ... } @GetMapping("/api/people") @Timed(value = "all.people", longTask = true, extraTags = ["region", "us-east-1"]) fun listPeople(): List<Person>? { return ... } } ``` | | | | --- | --- | | | A `@Timed` annotation with `longTask = true` enables a long task timer for the method. Long task timers require a separate metric name and can be stacked with a short task timer. | #### 6.3.22. Redis Metrics Auto-configuration registers a `MicrometerCommandLatencyRecorder` for the auto-configured `LettuceConnectionFactory`. For more detail, see the [Micrometer Metrics section](https://lettuce.io/core/6.1.8.RELEASE/reference/index.html#command.latency.metrics.micrometer) of the Lettuce documentation. ### 6.4. Registering Custom Metrics To register custom metrics, inject `MeterRegistry` into your component: Java ``` import io.micrometer.core.instrument.MeterRegistry; import io.micrometer.core.instrument.Tags; import org.springframework.stereotype.Component; @Component public class MyBean { private final Dictionary dictionary; public MyBean(MeterRegistry registry) { this.dictionary = Dictionary.load(); registry.gauge("dictionary.size", Tags.empty(), this.dictionary.getWords().size()); } } ``` Kotlin ``` import io.micrometer.core.instrument.MeterRegistry import io.micrometer.core.instrument.Tags import org.springframework.stereotype.Component @Component class MyBean(registry: MeterRegistry) { private val dictionary: Dictionary init { dictionary = Dictionary.load() registry.gauge("dictionary.size", Tags.empty(), dictionary.words.size) } } ``` If your metrics depend on other beans, we recommend that you use a `MeterBinder` to register them: Java ``` import io.micrometer.core.instrument.Gauge; import io.micrometer.core.instrument.binder.MeterBinder; import org.springframework.context.annotation.Bean; public class MyMeterBinderConfiguration { @Bean public MeterBinder queueSize(Queue queue) { return (registry) -> Gauge.builder("queueSize", queue::size).register(registry); } } ``` Kotlin ``` import io.micrometer.core.instrument.Gauge import io.micrometer.core.instrument.binder.MeterBinder import org.springframework.context.annotation.Bean class MyMeterBinderConfiguration { @Bean fun queueSize(queue: Queue): MeterBinder { return MeterBinder { registry -> Gauge.builder("queueSize", queue::size).register(registry) } } } ``` Using a `MeterBinder` ensures that the correct dependency relationships are set up and that the bean is available when the metric’s value is retrieved. 
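For example, one binder can group several related meters for the same bean. The following is a minimal sketch, assuming a hypothetical `Queue` bean (as in the preceding example) with `size()` and `capacity()` methods; the type, the methods, and the meter names are illustrative rather than part of Spring Boot:

Java

```
import io.micrometer.core.instrument.Gauge;
import io.micrometer.core.instrument.binder.MeterBinder;

import org.springframework.context.annotation.Bean;

public class MyQueueMetricsConfiguration {

	@Bean
	public MeterBinder queueMetrics(Queue queue) {
		// The gauges are registered when this binder is bound to the MeterRegistry,
		// after the Queue bean has been created.
		return (registry) -> {
			Gauge.builder("queue.size", queue::size).description("Entries currently in the queue").register(registry);
			Gauge.builder("queue.capacity", queue::capacity).description("Maximum number of entries").register(registry);
		};
	}

}
```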
A `MeterBinder` implementation can also be useful if you find that you repeatedly instrument a suite of metrics across components or applications. | | | | --- | --- | | | By default, metrics from all `MeterBinder` beans are automatically bound to the Spring-managed `MeterRegistry`. | ### 6.5. Customizing Individual Metrics If you need to apply customizations to specific `Meter` instances, you can use the `io.micrometer.core.instrument.config.MeterFilter` interface. For example, if you want to rename the `mytag.region` tag to `mytag.area` for all meter IDs beginning with `com.example`, you can do the following: Java ``` import io.micrometer.core.instrument.config.MeterFilter; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyMetricsFilterConfiguration { @Bean public MeterFilter renameRegionTagMeterFilter() { return MeterFilter.renameTag("com.example", "mytag.region", "mytag.area"); } } ``` Kotlin ``` import io.micrometer.core.instrument.config.MeterFilter import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyMetricsFilterConfiguration { @Bean fun renameRegionTagMeterFilter(): MeterFilter { return MeterFilter.renameTag("com.example", "mytag.region", "mytag.area") } } ``` | | | | --- | --- | | | By default, all `MeterFilter` beans are automatically bound to the Spring-managed `MeterRegistry`. Make sure to register your metrics by using the Spring-managed `MeterRegistry` and not any of the static methods on `Metrics`. These use the global registry that is not Spring-managed. | #### 6.5.1. Common Tags Common tags are generally used for dimensional drill-down on the operating environment, such as host, instance, region, stack, and others. Common tags are applied to all meters and can be configured, as the following example shows: Properties ``` management.metrics.tags.region=us-east-1 management.metrics.tags.stack=prod ``` Yaml ``` management: metrics: tags: region: "us-east-1" stack: "prod" ``` The preceding example adds `region` and `stack` tags to all meters with a value of `us-east-1` and `prod`, respectively. | | | | --- | --- | | | The order of common tags is important if you use Graphite. As the order of common tags cannot be guaranteed by using this approach, Graphite users are advised to define a custom `MeterFilter` instead. | #### 6.5.2. Per-meter Properties In addition to `MeterFilter` beans, you can apply a limited set of customizations on a per-meter basis by using properties. Per-meter customizations apply to any meter IDs that start with the given name. The following example disables any meters that have an ID starting with `example.remote`: Properties ``` management.metrics.enable.example.remote=false ``` Yaml ``` management: metrics: enable: example: remote: false ``` The following properties allow per-meter customization: Table 1. Per-meter customizations | Property | Description | | --- | --- | | `management.metrics.enable` | Whether to prevent meters from emitting any metrics. | | `management.metrics.distribution.percentiles-histogram` | Whether to publish a histogram suitable for computing aggregable (across dimension) percentile approximations. | | `management.metrics.distribution.minimum-expected-value`, `management.metrics.distribution.maximum-expected-value` | Publish fewer histogram buckets by clamping the range of expected values. 
| | `management.metrics.distribution.percentiles` | Publish percentile values computed in your application. | | `management.metrics.distribution.expiry`, `management.metrics.distribution.buffer-length` | Give greater weight to recent samples by accumulating them in ring buffers which rotate after a configurable expiry, with a configurable buffer length. | | `management.metrics.distribution.slo` | Publish a cumulative histogram with buckets defined by your service-level objectives. | For more details on the concepts behind `percentiles-histogram`, `percentiles`, and `slo`, see the [“Histograms and percentiles” section](https://micrometer.io/docs/concepts#_histograms_and_percentiles) of the Micrometer documentation. ### 6.6. Metrics Endpoint Spring Boot provides a `metrics` endpoint that you can use diagnostically to examine the metrics collected by an application. The endpoint is not available by default and must be exposed. See [exposing endpoints](#actuator.endpoints.exposing) for more details. Navigating to `/actuator/metrics` displays a list of available meter names. You can drill down to view information about a particular meter by providing its name as a selector — for example, `/actuator/metrics/jvm.memory.max`. | | | | --- | --- | | | The name you use here should match the name used in the code, not the name after it has been naming-convention normalized for a monitoring system to which it is shipped. In other words, if `jvm.memory.max` appears as `jvm_memory_max` in Prometheus because of its snake case naming convention, you should still use `jvm.memory.max` as the selector when inspecting the meter in the `metrics` endpoint. | You can also add any number of `tag=KEY:VALUE` query parameters to the end of the URL to dimensionally drill down on a meter — for example, `/actuator/metrics/jvm.memory.max?tag=area:nonheap`. | | | | --- | --- | | | The reported measurements are the *sum* of the statistics of all meters that match the meter name and any tags that have been applied. In the preceding example, the returned `Value` statistic is the sum of the maximum memory footprints of the “Code Cache”, “Compressed Class Space”, and “Metaspace” areas of non-heap memory. If you wanted to see only the maximum size for the “Metaspace”, you could add an additional `tag=id:Metaspace` — that is, `/actuator/metrics/jvm.memory.max?tag=area:nonheap&tag=id:Metaspace`. | 7. Auditing ------------ Once Spring Security is in play, Spring Boot Actuator has a flexible audit framework that publishes events (by default, “authentication success”, “failure” and “access denied” exceptions). This feature can be very useful for reporting and for implementing a lock-out policy based on authentication failures. You can enable auditing by providing a bean of type `AuditEventRepository` in your application’s configuration. For convenience, Spring Boot offers an `InMemoryAuditEventRepository`. `InMemoryAuditEventRepository` has limited capabilities, and we recommend using it only for development environments. For production environments, consider creating your own alternative `AuditEventRepository` implementation. ### 7.1. Custom Auditing To customize published security events, you can provide your own implementations of `AbstractAuthenticationAuditListener` and `AbstractAuthorizationAuditListener`. You can also use the audit services for your own business events. To do so, either inject the `AuditEventRepository` bean into your own components and use that directly or publish an `AuditApplicationEvent` with the Spring `ApplicationEventPublisher` (by implementing `ApplicationEventPublisherAware`). 
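For example, the following is a minimal sketch that publishes a business event by implementing `ApplicationEventPublisherAware` (the class name, the event type, and the data entry are illustrative, not a convention defined by Spring Boot):

Java

```
import org.springframework.boot.actuate.audit.listener.AuditApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.context.ApplicationEventPublisherAware;
import org.springframework.stereotype.Component;

@Component
public class MyAuditEventPublisher implements ApplicationEventPublisherAware {

	private ApplicationEventPublisher publisher;

	@Override
	public void setApplicationEventPublisher(ApplicationEventPublisher publisher) {
		this.publisher = publisher;
	}

	public void orderCancelled(String username, String orderId) {
		// The auto-configured listener stores the wrapped AuditEvent in the AuditEventRepository.
		this.publisher.publishEvent(new AuditApplicationEvent(username, "ORDER_CANCELLED", "orderId=" + orderId));
	}

}
```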
8. HTTP Tracing ---------------- You can enable HTTP Tracing by providing a bean of type `HttpTraceRepository` in your application’s configuration. For convenience, Spring Boot offers `InMemoryHttpTraceRepository`, which stores traces for the last 100 (the default) request-response exchanges. `InMemoryHttpTraceRepository` is limited compared to other tracing solutions, and we recommend using it only for development environments. For production environments, we recommend using a production-ready tracing or observability solution, such as Zipkin or Spring Cloud Sleuth. Alternatively, you can create your own `HttpTraceRepository`. You can use the `httptrace` endpoint to obtain information about the request-response exchanges that are stored in the `HttpTraceRepository`. ### 8.1. Custom HTTP tracing To customize the items that are included in each trace, use the `management.trace.http.include` configuration property. For advanced customization, consider registering your own `HttpExchangeTracer` implementation. 9. Process Monitoring ---------------------- In the `spring-boot` module, you can find two classes to create files that are often useful for process monitoring: * `ApplicationPidFileWriter` creates a file that contains the application PID (by default, in the application directory with a file name of `application.pid`). * `WebServerPortFileWriter` creates a file (or files) that contain the ports of the running web server (by default, in the application directory with a file name of `application.port`). By default, these writers are not activated, but you can enable them: * [By Extending Configuration](#actuator.process-monitoring.configuration) * [Programmatically Enabling Process Monitoring](#actuator.process-monitoring.programmatically) ### 9.1. Extending Configuration In the `META-INF/spring.factories` file, you can activate the listener (or listeners) that writes a PID file: ``` org.springframework.context.ApplicationListener=\ org.springframework.boot.context.ApplicationPidFileWriter,\ org.springframework.boot.web.context.WebServerPortFileWriter ``` ### 9.2. Programmatically Enabling Process Monitoring You can also activate a listener by invoking the `SpringApplication.addListeners(…)` method and passing the appropriate `Writer` object. This method also lets you customize the file name and path in the `Writer` constructor. 10. Cloud Foundry Support -------------------------- Spring Boot’s actuator module includes additional support that is activated when you deploy to a compatible Cloud Foundry instance. The `/cloudfoundryapplication` path provides an alternative secured route to all `@Endpoint` beans. The extended support lets Cloud Foundry management UIs (such as the web application that you can use to view deployed applications) be augmented with Spring Boot actuator information. For example, an application status page can include full health information instead of the typical “running” or “stopped” status. | | | | --- | --- | | | The `/cloudfoundryapplication` path is not directly accessible to regular users. To use the endpoint, you must pass a valid UAA token with the request. | ### 10.1. 
Disabling Extended Cloud Foundry Actuator Support If you want to fully disable the `/cloudfoundryapplication` endpoints, you can add the following setting to your `application.properties` file: Properties ``` management.cloudfoundry.enabled=false ``` Yaml ``` management: cloudfoundry: enabled: false ``` ### 10.2. Cloud Foundry Self-signed Certificates By default, the security verification for `/cloudfoundryapplication` endpoints makes SSL calls to various Cloud Foundry services. If your Cloud Foundry UAA or Cloud Controller services use self-signed certificates, you need to set the following property: Properties ``` management.cloudfoundry.skip-ssl-validation=true ``` Yaml ``` management: cloudfoundry: skip-ssl-validation: true ``` ### 10.3. Custom Context Path If the server’s context-path has been configured to anything other than `/`, the Cloud Foundry endpoints are not available at the root of the application. For example, if `server.servlet.context-path=/app`, Cloud Foundry endpoints are available at `/app/cloudfoundryapplication/*`. If you expect the Cloud Foundry endpoints to always be available at `/cloudfoundryapplication/*`, regardless of the server’s context-path, you need to explicitly configure that in your application. The configuration differs, depending on the web server in use. For Tomcat, you can add the following configuration: Java ``` import java.io.IOException; import java.util.Collections; import javax.servlet.GenericServlet; import javax.servlet.Servlet; import javax.servlet.ServletContainerInitializer; import javax.servlet.ServletContext; import javax.servlet.ServletException; import javax.servlet.ServletRequest; import javax.servlet.ServletResponse; import org.apache.catalina.Host; import org.apache.catalina.core.StandardContext; import org.apache.catalina.startup.Tomcat; import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory; import org.springframework.boot.web.servlet.ServletContextInitializer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyCloudFoundryConfiguration { @Bean public TomcatServletWebServerFactory servletWebServerFactory() { return new TomcatServletWebServerFactory() { @Override protected void prepareContext(Host host, ServletContextInitializer[] initializers) { super.prepareContext(host, initializers); StandardContext child = new StandardContext(); child.addLifecycleListener(new Tomcat.FixContextListener()); child.setPath("/cloudfoundryapplication"); ServletContainerInitializer initializer = getServletContextInitializer(getContextPath()); child.addServletContainerInitializer(initializer, Collections.emptySet()); child.setCrossContext(true); host.addChild(child); } }; } private ServletContainerInitializer getServletContextInitializer(String contextPath) { return (classes, context) -> { Servlet servlet = new GenericServlet() { @Override public void service(ServletRequest req, ServletResponse res) throws ServletException, IOException { ServletContext context = req.getServletContext().getContext(contextPath); context.getRequestDispatcher("/cloudfoundryapplication").forward(req, res); } }; context.addServlet("cloudfoundry", servlet).addMapping("/*"); }; } } ``` Kotlin ``` import org.apache.catalina.Host import org.apache.catalina.core.StandardContext import org.apache.catalina.startup.Tomcat.FixContextListener import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory import 
org.springframework.boot.web.servlet.ServletContextInitializer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import java.io.IOException import java.util.Collections.emptySet import javax.servlet.GenericServlet import javax.servlet.Servlet import javax.servlet.ServletContainerInitializer import javax.servlet.ServletContext import javax.servlet.ServletException import javax.servlet.ServletRequest import javax.servlet.ServletResponse import kotlin.jvm.Throws @Configuration(proxyBeanMethods = false) class MyCloudFoundryConfiguration { @Bean fun servletWebServerFactory(): TomcatServletWebServerFactory { return object : TomcatServletWebServerFactory() { override fun prepareContext(host: Host, initializers: Array<ServletContextInitializer>) { super.prepareContext(host, initializers) val child = StandardContext() child.addLifecycleListener(FixContextListener()) child.path = "/cloudfoundryapplication" val initializer = getServletContextInitializer(contextPath) child.addServletContainerInitializer(initializer, emptySet()) child.crossContext = true host.addChild(child) } } } private fun getServletContextInitializer(contextPath: String): ServletContainerInitializer { return ServletContainerInitializer { classes: Set<Class<*>?>?, context: ServletContext -> val servlet: Servlet = object : GenericServlet() { @Throws(ServletException::class, IOException::class) override fun service(req: ServletRequest, res: ServletResponse) { val servletContext = req.servletContext.getContext(contextPath) servletContext.getRequestDispatcher("/cloudfoundryapplication").forward(req, res) } } context.addServlet("cloudfoundry", servlet).addMapping("/*") } } } ``` 11. What to Read Next ---------------------- You might want to read about graphing tools such as [Graphite](https://graphiteapp.org). Otherwise, you can continue on to read about [“deployment options”](deployment#deployment) or jump ahead for some in-depth information about Spring Boot’s [build tool plugins](build-tool-plugins#build-tool-plugins).
spring_boot Spring Boot CLI Spring Boot CLI =============== The Spring Boot CLI is a command line tool that you can use if you want to quickly develop a Spring application. It lets you run Groovy scripts, which means that you have a familiar Java-like syntax without so much boilerplate code. You can also bootstrap a new project or write your own command for it. 1. Installing the CLI ---------------------- The Spring Boot CLI (Command-Line Interface) can be installed manually by using SDKMAN! (the SDK Manager) or by using Homebrew or MacPorts if you are an OSX user. See *[getting-started.html](getting-started#getting-started.installing.cli)* in the “Getting started” section for comprehensive installation instructions. 2. Using the CLI ----------------- Once you have installed the CLI, you can run it by typing `spring` and pressing Enter at the command line. If you run `spring` without any arguments, a help screen is displayed, as follows: ``` $ spring usage: spring [--help] [--version] <command> [<args>] Available commands are: run [options] <files> [--] [args] Run a spring groovy script _... more command help is shown here_ ``` You can type `spring help` to get more details about any of the supported commands, as shown in the following example: ``` $ spring help run spring run - Run a spring groovy script usage: spring run [options] <files> [--] [args] Option Description ------ ----------- --autoconfigure [Boolean] Add autoconfigure compiler transformations (default: true) --classpath, -cp Additional classpath entries --no-guess-dependencies Do not attempt to guess dependencies --no-guess-imports Do not attempt to guess imports -q, --quiet Quiet logging -v, --verbose Verbose logging of dependency resolution --watch Watch the specified file for changes ``` The `version` command provides a quick way to check which version of Spring Boot you are using, as follows: ``` $ spring version Spring CLI v2.7.0 ``` ### 2.1. Running Applications with the CLI You can compile and run Groovy source code by using the `run` command. The Spring Boot CLI is completely self-contained, so you do not need any external Groovy installation. The following example shows a “hello world” web application written in Groovy: hello.groovy ``` @RestController class WebApplication { @RequestMapping("/") String home() { "Hello World!" } } ``` To compile and run the application, type the following command: ``` $ spring run hello.groovy ``` To pass command-line arguments to the application, use `--` to separate the commands from the “spring” command arguments, as shown in the following example: ``` $ spring run hello.groovy -- --server.port=9000 ``` To set JVM command line arguments, you can use the `JAVA_OPTS` environment variable, as shown in the following example: ``` $ JAVA_OPTS=-Xmx1024m spring run hello.groovy ``` | | | | --- | --- | | | When setting `JAVA_OPTS` on Microsoft Windows, make sure to quote the entire instruction, such as `set "JAVA_OPTS=-Xms256m -Xmx2048m"`. Doing so ensures the values are properly passed to the process. | #### 2.1.1. Deduced “grab” Dependencies Standard Groovy includes a `@Grab` annotation, which lets you declare dependencies on third-party libraries. This useful technique lets Groovy download jars in the same way as Maven or Gradle would but without requiring you to use a build tool. Spring Boot extends this technique further and tries to deduce which libraries to “grab” based on your code. 
For example, since the `WebApplication` code shown previously uses `@RestController` annotations, Spring Boot grabs "Tomcat" and "Spring MVC". The following items are used as “grab hints”: | Items | Grabs | | --- | --- | | `JdbcTemplate`, `NamedParameterJdbcTemplate`, `DataSource` | JDBC Application. | | `@EnableJms` | JMS Application. | | `@EnableCaching` | Caching abstraction. | | `@Test` | JUnit. | | `@EnableRabbit` | RabbitMQ. | | extends `Specification` | Spock test. | | `@EnableBatchProcessing` | Spring Batch. | | `@MessageEndpoint` `@EnableIntegration` | Spring Integration. | | `@Controller` `@RestController` `@EnableWebMvc` | Spring MVC + Embedded Tomcat. | | `@EnableWebSecurity` | Spring Security. | | `@EnableTransactionManagement` | Spring Transaction Management. | | | | | --- | --- | | | See subclasses of [`CompilerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-cli/src/main/java/org/springframework/boot/cli/compiler/CompilerAutoConfiguration.java) in the Spring Boot CLI source code to understand exactly how customizations are applied. | #### 2.1.2. Deduced “grab” Coordinates Spring Boot extends Groovy’s standard `@Grab` support by letting you specify a dependency without a group or version (for example, `@Grab('freemarker')`). Doing so consults Spring Boot’s default dependency metadata to deduce the artifact’s group and version. | | | | --- | --- | | | The default metadata is tied to the version of the CLI that you use. It changes only when you move to a new version of the CLI, putting you in control of when the versions of your dependencies may change. A table showing the dependencies and their versions that are included in the default metadata can be found in the [appendix](dependency-versions#appendix.dependency-versions). | #### 2.1.3. Default Import Statements To help reduce the size of your Groovy code, several `import` statements are automatically included. Notice how the preceding example refers to `@RestController` and `@RequestMapping` without needing to use fully-qualified names or `import` statements. | | | | --- | --- | | | Many Spring annotations work without using `import` statements. Try running your application to see what fails before adding imports. | #### 2.1.4. Automatic Main Method Unlike the equivalent Java application, you do not need to include a `public static void main(String[] args)` method with your `Groovy` scripts. A `SpringApplication` is automatically created, with your compiled code acting as the `source`. #### 2.1.5. Custom Dependency Management By default, the CLI uses the dependency management declared in `spring-boot-dependencies` when resolving `@Grab` dependencies. Additional dependency management, which overrides the default dependency management, can be configured by using the `@DependencyManagementBom` annotation. The annotation’s value should specify the coordinates (`groupId:artifactId:version`) of one or more Maven BOMs. For example, consider the following declaration: ``` @DependencyManagementBom("com.example.custom-bom:1.0.0") ``` The preceding declaration picks up `custom-bom-1.0.0.pom` in a Maven repository under `com/example/custom-bom/1.0.0/`. 
When you specify multiple BOMs, they are applied in the order in which you declare them, as shown in the following example: ``` @DependencyManagementBom([ "com.example.custom-bom:1.0.0", "com.example.another-bom:1.0.0"]) ``` The preceding example indicates that the dependency management in `another-bom` overrides the dependency management in `custom-bom`. You can use `@DependencyManagementBom` anywhere that you can use `@Grab`. However, to ensure consistent ordering of the dependency management, you can use `@DependencyManagementBom` at most once in your application. ### 2.2. Applications with Multiple Source Files You can use “shell globbing” with all commands that accept file input. Doing so lets you use multiple files from a single directory, as shown in the following example: ``` $ spring run *.groovy ``` ### 2.3. Packaging Your Application You can use the `jar` command to package your application into a self-contained executable jar file, as shown in the following example: ``` $ spring jar my-app.jar *.groovy ``` The resulting jar contains the classes produced by compiling the application and all of the application’s dependencies so that it can then be run by using `java -jar`. The jar file also contains entries from the application’s classpath. You can add and remove explicit paths to the jar by using `--include` and `--exclude`. Both are comma-separated, and both accept prefixes, in the form of “+” and “-”, to signify that they should be removed from the defaults. The default includes are as follows: ``` public/**, resources/**, static/**, templates/**, META-INF/**, * ``` The default excludes are as follows: ``` .*, repository/**, build/**, target/**, **/*.jar, **/*.groovy ``` Type `spring help jar` on the command line for more information. ### 2.4. Initialize a New Project The `init` command lets you create a new project by using [start.spring.io](https://start.spring.io) without leaving the shell, as shown in the following example: ``` $ spring init --dependencies=web,data-jpa my-project Using service at https://start.spring.io Project extracted to '/Users/developer/example/my-project' ``` The preceding example creates a `my-project` directory with a Maven-based project that uses `spring-boot-starter-web` and `spring-boot-starter-data-jpa`. You can list the capabilities of the service by using the `--list` flag, as shown in the following example: ``` $ spring init --list ======================================= Capabilities of https://start.spring.io ======================================= Available dependencies: ----------------------- actuator - Actuator: Production ready features to help you monitor and manage your application ... web - Web: Support for full-stack web development, including Tomcat and spring-webmvc websocket - Websocket: Support for WebSocket development ws - WS: Support for Spring Web Services Available project types: ------------------------ gradle-build - Gradle Config [format:build, build:gradle] gradle-project - Gradle Project [format:project, build:gradle] maven-build - Maven POM [format:build, build:maven] maven-project - Maven Project [format:project, build:maven] (default) ... ``` The `init` command supports many options. See the `help` output for more details. For instance, the following command creates a Gradle project that uses Java 8 and `war` packaging: ``` $ spring init --build=gradle --java-version=1.8 --dependencies=websocket --packaging=war sample-app.zip Using service at https://start.spring.io Content saved to 'sample-app.zip' ``` ### 2.5. 
Using the Embedded Shell Spring Boot includes command-line completion scripts for the BASH and zsh shells. If you do not use either of these shells (perhaps you are a Windows user), you can use the `shell` command to launch an integrated shell, as shown in the following example: ``` $ spring shell **Spring Boot** (v2.7.0) Hit TAB to complete. Type 'help' and hit RETURN for help, and 'exit' to quit. ``` From inside the embedded shell, you can run other commands directly: ``` $ version Spring CLI v2.7.0 ``` The embedded shell supports ANSI color output as well as `tab` completion. If you need to run a native command, you can use the `!` prefix. To exit the embedded shell, press `ctrl-c`. ### 2.6. Adding Extensions to the CLI You can add extensions to the CLI by using the `install` command. The command takes one or more sets of artifact coordinates in the format `group:artifact:version`, as shown in the following example: ``` $ spring install com.example:spring-boot-cli-extension:1.0.0.RELEASE ``` In addition to installing the artifacts identified by the coordinates you supply, all of the artifacts' dependencies are also installed. To uninstall a dependency, use the `uninstall` command. As with the `install` command, it takes one or more sets of artifact coordinates in the format of `group:artifact:version`, as shown in the following example: ``` $ spring uninstall com.example:spring-boot-cli-extension:1.0.0.RELEASE ``` It uninstalls the artifacts identified by the coordinates you supply and their dependencies. To uninstall all additional dependencies, you can use the `--all` option, as shown in the following example: ``` $ spring uninstall --all ``` 3. Developing Applications with the Groovy Beans DSL ----------------------------------------------------- Spring Framework 4.0 has native support for a `beans{}` “DSL” (borrowed from [Grails](https://grails.org/)), and you can embed bean definitions in your Groovy application scripts by using the same format. This is sometimes a good way to include external features like middleware declarations, as shown in the following example: ``` @Configuration(proxyBeanMethods = false) class Application implements CommandLineRunner { @Autowired SharedService service @Override void run(String... args) { println service.message } } import my.company.SharedService beans { service(SharedService) { message = "Hello World" } } ``` You can mix class declarations with `beans{}` in the same file as long as they stay at the top level, or, if you prefer, you can put the beans DSL in a separate file. 4. Configuring the CLI with settings.xml ----------------------------------------- The Spring Boot CLI uses Maven Resolver, Maven’s dependency resolution engine, to resolve dependencies. The CLI makes use of the Maven configuration found in `~/.m2/settings.xml` to configure Maven Resolver. The following configuration settings are honored by the CLI: * Offline * Mirrors * Servers * Proxies * Profiles + Activation + Repositories * Active profiles See [Maven’s settings documentation](https://maven.apache.org/settings.html) for further information. 5. What to Read Next --------------------- There are some [sample groovy scripts](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-cli/samples) available from the GitHub repository that you can use to try out the Spring Boot CLI. 
There is also extensive Javadoc throughout the [source code](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-cli/src/main/java/org/springframework/boot/cli). If you find that you reach the limit of the CLI tool, you probably want to look at converting your application to a full Gradle or Maven built “Groovy project”. The next section covers Spring Boot’s "[Build tool plugins](build-tool-plugins#build-tool-plugins)", which you can use with Gradle or Maven. spring_boot Getting Started Getting Started =============== If you are getting started with Spring Boot, or “Spring” in general, start by reading this section. It answers the basic “what?”, “how?” and “why?” questions. It includes an introduction to Spring Boot, along with installation instructions. We then walk you through building your first Spring Boot application, discussing some core principles as we go. 1. Introducing Spring Boot --------------------------- Spring Boot helps you to create stand-alone, production-grade Spring-based applications that you can run. We take an opinionated view of the Spring platform and third-party libraries, so that you can get started with minimum fuss. Most Spring Boot applications need very little Spring configuration. You can use Spring Boot to create Java applications that can be started by using `java -jar` or more traditional war deployments. We also provide a command line tool that runs “spring scripts”. Our primary goals are: * Provide a radically faster and widely accessible getting-started experience for all Spring development. * Be opinionated out of the box but get out of the way quickly as requirements start to diverge from the defaults. * Provide a range of non-functional features that are common to large classes of projects (such as embedded servers, security, metrics, health checks, and externalized configuration). * Absolutely no code generation and no requirement for XML configuration. 2. System Requirements ----------------------- Spring Boot 2.7.0 requires [Java 8](https://www.java.com) and is compatible up to and including Java 18. [Spring Framework 5.3.20](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/) or above is also required. Explicit build support is provided for the following build tools: | Build Tool | Version | | --- | --- | | Maven | 3.5+ | | Gradle | 6.8.x, 6.9.x, and 7.x | ### 2.1. Servlet Containers Spring Boot supports the following embedded servlet containers: | Name | Servlet Version | | --- | --- | | Tomcat 9.0 | 4.0 | | Jetty 9.4 | 3.1 | | Jetty 10.0 | 4.0 | | Undertow 2.0 | 4.0 | You can also deploy Spring Boot applications to any servlet 3.1+ compatible container. 3. Installing Spring Boot -------------------------- Spring Boot can be used with “classic” Java development tools or installed as a command line tool. Either way, you need [Java SDK v1.8](https://www.java.com) or higher. Before you begin, you should check your current Java installation by using the following command: ``` $ java -version ``` If you are new to Java development or if you want to experiment with Spring Boot, you might want to try the [Spring Boot CLI](#getting-started.installing.cli) (Command Line Interface) first. Otherwise, read on for “classic” installation instructions. ### 3.1. Installation Instructions for the Java Developer You can use Spring Boot in the same way as any standard Java library. To do so, include the appropriate `spring-boot-*.jar` files on your classpath. 
Spring Boot does not require any special tools integration, so you can use any IDE or text editor. Also, there is nothing special about a Spring Boot application, so you can run and debug a Spring Boot application as you would any other Java program. Although you *could* copy Spring Boot jars, we generally recommend that you use a build tool that supports dependency management (such as Maven or Gradle). #### 3.1.1. Maven Installation Spring Boot is compatible with Apache Maven 3.3 or above. If you do not already have Maven installed, you can follow the instructions at [maven.apache.org](https://maven.apache.org). | | | | --- | --- | | | On many operating systems, Maven can be installed with a package manager. If you use OSX Homebrew, try `brew install maven`. Ubuntu users can run `sudo apt-get install maven`. Windows users with [Chocolatey](https://chocolatey.org/) can run `choco install maven` from an elevated (administrator) prompt. | Spring Boot dependencies use the `org.springframework.boot` `groupId`. Typically, your Maven POM file inherits from the `spring-boot-starter-parent` project and declares dependencies to one or more [“Starters”](using#using.build-systems.starters). Spring Boot also provides an optional [Maven plugin](build-tool-plugins#build-tool-plugins.maven) to create executable jars. More details on getting started with Spring Boot and Maven can be found in the [Getting Started section](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#getting-started) of the Maven plugin’s reference guide. #### 3.1.2. Gradle Installation Spring Boot is compatible with Gradle 6.8, 6.9, and 7.x. If you do not already have Gradle installed, you can follow the instructions at [gradle.org](https://gradle.org). Spring Boot dependencies can be declared by using the `org.springframework.boot` `group`. Typically, your project declares dependencies to one or more [“Starters”](using#using.build-systems.starters). Spring Boot provides a useful [Gradle plugin](build-tool-plugins#build-tool-plugins.gradle) that can be used to simplify dependency declarations and to create executable jars. Gradle Wrapper The Gradle Wrapper provides a nice way of “obtaining” Gradle when you need to build a project. It is a small script and library that you commit alongside your code to bootstrap the build process. See [docs.gradle.org/current/userguide/gradle\_wrapper.html](https://docs.gradle.org/current/userguide/gradle_wrapper.html) for details. More details on getting started with Spring Boot and Gradle can be found in the [Getting Started section](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/#getting-started) of the Gradle plugin’s reference guide. ### 3.2. Installing the Spring Boot CLI The Spring Boot CLI (Command Line Interface) is a command line tool that you can use to quickly prototype with Spring. It lets you run [Groovy](https://groovy-lang.org/) scripts, which means that you have a familiar Java-like syntax without so much boilerplate code. You do not need to use the CLI to work with Spring Boot, but it is a quick way to get a Spring application off the ground without an IDE. #### 3.2.1. 
Manual Installation You can download the Spring CLI distribution from the Spring software repository: * [spring-boot-cli-2.7.0-bin.zip](https://repo.spring.io/release/org/springframework/boot/spring-boot-cli/2.7.0/spring-boot-cli-2.7.0-bin.zip) * [spring-boot-cli-2.7.0-bin.tar.gz](https://repo.spring.io/release/org/springframework/boot/spring-boot-cli/2.7.0/spring-boot-cli-2.7.0-bin.tar.gz) Cutting edge [snapshot distributions](https://repo.spring.io/snapshot/org/springframework/boot/spring-boot-cli/) are also available. Once downloaded, follow the [INSTALL.txt](https://raw.githubusercontent.com/spring-projects/spring-boot/v2.7.0/spring-boot-project/spring-boot-cli/src/main/content/INSTALL.txt) instructions from the unpacked archive. In summary, there is a `spring` script (`spring.bat` for Windows) in a `bin/` directory in the `.zip` file. Alternatively, you can use `java -jar` with the `.jar` file (the script helps you to be sure that the classpath is set correctly). #### 3.2.2. Installation with SDKMAN! SDKMAN! (The Software Development Kit Manager) can be used for managing multiple versions of various binary SDKs, including Groovy and the Spring Boot CLI. Get SDKMAN! from [sdkman.io](https://sdkman.io) and install Spring Boot by using the following commands: ``` $ sdk install springboot $ spring --version Spring CLI v2.7.0 ``` If you develop features for the CLI and want access to the version you built, use the following commands: ``` $ sdk install springboot dev /path/to/spring-boot/spring-boot-cli/target/spring-boot-cli-2.7.0-bin/spring-2.7.0/ $ sdk default springboot dev $ spring --version Spring CLI v2.7.0 ``` The preceding instructions install a local instance of `spring` called the `dev` instance. It points at your target build location, so every time you rebuild Spring Boot, `spring` is up-to-date. You can see it by running the following command: ``` $ sdk ls springboot ================================================================================ Available Springboot Versions ================================================================================ > + dev * 2.7.0 ================================================================================ + - local version * - installed > - currently in use ================================================================================ ``` #### 3.2.3. OSX Homebrew Installation If you are on a Mac and use [Homebrew](https://brew.sh/), you can install the Spring Boot CLI by using the following commands: ``` $ brew tap spring-io/tap $ brew install spring-boot ``` Homebrew installs `spring` to `/usr/local/bin`. | | | | --- | --- | | | If you do not see the formula, your installation of brew might be out-of-date. In that case, run `brew update` and try again. | #### 3.2.4. MacPorts Installation If you are on a Mac and use [MacPorts](https://www.macports.org/), you can install the Spring Boot CLI by using the following command: ``` $ sudo port install spring-boot-cli ``` #### 3.2.5. Command-line Completion The Spring Boot CLI includes scripts that provide command completion for the [BASH](https://en.wikipedia.org/wiki/Bash_%28Unix_shell%29) and [zsh](https://en.wikipedia.org/wiki/Z_shell) shells. You can `source` the script (also named `spring`) in any shell or put it in your personal or system-wide bash completion initialization. On a Debian system, the system-wide scripts are in `/shell-completion/bash` and all scripts in that directory are executed when a new shell starts. 
For example, to run the script manually if you have installed by using SDKMAN!, use the following commands: ``` $ . ~/.sdkman/candidates/springboot/current/shell-completion/bash/spring $ spring <HIT TAB HERE> grab help jar run test version ``` | | | | --- | --- | | | If you install the Spring Boot CLI by using Homebrew or MacPorts, the command-line completion scripts are automatically registered with your shell. | #### 3.2.6. Windows Scoop Installation If you are on a Windows and use [Scoop](https://scoop.sh/), you can install the Spring Boot CLI by using the following commands: ``` > scoop bucket add extras > scoop install springboot ``` Scoop installs `spring` to `~/scoop/apps/springboot/current/bin`. | | | | --- | --- | | | If you do not see the app manifest, your installation of scoop might be out-of-date. In that case, run `scoop update` and try again. | #### 3.2.7. Quick-start Spring CLI Example You can use the following web application to test your installation. To start, create a file called `app.groovy`, as follows: ``` @RestController class ThisWillActuallyRun { @RequestMapping("/") String home() { "Hello World!" } } ``` Then run it from a shell, as follows: ``` $ spring run app.groovy ``` | | | | --- | --- | | | The first run of your application is slow, as dependencies are downloaded. Subsequent runs are much quicker. | Open `[localhost:8080](http://localhost:8080)` in your favorite web browser. You should see the following output: ``` Hello World! ``` 4. Developing Your First Spring Boot Application ------------------------------------------------- This section describes how to develop a small “Hello World!” web application that highlights some of Spring Boot’s key features. We use Maven to build this project, since most IDEs support it. | | | | --- | --- | | | The [spring.io](https://spring.io) web site contains many “Getting Started” [guides](https://spring.io/guides) that use Spring Boot. If you need to solve a specific problem, check there first. You can shortcut the steps below by going to [start.spring.io](https://start.spring.io) and choosing the "Web" starter from the dependencies searcher. Doing so generates a new project structure so that you can [start coding right away](#getting-started.first-application.code). Check the [start.spring.io user guide](https://github.com/spring-io/start.spring.io/blob/main/USING.adoc) for more details. | Before we begin, open a terminal and run the following commands to ensure that you have valid versions of Java and Maven installed: ``` $ java -version java version "1.8.0_102" Java(TM) SE Runtime Environment (build 1.8.0_102-b14) Java HotSpot(TM) 64-Bit Server VM (build 25.102-b14, mixed mode) ``` ``` $ mvn -v Apache Maven 3.5.4 (1edded0938998edf8bf061f1ceb3cfdeccf443fe; 2018-06-17T14:33:14-04:00) Maven home: /usr/local/Cellar/maven/3.3.9/libexec Java version: 1.8.0_102, vendor: Oracle Corporation ``` | | | | --- | --- | | | This sample needs to be created in its own directory. Subsequent instructions assume that you have created a suitable directory and that it is your current directory. | ### 4.1. Creating the POM We need to start by creating a Maven `pom.xml` file. The `pom.xml` is the recipe that is used to build your project. 
Open your favorite text editor and add the following: ``` <?xml version="1.0" encoding="UTF-8"?> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>com.example</groupId> <artifactId>myproject</artifactId> <version>0.0.1-SNAPSHOT</version> <parent> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-parent</artifactId> <version>2.7.0</version> </parent> <!-- Additional lines to be added here... --> </project> ``` The preceding listing should give you a working build. You can test it by running `mvn package` (for now, you can ignore the “jar will be empty - no content was marked for inclusion!” warning). | | | | --- | --- | | | At this point, you could import the project into an IDE (most modern Java IDEs include built-in support for Maven). For simplicity, we continue to use a plain text editor for this example. | ### 4.2. Adding Classpath Dependencies Spring Boot provides a number of “Starters” that let you add jars to your classpath. Our applications for smoke tests use the `spring-boot-starter-parent` in the `parent` section of the POM. The `spring-boot-starter-parent` is a special starter that provides useful Maven defaults. It also provides a [`dependency-management`](using#using.build-systems.dependency-management) section so that you can omit `version` tags for “blessed” dependencies. Other “Starters” provide dependencies that you are likely to need when developing a specific type of application. Since we are developing a web application, we add a `spring-boot-starter-web` dependency. Before that, we can look at what we currently have by running the following command: ``` $ mvn dependency:tree [INFO] com.example:myproject:jar:0.0.1-SNAPSHOT ``` The `mvn dependency:tree` command prints a tree representation of your project dependencies. You can see that `spring-boot-starter-parent` provides no dependencies by itself. To add the necessary dependencies, edit your `pom.xml` and add the `spring-boot-starter-web` dependency immediately below the `parent` section: ``` <dependencies> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-web</artifactId> </dependency> </dependencies> ``` If you run `mvn dependency:tree` again, you see that there are now a number of additional dependencies, including the Tomcat web server and Spring Boot itself. ### 4.3. Writing the Code To finish our application, we need to create a single Java file. 
By default, Maven compiles sources from `src/main/java`, so you need to create that directory structure and then add a file named `src/main/java/MyApplication.java` to contain the following code: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.EnableAutoConfiguration; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RestController; @RestController @EnableAutoConfiguration public class MyApplication { @RequestMapping("/") String home() { return "Hello World!"; } public static void main(String[] args) { SpringApplication.run(MyApplication.class, args); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.EnableAutoConfiguration import org.springframework.boot.runApplication import org.springframework.web.bind.annotation.RequestMapping import org.springframework.web.bind.annotation.RestController @RestController @EnableAutoConfiguration class MyApplication { @RequestMapping("/") fun home() = "Hello World!" } fun main(args: Array<String>) { runApplication<MyApplication>(*args) } ``` Although there is not much code here, quite a lot is going on. We step through the important parts in the next few sections. #### 4.3.1. The @RestController and @RequestMapping Annotations The first annotation on our `MyApplication` class is `@RestController`. This is known as a *stereotype* annotation. It provides hints for people reading the code and for Spring that the class plays a specific role. In this case, our class is a web `@Controller`, so Spring considers it when handling incoming web requests. The `@RequestMapping` annotation provides “routing” information. It tells Spring that any HTTP request with the `/` path should be mapped to the `home` method. The `@RestController` annotation tells Spring to render the resulting string directly back to the caller. | | | | --- | --- | | | The `@RestController` and `@RequestMapping` annotations are Spring MVC annotations (they are not specific to Spring Boot). See the [MVC section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#mvc) in the Spring Reference Documentation for more details. | #### 4.3.2. The @EnableAutoConfiguration Annotation The second class-level annotation is `@EnableAutoConfiguration`. This annotation tells Spring Boot to “guess” how you want to configure Spring, based on the jar dependencies that you have added. Since `spring-boot-starter-web` added Tomcat and Spring MVC, the auto-configuration assumes that you are developing a web application and sets up Spring accordingly. Starters and Auto-configuration Auto-configuration is designed to work well with “Starters”, but the two concepts are not directly tied. You are free to pick and choose jar dependencies outside of the starters. Spring Boot still does its best to auto-configure your application. #### 4.3.3. The “main” Method The final part of our application is the `main` method. This is a standard method that follows the Java convention for an application entry point. Our main method delegates to Spring Boot’s `SpringApplication` class by calling `run`. `SpringApplication` bootstraps our application, starting Spring, which, in turn, starts the auto-configured Tomcat web server. We need to pass `MyApplication.class` as an argument to the `run` method to tell `SpringApplication` which is the primary Spring component. The `args` array is also passed through to expose any command-line arguments. ### 4.4. 
Running the Example At this point, your application should work. Since you used the `spring-boot-starter-parent` POM, you have a useful `run` goal that you can use to start the application. Type `mvn spring-boot:run` from the root project directory to start the application. You should see output similar to the following: ``` $ mvn spring-boot:run . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v2.7.0) ....... . . . ....... . . . (log output here) ....... . . . ........ Started MyApplication in 2.222 seconds (JVM running for 6.514) ``` If you open a web browser to `[localhost:8080](http://localhost:8080)`, you should see the following output: ``` Hello World! ``` To gracefully exit the application, press `ctrl-c`. ### 4.5. Creating an Executable Jar We finish our example by creating a completely self-contained executable jar file that we could run in production. Executable jars (sometimes called “fat jars”) are archives containing your compiled classes along with all of the jar dependencies that your code needs to run. Executable jars and Java Java does not provide a standard way to load nested jar files (jar files that are themselves contained within a jar). This can be problematic if you are looking to distribute a self-contained application. To solve this problem, many developers use “uber” jars. An uber jar packages all the classes from all the application’s dependencies into a single archive. The problem with this approach is that it becomes hard to see which libraries are in your application. It can also be problematic if the same filename is used (but with different content) in multiple jars. Spring Boot takes a [different approach](executable-jar#appendix.executable-jar) and lets you actually nest jars directly. To create an executable jar, we need to add the `spring-boot-maven-plugin` to our `pom.xml`. To do so, insert the following lines just below the `dependencies` section: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> </plugin> </plugins> </build> ``` | | | | --- | --- | | | The `spring-boot-starter-parent` POM includes `<executions>` configuration to bind the `repackage` goal. If you do not use the parent POM, you need to declare this configuration yourself. See the [plugin documentation](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#getting-started) for details. | Save your `pom.xml` and run `mvn package` from the command line, as follows: ``` $ mvn package [INFO] Scanning for projects... [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building myproject 0.0.1-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] .... .. [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ myproject --- [INFO] Building jar: /Users/developer/example/spring-boot-example/target/myproject-0.0.1-SNAPSHOT.jar [INFO] [INFO] --- spring-boot-maven-plugin:2.7.0:repackage (default) @ myproject --- [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------ ``` If you look in the `target` directory, you should see `myproject-0.0.1-SNAPSHOT.jar`. The file should be around 10 MB in size. 
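You can confirm the size from the command line; for example, on a Unix-like system:

```
$ ls -lh target/myproject-0.0.1-SNAPSHOT.jar
```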
If you want to peek inside, you can use `jar tvf`, as follows: ``` $ jar tvf target/myproject-0.0.1-SNAPSHOT.jar ``` You should also see a much smaller file named `myproject-0.0.1-SNAPSHOT.jar.original` in the `target` directory. This is the original jar file that Maven created before it was repackaged by Spring Boot. To run that application, use the `java -jar` command, as follows: ``` $ java -jar target/myproject-0.0.1-SNAPSHOT.jar . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v2.7.0) ....... . . . ....... . . . (log output here) ....... . . . ........ Started MyApplication in 2.536 seconds (JVM running for 2.864) ``` As before, to exit the application, press `ctrl-c`. 5. What to Read Next --------------------- Hopefully, this section provided some of the Spring Boot basics and got you on your way to writing your own applications. If you are a task-oriented type of developer, you might want to jump over to [spring.io](https://spring.io) and follow some of the [getting started](https://spring.io/guides/) guides that solve specific “How do I do that with Spring?” problems. We also have Spring Boot-specific “[How-to](howto#howto)” reference documentation. Otherwise, the next logical step is to read *[using.html](using#using)*. If you are really impatient, you could also jump ahead and read about *[Spring Boot features](features#features)*.

spring_boot Container Images Container Images ================ Spring Boot applications can be containerized [using Dockerfiles](#container-images.dockerfiles), or by [using Cloud Native Buildpacks to create optimized docker compatible container images that you can run anywhere](#container-images.buildpacks). 1. Efficient container images ------------------------------ It is easily possible to package a Spring Boot fat jar as a docker image. However, there are various downsides to copying and running the fat jar as is in the docker image. There’s always a certain amount of overhead when running a fat jar without unpacking it, and in a containerized environment this can be noticeable. The other issue is that putting your application’s code and all its dependencies in one layer in the Docker image is sub-optimal. Since you probably recompile your code more often than you upgrade the version of Spring Boot you use, it’s often better to separate things a bit more. If you put jar files in the layer before your application classes, Docker often only needs to change the very bottom layer and can pick others up from its cache. ### 1.1. Unpacking the fat jar If you are running your application from a container, you can use an executable jar, but it is also often an advantage to explode it and run it in a different way. Certain PaaS implementations may also choose to unpack archives before they run. For example, Cloud Foundry operates this way. One way to run an unpacked archive is by starting the appropriate launcher, as follows: ``` $ jar -xf myapp.jar $ java org.springframework.boot.loader.JarLauncher ``` This is actually slightly faster on startup (depending on the size of the jar) than running from an unexploded archive. At runtime you should not expect any differences. Once you have unpacked the jar file, you can also get an extra boost to startup time by running the app with its "natural" main method instead of the `JarLauncher`. For example: ``` $ jar -xf myapp.jar $ java -cp BOOT-INF/classes:BOOT-INF/lib/* com.example.MyApplication ``` | | | | --- | --- | | | Using the `JarLauncher` over the application’s main method has the added benefit of a predictable classpath order. The jar contains a `classpath.idx` file which is used by the `JarLauncher` when constructing the classpath. | ### 1.2. Layering Docker Images To make it easier to create optimized Docker images, Spring Boot supports adding a layer index file to the jar. It provides a list of layers and the parts of the jar that should be contained within them. The list of layers in the index is ordered based on the order in which the layers should be added to the Docker/OCI image. Out-of-the-box, the following layers are supported: * `dependencies` (for regular released dependencies) * `spring-boot-loader` (for everything under `org/springframework/boot/loader`) * `snapshot-dependencies` (for snapshot dependencies) * `application` (for application classes and resources) The following shows an example of a `layers.idx` file: ``` - "dependencies": - BOOT-INF/lib/library1.jar - BOOT-INF/lib/library2.jar - "spring-boot-loader": - org/springframework/boot/loader/JarLauncher.class - org/springframework/boot/loader/jar/JarEntry.class - "snapshot-dependencies": - BOOT-INF/lib/library3-SNAPSHOT.jar - "application": - META-INF/MANIFEST.MF - BOOT-INF/classes/a/b/C.class ``` This layering is designed to separate code based on how likely it is to change between application builds. 
Library code is less likely to change between builds, so it is placed in its own layers to allow tooling to re-use the layers from cache. Application code is more likely to change between builds so it is isolated in a separate layer. Spring Boot also supports layering for war files with the help of a `layers.idx`. For Maven, see the [packaging layered jar or war section](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#repackage-layers) for more details on adding a layer index to the archive. For Gradle, see the [packaging layered jar or war section](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/#packaging-layered-archives) of the Gradle plugin documentation. 2. Dockerfiles --------------- While it is possible to convert a Spring Boot fat jar into a docker image with just a few lines in the Dockerfile, we will use the [layering feature](#container-images.efficient-images.layering) to create an optimized docker image. When you create a jar containing the layers index file, the `spring-boot-jarmode-layertools` jar will be added as a dependency to your jar. With this jar on the classpath, you can launch your application in a special mode which allows the bootstrap code to run something entirely different from your application, for example, something that extracts the layers. | | | | --- | --- | | | The `layertools` mode can not be used with a [fully executable Spring Boot archive](deployment#deployment.installing) that includes a launch script. Disable launch script configuration when building a jar file that is intended to be used with `layertools`. | Here’s how you can launch your jar with a `layertools` jar mode: ``` $ java -Djarmode=layertools -jar my-app.jar ``` This will provide the following output: ``` Usage: java -Djarmode=layertools -jar my-app.jar Available commands: list List layers from the jar that can be extracted extract Extracts layers from the jar for image creation help Help about any command ``` The `extract` command can be used to easily split the application into layers to be added to the dockerfile. Here is an example of a Dockerfile using `jarmode`. ``` FROM eclipse-temurin:11-jre as builder WORKDIR application ARG JAR_FILE=target/*.jar COPY ${JAR_FILE} application.jar RUN java -Djarmode=layertools -jar application.jar extract FROM eclipse-temurin:11-jre WORKDIR application COPY --from=builder application/dependencies/ ./ COPY --from=builder application/spring-boot-loader/ ./ COPY --from=builder application/snapshot-dependencies/ ./ COPY --from=builder application/application/ ./ ENTRYPOINT ["java", "org.springframework.boot.loader.JarLauncher"] ``` Assuming the above `Dockerfile` is in the current directory, your docker image can be built with `docker build .`, or optionally specifying the path to your application jar, as shown in the following example: ``` $ docker build --build-arg JAR_FILE=path/to/myapp.jar . ``` This is a multi-stage dockerfile. The builder stage extracts the directories that are needed later. Each of the `COPY` commands relates to the layers extracted by the jarmode. Of course, a Dockerfile can be written without using the jarmode. You can use some combination of `unzip` and `mv` to move things to the right layer but jarmode simplifies that. 3. Cloud Native Buildpacks --------------------------- Dockerfiles are just one way to build docker images. Another way to build docker images is directly from your Maven or Gradle plugin, using buildpacks. 
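As a quick sketch (assuming the Spring Boot build plugin is applied and a local Docker daemon is available), building an image this way is a single command; with Maven, for example:

```
$ mvn spring-boot:build-image
```

The equivalent Gradle task is `bootBuildImage`.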
If you’ve ever used an application platform such as Cloud Foundry or Heroku then you’ve probably used a buildpack. Buildpacks are the part of the platform that takes your application and converts it into something that the platform can actually run. For example, Cloud Foundry’s Java buildpack will notice that you’re pushing a `.jar` file and automatically add a relevant JRE. With Cloud Native Buildpacks, you can create Docker compatible images that you can run anywhere. Spring Boot includes buildpack support directly for both Maven and Gradle. This means you can just type a single command and quickly get a sensible image into your locally running Docker daemon. See the individual plugin documentation on how to use buildpacks with [Maven](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#build-image) and [Gradle](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/#build-image). | | | | --- | --- | | | The [Paketo Spring Boot buildpack](https://github.com/paketo-buildpacks/spring-boot) has also been updated to support the `layers.idx` file so any customization that is applied to it will be reflected in the image created by the buildpack. | | | | | --- | --- | | | In order to achieve reproducible builds and container image caching, Buildpacks can manipulate the application resources metadata (such as the file "last modified" information). You should ensure that your application does not rely on that metadata at runtime. Spring Boot can use that information when serving static resources, but this can be disabled with `spring.web.resources.cache.use-last-modified` | 4. What to Read Next --------------------- Once you’ve learned how to build efficient container images, you can read about [deploying applications to a cloud platform](deployment#deployment.cloud.kubernetes), such as Kubernetes. spring_boot “How-to” Guides “How-to” Guides =============== This section provides answers to some common ‘how do I do that…​’ questions that often arise when using Spring Boot. Its coverage is not exhaustive, but it does cover quite a lot. If you have a specific problem that we do not cover here, you might want to check [stackoverflow.com](https://stackoverflow.com/tags/spring-boot) to see if someone has already provided an answer. This is also a great place to ask new questions (please use the `spring-boot` tag). We are also more than happy to extend this section. If you want to add a ‘how-to’, send us a [pull request](https://github.com/spring-projects/spring-boot/tree/v2.7.0). 1. Spring Boot Application --------------------------- This section includes topics relating directly to Spring Boot applications. ### 1.1. Create Your Own FailureAnalyzer [`FailureAnalyzer`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/diagnostics/FailureAnalyzer.html) is a great way to intercept an exception on startup and turn it into a human-readable message, wrapped in a [`FailureAnalysis`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/diagnostics/FailureAnalysis.html). Spring Boot provides such an analyzer for application-context-related exceptions, JSR-303 validations, and more. You can also create your own. `AbstractFailureAnalyzer` is a convenient extension of `FailureAnalyzer` that checks the presence of a specified exception type in the exception to handle. You can extend from that so that your implementation gets a chance to handle the exception only when it is actually present. 
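As a sketch, assuming a hypothetical `MyMisconfigurationException` thrown by your own code, such an analyzer could look like this:

Java

```
import org.springframework.boot.diagnostics.AbstractFailureAnalyzer;
import org.springframework.boot.diagnostics.FailureAnalysis;

public class MyMisconfigurationFailureAnalyzer extends AbstractFailureAnalyzer<MyMisconfigurationException> {

    @Override
    protected FailureAnalysis analyze(Throwable rootFailure, MyMisconfigurationException cause) {
        // MyMisconfigurationException is a hypothetical exception defined in your own code.
        // Describe what went wrong and suggest an action; the original cause is kept for reference.
        return new FailureAnalysis("My component is not configured correctly: " + cause.getMessage(),
                "Review the relevant configuration properties and correct them.", cause);
    }

}
```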
If, for whatever reason, you cannot handle the exception, return `null` to give another implementation a chance to handle the exception. `FailureAnalyzer` implementations must be registered in `META-INF/spring.factories`. The following example registers `ProjectConstraintViolationFailureAnalyzer`: ``` org.springframework.boot.diagnostics.FailureAnalyzer=\ com.example.ProjectConstraintViolationFailureAnalyzer ``` | | | | --- | --- | | | If you need access to the `BeanFactory` or the `Environment`, your `FailureAnalyzer` can implement `BeanFactoryAware` or `EnvironmentAware` respectively. | ### 1.2. Troubleshoot Auto-configuration The Spring Boot auto-configuration tries its best to “do the right thing”, but sometimes things fail, and it can be hard to tell why. There is a really useful `ConditionEvaluationReport` available in any Spring Boot `ApplicationContext`. You can see it if you enable `DEBUG` logging output. If you use the `spring-boot-actuator` (see [the Actuator chapter](actuator#actuator)), there is also a `conditions` endpoint that renders the report in JSON. Use that endpoint to debug the application and see what features have been added (and which have not been added) by Spring Boot at runtime. Many more questions can be answered by looking at the source code and the Javadoc. When reading the code, remember the following rules of thumb: * Look for classes called `*AutoConfiguration` and read their sources. Pay special attention to the `@Conditional*` annotations to find out what features they enable and when. Add `--debug` to the command line or a System property `-Ddebug` to get a log on the console of all the auto-configuration decisions that were made in your app. In a running application with actuator enabled, look at the `conditions` endpoint (`/actuator/conditions` or the JMX equivalent) for the same information. * Look for classes that are `@ConfigurationProperties` (such as [`ServerProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/ServerProperties.java)) and read from there the available external configuration options. The `@ConfigurationProperties` annotation has a `name` attribute that acts as a prefix to external properties. Thus, `ServerProperties` has `prefix="server"` and its configuration properties are `server.port`, `server.address`, and others. In a running application with actuator enabled, look at the `configprops` endpoint. * Look for uses of the `bind` method on the `Binder` to pull configuration values explicitly out of the `Environment` in a relaxed manner. It is often used with a prefix. * Look for `@Value` annotations that bind directly to the `Environment`. * Look for `@ConditionalOnExpression` annotations that switch features on and off in response to SpEL expressions, normally evaluated with placeholders resolved from the `Environment`. ### 1.3. Customize the Environment or ApplicationContext Before It Starts A `SpringApplication` has `ApplicationListeners` and `ApplicationContextInitializers` that are used to apply customizations to the context or environment. Spring Boot loads a number of such customizations for use internally from `META-INF/spring.factories`. There is more than one way to register additional customizations: * Programmatically, per application, by calling the `addListeners` and `addInitializers` methods on `SpringApplication` before you run it. 
* Declaratively, per application, by setting the `context.initializer.classes` or `context.listener.classes` properties. * Declaratively, for all applications, by adding a `META-INF/spring.factories` and packaging a jar file that the applications all use as a library. The `SpringApplication` sends some special `ApplicationEvents` to the listeners (some even before the context is created) and then registers the listeners for events published by the `ApplicationContext` as well. See “[Application Events and Listeners](features#features.spring-application.application-events-and-listeners)” in the ‘Spring Boot features’ section for a complete list. It is also possible to customize the `Environment` before the application context is refreshed by using `EnvironmentPostProcessor`. Each implementation should be registered in `META-INF/spring.factories`, as shown in the following example: ``` org.springframework.boot.env.EnvironmentPostProcessor=com.example.YourEnvironmentPostProcessor ``` The implementation can load arbitrary files and add them to the `Environment`. For instance, the following example loads a YAML configuration file from the classpath: Java ``` import java.io.IOException; import org.springframework.boot.SpringApplication; import org.springframework.boot.env.EnvironmentPostProcessor; import org.springframework.boot.env.YamlPropertySourceLoader; import org.springframework.core.env.ConfigurableEnvironment; import org.springframework.core.env.PropertySource; import org.springframework.core.io.ClassPathResource; import org.springframework.core.io.Resource; import org.springframework.util.Assert; public class MyEnvironmentPostProcessor implements EnvironmentPostProcessor { private final YamlPropertySourceLoader loader = new YamlPropertySourceLoader(); @Override public void postProcessEnvironment(ConfigurableEnvironment environment, SpringApplication application) { Resource path = new ClassPathResource("com/example/myapp/config.yml"); PropertySource<?> propertySource = loadYaml(path); environment.getPropertySources().addLast(propertySource); } private PropertySource<?> loadYaml(Resource path) { Assert.isTrue(path.exists(), () -> "Resource " + path + " does not exist"); try { return this.loader.load("custom-resource", path).get(0); } catch (IOException ex) { throw new IllegalStateException("Failed to load yaml configuration from " + path, ex); } } } ``` Kotlin ``` import org.springframework.boot.SpringApplication import org.springframework.boot.env.EnvironmentPostProcessor import org.springframework.boot.env.YamlPropertySourceLoader import org.springframework.core.env.ConfigurableEnvironment import org.springframework.core.env.PropertySource import org.springframework.core.io.ClassPathResource import org.springframework.core.io.Resource import org.springframework.util.Assert import java.io.IOException class MyEnvironmentPostProcessor : EnvironmentPostProcessor { private val loader = YamlPropertySourceLoader() override fun postProcessEnvironment(environment: ConfigurableEnvironment, application: SpringApplication) { val path: Resource = ClassPathResource("com/example/myapp/config.yml") val propertySource = loadYaml(path) environment.propertySources.addLast(propertySource) } private fun loadYaml(path: Resource): PropertySource<\*> { Assert.isTrue(path.exists()) { "Resource $path does not exist" } return try { loader.load("custom-resource", path)[0] } catch (ex: IOException) { throw IllegalStateException("Failed to load yaml configuration from $path", ex) } } } ``` | | | | --- | --- | | | 
The `Environment` has already been prepared with all the usual property sources that Spring Boot loads by default. It is therefore possible to get the location of the file from the environment. The preceding example adds the `custom-resource` property source at the end of the list so that a key defined in any of the usual other locations takes precedence. A custom implementation may define another order. | | | | | --- | --- | | | While using `@PropertySource` on your `@SpringBootApplication` may seem to be a convenient way to load a custom resource in the `Environment`, we do not recommend it. Such property sources are not added to the `Environment` until the application context is being refreshed. This is too late to configure certain properties such as `logging.*` and `spring.main.*` which are read before refresh begins. | ### 1.4. Build an ApplicationContext Hierarchy (Adding a Parent or Root Context) You can use the `ApplicationBuilder` class to create parent/child `ApplicationContext` hierarchies. See “[features.html](features#features.spring-application.fluent-builder-api)” in the ‘Spring Boot features’ section for more information. ### 1.5. Create a Non-web Application Not all Spring applications have to be web applications (or web services). If you want to execute some code in a `main` method but also bootstrap a Spring application to set up the infrastructure to use, you can use the `SpringApplication` features of Spring Boot. A `SpringApplication` changes its `ApplicationContext` class, depending on whether it thinks it needs a web application or not. The first thing you can do to help it is to leave server-related dependencies (such as the servlet API) off the classpath. If you cannot do that (for example, you run two applications from the same code base) then you can explicitly call `setWebApplicationType(WebApplicationType.NONE)` on your `SpringApplication` instance or set the `applicationContextClass` property (through the Java API or with external properties). Application code that you want to run as your business logic can be implemented as a `CommandLineRunner` and dropped into the context as a `@Bean` definition. 2. Properties and Configuration -------------------------------- This section includes topics about setting and reading properties and configuration settings and their interaction with Spring Boot applications. ### 2.1. Automatically Expand Properties at Build Time Rather than hardcoding some properties that are also specified in your project’s build configuration, you can automatically expand them by instead using the existing build configuration. This is possible in both Maven and Gradle. #### 2.1.1. Automatic Property Expansion Using Maven You can automatically expand properties from the Maven project by using resource filtering. If you use the `spring-boot-starter-parent`, you can then refer to your Maven ‘project properties’ with `@..@` placeholders, as shown in the following example: Properties ``` [email protected]@ [email protected]@ ``` Yaml ``` app: encoding: "@project.build.sourceEncoding@" java: version: "@java.version@" ``` | | | | --- | --- | | | Only production configuration is filtered that way (in other words, no filtering is applied on `src/test/resources`). | | | | | --- | --- | | | If you enable the `addResources` flag, the `spring-boot:run` goal can add `src/main/resources` directly to the classpath (for hot reloading purposes). Doing so circumvents the resource filtering and this feature. 
Instead, you can use the `exec:java` goal or customize the plugin’s configuration. See the [plugin usage page](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#getting-started) for more details. | If you do not use the starter parent, you need to include the following element inside the `<build/>` element of your `pom.xml`: ``` <resources> <resource> <directory>src/main/resources</directory> <filtering>true</filtering> </resource> </resources> ``` You also need to include the following element inside `<plugins/>`: ``` <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-resources-plugin</artifactId> <version>2.7</version> <configuration> <delimiters> <delimiter>@</delimiter> </delimiters> <useDefaultDelimiters>false</useDefaultDelimiters> </configuration> </plugin> ``` | | | | --- | --- | | | The `useDefaultDelimiters` property is important if you use standard Spring placeholders (such as `${placeholder}`) in your configuration. If that property is not set to `false`, these may be expanded by the build. | #### 2.1.2. Automatic Property Expansion Using Gradle You can automatically expand properties from the Gradle project by configuring the Java plugin’s `processResources` task to do so, as shown in the following example: ``` tasks.named('processResources') { expand(project.properties) } ``` You can then refer to your Gradle project’s properties by using placeholders, as shown in the following example: Properties ``` app.name=${name} app.description=${description} ``` Yaml ``` app: name: "${name}" description: "${description}" ``` | | | | --- | --- | | | Gradle’s `expand` method uses Groovy’s `SimpleTemplateEngine`, which transforms `${..}` tokens. The `${..}` style conflicts with Spring’s own property placeholder mechanism. To use Spring property placeholders together with automatic expansion, escape the Spring property placeholders as follows: `\${..}`. | ### 2.2. Externalize the Configuration of SpringApplication A `SpringApplication` has bean property setters, so you can use its Java API as you create the application to modify its behavior. Alternatively, you can externalize the configuration by setting properties in `spring.main.*`. For example, in `application.properties`, you might have the following settings: Properties ``` spring.main.web-application-type=none spring.main.banner-mode=off ``` Yaml ``` spring: main: web-application-type: "none" banner-mode: "off" ``` Then the Spring Boot banner is not printed on startup, and the application is not starting an embedded web server. Properties defined in external configuration override and replace the values specified with the Java API, with the notable exception of the primary sources. 
Primary sources are those provided to the `SpringApplication` constructor: Java ``` import org.springframework.boot.Banner; import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; @SpringBootApplication public class MyApplication { public static void main(String[] args) { SpringApplication application = new SpringApplication(MyApplication.class); application.setBannerMode(Banner.Mode.OFF); application.run(args); } } ``` Kotlin ``` import org.springframework.boot.Banner import org.springframework.boot.SpringApplication import org.springframework.boot.autoconfigure.SpringBootApplication @SpringBootApplication object MyApplication { @JvmStatic fun main(args: Array<String>) { val application = SpringApplication(MyApplication::class.java) application.setBannerMode(Banner.Mode.OFF) application.run(\*args) } } ``` Or to `sources(…​)` method of a `SpringApplicationBuilder`: Java ``` import org.springframework.boot.Banner; import org.springframework.boot.builder.SpringApplicationBuilder; public class MyApplication { public static void main(String[] args) { new SpringApplicationBuilder() .bannerMode(Banner.Mode.OFF) .sources(MyApplication.class) .run(args); } } ``` Kotlin ``` import org.springframework.boot.Banner import org.springframework.boot.builder.SpringApplicationBuilder object MyApplication { @JvmStatic fun main(args: Array<String>) { SpringApplicationBuilder() .bannerMode(Banner.Mode.OFF) .sources(MyApplication::class.java) .run(\*args) } } ``` Given the examples above, if we have the following configuration: Properties ``` spring.main.sources=com.example.MyDatabaseConfig,com.example.MyJmsConfig spring.main.banner-mode=console ``` Yaml ``` spring: main: sources: "com.example.MyDatabaseConfig,com.example.MyJmsConfig" banner-mode: "console" ``` The actual application will show the banner (as overridden by configuration) and uses three sources for the `ApplicationContext`. The application sources are: 1. `MyApplication` (from the code) 2. `MyDatabaseConfig` (from the external config) 3. `MyJmsConfig`(from the external config) ### 2.3. Change the Location of External Properties of an Application By default, properties from different sources are added to the Spring `Environment` in a defined order (see “[features.html](features#features.external-config)” in the ‘Spring Boot features’ section for the exact order). You can also provide the following System properties (or environment variables) to change the behavior: * `spring.config.name` (`SPRING_CONFIG_NAME`): Defaults to `application` as the root of the file name. * `spring.config.location` (`SPRING_CONFIG_LOCATION`): The file to load (such as a classpath resource or a URL). A separate `Environment` property source is set up for this document and it can be overridden by system properties, environment variables, or the command line. No matter what you set in the environment, Spring Boot always loads `application.properties` as described above. By default, if YAML is used, then files with the ‘.yml’ extension are also added to the list. Spring Boot logs the configuration files that are loaded at the `DEBUG` level and the candidates it has not found at `TRACE` level. See [`ConfigFileApplicationListener`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/context/config/ConfigFileApplicationListener.java) for more detail. ### 2.4. 
Use ‘Short’ Command Line Arguments Some people like to use (for example) `--port=9000` instead of `--server.port=9000` to set configuration properties on the command line. You can enable this behavior by using placeholders in `application.properties`, as shown in the following example: Properties ``` server.port=${port:8080} ``` Yaml ``` server: port: "${port:8080}" ``` | | | | --- | --- | | | If you inherit from the `spring-boot-starter-parent` POM, the default filter token of the `maven-resources-plugins` has been changed from `${*}` to `@` (that is, `@maven.token@` instead of `${maven.token}`) to prevent conflicts with Spring-style placeholders. If you have enabled Maven filtering for the `application.properties` directly, you may want to also change the default filter token to use [other delimiters](https://maven.apache.org/plugins/maven-resources-plugin/resources-mojo.html#delimiters). | | | | | --- | --- | | | In this specific case, the port binding works in a PaaS environment such as Heroku or Cloud Foundry. In those two platforms, the `PORT` environment variable is set automatically and Spring can bind to capitalized synonyms for `Environment` properties. | ### 2.5. Use YAML for External Properties YAML is a superset of JSON and, as such, is a convenient syntax for storing external properties in a hierarchical format, as shown in the following example: ``` spring: application: name: "cruncher" datasource: driver-class-name: "com.mysql.jdbc.Driver" url: "jdbc:mysql://localhost/test" server: port: 9000 ``` Create a file called `application.yml` and put it in the root of your classpath. Then add `snakeyaml` to your dependencies (Maven coordinates `org.yaml:snakeyaml`, already included if you use the `spring-boot-starter`). A YAML file is parsed to a Java `Map<String,Object>` (like a JSON object), and Spring Boot flattens the map so that it is one level deep and has period-separated keys, as many people are used to with `Properties` files in Java. The preceding example YAML corresponds to the following `application.properties` file: ``` spring.application.name=cruncher spring.datasource.driver-class-name=com.mysql.jdbc.Driver spring.datasource.url=jdbc:mysql://localhost/test server.port=9000 ``` See “[features.html](features#features.external-config.yaml)” in the ‘Spring Boot features’ section for more information about YAML. ### 2.6. Set the Active Spring Profiles The Spring `Environment` has an API for this, but you would normally set a System property (`spring.profiles.active`) or an OS environment variable (`SPRING_PROFILES_ACTIVE`). Also, you can launch your application with a `-D` argument (remember to put it before the main class or jar archive), as follows: ``` $ java -jar -Dspring.profiles.active=production demo-0.0.1-SNAPSHOT.jar ``` In Spring Boot, you can also set the active profile in `application.properties`, as shown in the following example: Properties ``` spring.profiles.active=production ``` Yaml ``` spring: profiles: active: "production" ``` A value set this way is replaced by the System property or environment variable setting but not by the `SpringApplicationBuilder.profiles()` method. Thus, the latter Java API can be used to augment the profiles without changing the defaults. See “[features.html](features#features.profiles)” in the “Spring Boot features” section for more information. ### 2.7. Set the Default Profile Name The default profile is a profile that is enabled if no profile is active. 
By default, the name of the default profile is `default`, but it could be changed using a System property (`spring.profiles.default`) or an OS environment variable (`SPRING_PROFILES_DEFAULT`). In Spring Boot, you can also set the default profile name in `application.properties`, as shown in the following example: Properties ``` spring.profiles.default=dev ``` Yaml ``` spring: profiles: default: "dev" ``` See “[features.html](features#features.profiles)” in the “Spring Boot features” section for more information. ### 2.8. Change Configuration Depending on the Environment Spring Boot supports multi-document YAML and Properties files (see [features.html](features#features.external-config.files.multi-document) for details) which can be activated conditionally based on the active profiles. If a document contains a `spring.config.activate.on-profile` key, then the profiles value (a comma-separated list of profiles or a profile expression) is fed into the Spring `Environment.acceptsProfiles()` method. If the profile expression matches then that document is included in the final merge (otherwise, it is not), as shown in the following example: Properties ``` server.port=9000 #--- spring.config.activate.on-profile=development server.port=9001 #--- spring.config.activate.on-profile=production server.port=0 ``` Yaml ``` server: port: 9000 --- spring: config: activate: on-profile: "development" server: port: 9001 --- spring: config: activate: on-profile: "production" server: port: 0 ``` In the preceding example, the default port is 9000. However, if the Spring profile called ‘development’ is active, then the port is 9001. If ‘production’ is active, then the port is 0. | | | | --- | --- | | | The documents are merged in the order in which they are encountered. Later values override earlier values. | ### 2.9. Discover Built-in Options for External Properties Spring Boot binds external properties from `application.properties` (or `.yml` files and other places) into an application at runtime. There is not (and technically cannot be) an exhaustive list of all supported properties in a single location, because contributions can come from additional jar files on your classpath. A running application with the Actuator features has a `configprops` endpoint that shows all the bound and bindable properties available through `@ConfigurationProperties`. The appendix includes an [`application.properties`](application-properties#appendix.application-properties) example with a list of the most common properties supported by Spring Boot. The definitive list comes from searching the source code for `@ConfigurationProperties` and `@Value` annotations as well as the occasional use of `Binder`. For more about the exact ordering of loading properties, see "[features.html](features#features.external-config)". 3. Embedded Web Servers ------------------------ Each Spring Boot web application includes an embedded web server. This feature leads to a number of how-to questions, including how to change the embedded server and how to configure the embedded server. This section answers those questions. ### 3.1. Use Another Web Server Many Spring Boot starters include default embedded containers. * For servlet stack applications, the `spring-boot-starter-web` includes Tomcat by including `spring-boot-starter-tomcat`, but you can use `spring-boot-starter-jetty` or `spring-boot-starter-undertow` instead. 
* For reactive stack applications, the `spring-boot-starter-webflux` includes Reactor Netty by including `spring-boot-starter-reactor-netty`, but you can use `spring-boot-starter-tomcat`, `spring-boot-starter-jetty`, or `spring-boot-starter-undertow` instead. When switching to a different HTTP server, you need to swap the default dependencies for those that you need instead. To help with this process, Spring Boot provides a separate starter for each of the supported HTTP servers. The following Maven example shows how to exclude Tomcat and include Jetty for Spring MVC: ``` <properties> <servlet-api.version>3.1.0</servlet-api.version> </properties> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-web</artifactId> <exclusions> <!-- Exclude the Tomcat dependency --> <exclusion> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-tomcat</artifactId> </exclusion> </exclusions> </dependency> <!-- Use Jetty instead --> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-jetty</artifactId> </dependency> ``` | | | | --- | --- | | | The version of the servlet API has been overridden as, unlike Tomcat 9 and Undertow 2, Jetty 9.4 does not support servlet 4.0. | If you wish to use Jetty 10, which does support servlet 4.0, you can do so as shown in the following example: ``` <properties> <jetty.version>10.0.8</jetty.version> </properties> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-web</artifactId> <exclusions> <!-- Exclude the Tomcat dependency --> <exclusion> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-tomcat</artifactId> </exclusion> </exclusions> </dependency> <!-- Use Jetty instead --> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-jetty</artifactId> <exclusions> <!-- Exclude the Jetty-9 specific dependencies --> <exclusion> <groupId>org.eclipse.jetty.websocket</groupId> <artifactId>websocket-server</artifactId> </exclusion> <exclusion> <groupId>org.eclipse.jetty.websocket</groupId> <artifactId>javax-websocket-server-impl</artifactId> </exclusion> </exclusions> </dependency> ``` Note that along with excluding the Tomcat starter, a couple of Jetty9-specific dependencies also need to be excluded. The following Gradle example configures the necessary dependencies and a [module replacement](https://docs.gradle.org/current/userguide/resolution_rules.html#sec:module_replacement) to use Undertow in place of Reactor Netty for Spring WebFlux: ``` dependencies { implementation "org.springframework.boot:spring-boot-starter-undertow" implementation "org.springframework.boot:spring-boot-starter-webflux" modules { module("org.springframework.boot:spring-boot-starter-reactor-netty") { replacedBy("org.springframework.boot:spring-boot-starter-undertow", "Use Undertow instead of Reactor Netty") } } } ``` | | | | --- | --- | | | `spring-boot-starter-reactor-netty` is required to use the `WebClient` class, so you may need to keep a dependency on Netty even when you need to include a different HTTP server. | ### 3.2. Disabling the Web Server If your classpath contains the necessary bits to start a web server, Spring Boot will automatically start it. To disable this behavior configure the `WebApplicationType` in your `application.properties`, as shown in the following example: Properties ``` spring.main.web-application-type=none ``` Yaml ``` spring: main: web-application-type: "none" ``` ### 3.3. 
Change the HTTP Port In a standalone application, the main HTTP port defaults to `8080` but can be set with `server.port` (for example, in `application.properties` or as a System property). Thanks to relaxed binding of `Environment` values, you can also use `SERVER_PORT` (for example, as an OS environment variable). To switch off the HTTP endpoints completely but still create a `WebApplicationContext`, use `server.port=-1` (doing so is sometimes useful for testing). For more details, see “[web.html](web#web.servlet.embedded-container.customizing)” in the ‘Spring Boot Features’ section, or the [`ServerProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/ServerProperties.java) source code. ### 3.4. Use a Random Unassigned HTTP Port To scan for a free port (using OS natives to prevent clashes) use `server.port=0`. ### 3.5. Discover the HTTP Port at Runtime You can access the port the server is running on from log output or from the `WebServerApplicationContext` through its `WebServer`. The best way to get that and be sure it has been initialized is to add a `@Bean` of type `ApplicationListener<WebServerInitializedEvent>` and pull the container out of the event when it is published. Tests that use `@SpringBootTest(webEnvironment=WebEnvironment.RANDOM_PORT)` can also inject the actual port into a field by using the `@LocalServerPort` annotation, as shown in the following example: Java ``` import org.springframework.boot.test.context.SpringBootTest; import org.springframework.boot.test.context.SpringBootTest.WebEnvironment; import org.springframework.boot.test.web.server.LocalServerPort; @SpringBootTest(webEnvironment = WebEnvironment.RANDOM\_PORT) public class MyWebIntegrationTests { @LocalServerPort int port; // ... } ``` Kotlin ``` import org.springframework.boot.test.context.SpringBootTest import org.springframework.boot.test.context.SpringBootTest.WebEnvironment import org.springframework.boot.test.web.server.LocalServerPort @SpringBootTest(webEnvironment = WebEnvironment.RANDOM\_PORT) class MyWebIntegrationTests { @LocalServerPort var port = 0 // ... } ``` | | | | --- | --- | | | `@LocalServerPort` is a meta-annotation for `@Value("${local.server.port}")`. Do not try to inject the port in a regular application. As we just saw, the value is set only after the container has been initialized. Contrary to a test, application code callbacks are processed early (before the value is actually available). | ### 3.6. Enable HTTP Response Compression HTTP response compression is supported by Jetty, Tomcat, Reactor Netty, and Undertow. It can be enabled in `application.properties`, as follows: Properties ``` server.compression.enabled=true ``` Yaml ``` server: compression: enabled: true ``` By default, responses must be at least 2048 bytes in length for compression to be performed. You can configure this behavior by setting the `server.compression.min-response-size` property. By default, responses are compressed only if their content type is one of the following: * `text/html` * `text/xml` * `text/plain` * `text/css` * `text/javascript` * `application/javascript` * `application/json` * `application/xml` You can configure this behavior by setting the `server.compression.mime-types` property. ### 3.7. Configure SSL SSL can be configured declaratively by setting the various `server.ssl.*` properties, typically in `application.properties` or `application.yml`. 
The following example shows setting SSL properties using a Java KeyStore file: Properties ``` server.port=8443 server.ssl.key-store=classpath:keystore.jks server.ssl.key-store-password=secret server.ssl.key-password=another-secret ``` Yaml ``` server: port: 8443 ssl: key-store: "classpath:keystore.jks" key-store-password: "secret" key-password: "another-secret" ``` The following example shows setting SSL properties using PEM-encoded certificate and private key files: Properties ``` server.port=8443 server.ssl.certificate=classpath:my-cert.crt server.ssl.certificate-private-key=classpath:my-cert.key server.ssl.trust-certificate=classpath:ca-cert.crt server.ssl.key-store-password=secret ``` Yaml ``` server: port: 8443 ssl: certificate: "classpath:my-cert.crt" certificate-private-key: "classpath:my-cert.key" trust-certificate: "classpath:ca-cert.crt" key-store-password: "secret" ``` See [`Ssl`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/web/server/Ssl.java) for details of all of the supported properties. Using configuration such as the preceding example means the application no longer supports a plain HTTP connector at port 8080. Spring Boot does not support the configuration of both an HTTP connector and an HTTPS connector through `application.properties`. If you want to have both, you need to configure one of them programmatically. We recommend using `application.properties` to configure HTTPS, as the HTTP connector is the easier of the two to configure programmatically. ### 3.8. Configure HTTP/2 You can enable HTTP/2 support in your Spring Boot application with the `server.http2.enabled` configuration property. Both `h2` (HTTP/2 over TLS) and `h2c` (HTTP/2 over TCP) are supported. To use `h2`, SSL must also be enabled. When SSL is not enabled, `h2c` will be used. The details of the `h2` support depend on the chosen web server and the application environment, since that protocol is not supported out-of-the-box by all JDK 8 releases. #### 3.8.1. HTTP/2 with Tomcat Spring Boot ships by default with Tomcat 9.0.x which supports `h2c` out of the box and `h2` out of the box when using JDK 9 or later. Alternatively, `h2` can be used on JDK 8 if the `libtcnative` library and its dependencies are installed on the host operating system. The library directory must be made available, if not already, to the JVM library path. You can do so with a JVM argument such as `-Djava.library.path=/usr/local/opt/tomcat-native/lib`. More on this in the [official Tomcat documentation](https://tomcat.apache.org/tomcat-9.0-doc/apr.html). Starting Tomcat 9.0.x on JDK 8 with HTTP/2 and SSL enabled but without that native support logs the following error: ``` ERROR 8787 --- [ main] o.a.coyote.http11.Http11NioProtocol : The upgrade handler [org.apache.coyote.http2.Http2Protocol] for [h2] only supports upgrade via ALPN but has been configured for the ["https-jsse-nio-8443"] connector that does not support ALPN. ``` This error is not fatal, and the application still starts with HTTP/1.1 SSL support. #### 3.8.2. HTTP/2 with Jetty For HTTP/2 support, Jetty requires the additional `org.eclipse.jetty.http2:http2-server` dependency. To use `h2c` no other dependencies are required. 
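With Maven, for example, that module can be declared as follows (a sketch; the version is usually managed by Spring Boot’s dependency management and can then be omitted):

```
<dependency>
    <groupId>org.eclipse.jetty.http2</groupId>
    <artifactId>http2-server</artifactId>
</dependency>
```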
To use `h2`, you also need to choose one of the following dependencies, depending on your deployment: * `org.eclipse.jetty:jetty-alpn-java-server` for applications running on JDK9+ * `org.eclipse.jetty:jetty-alpn-openjdk8-server` for applications running on JDK8u252+ * `org.eclipse.jetty:jetty-alpn-conscrypt-server` and the [Conscrypt library](https://www.conscrypt.org/) with no JDK requirement #### 3.8.3. HTTP/2 with Reactor Netty The `spring-boot-webflux-starter` is using by default Reactor Netty as a server. Reactor Netty supports `h2c` using JDK 8 or later with no additional dependencies. Reactor Netty supports `h2` using the JDK support with JDK 9 or later. For JDK 8 environments, or for optimal runtime performance, this server also supports `h2` with native libraries. To enable that, your application needs to have an additional dependency. Spring Boot manages the version for the `io.netty:netty-tcnative-boringssl-static` "uber jar", containing native libraries for all platforms. Developers can choose to import only the required dependencies using a classifier (see [the Netty official documentation](https://netty.io/wiki/forked-tomcat-native.html)). #### 3.8.4. HTTP/2 with Undertow As of Undertow 1.4.0+, both `h2` and `h2c` are supported on JDK 8 without any additional dependencies. ### 3.9. Configure the Web Server Generally, you should first consider using one of the many available configuration keys and customize your web server by adding new entries in your `application.properties` or `application.yml` file. See “[Discover Built-in Options for External Properties](#howto.properties-and-configuration.discover-build-in-options-for-external-properties)”). The `server.*` namespace is quite useful here, and it includes namespaces like `server.tomcat.*`, `server.jetty.*` and others, for server-specific features. See the list of [application-properties.html](application-properties#appendix.application-properties). The previous sections covered already many common use cases, such as compression, SSL or HTTP/2. However, if a configuration key does not exist for your use case, you should then look at [`WebServerFactoryCustomizer`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/web/server/WebServerFactoryCustomizer.html). You can declare such a component and get access to the server factory relevant to your choice: you should select the variant for the chosen Server (Tomcat, Jetty, Reactor Netty, Undertow) and the chosen web stack (servlet or reactive). The example below is for Tomcat with the `spring-boot-starter-web` (servlet stack): Java ``` import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory; import org.springframework.boot.web.server.WebServerFactoryCustomizer; import org.springframework.stereotype.Component; @Component public class MyTomcatWebServerCustomizer implements WebServerFactoryCustomizer<TomcatServletWebServerFactory> { @Override public void customize(TomcatServletWebServerFactory factory) { // customize the factory here } } ``` Kotlin ``` import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory import org.springframework.boot.web.server.WebServerFactoryCustomizer import org.springframework.stereotype.Component @Component class MyTomcatWebServerCustomizer : WebServerFactoryCustomizer<TomcatServletWebServerFactory?> { override fun customize(factory: TomcatServletWebServerFactory?) 
{ // customize the factory here } } ``` | | | | --- | --- | | | Spring Boot uses that infrastructure internally to auto-configure the server. Auto-configured `WebServerFactoryCustomizer` beans have an order of `0` and will be processed before any user-defined customizers, unless it has an explicit order that states otherwise. | Once you have got access to a `WebServerFactory` using the customizer, you can use it to configure specific parts, like connectors, server resources, or the server itself - all using server-specific APIs. In addition Spring Boot provides: | Server | Servlet stack | Reactive stack | | --- | --- | --- | | Tomcat | `TomcatServletWebServerFactory` | `TomcatReactiveWebServerFactory` | | Jetty | `JettyServletWebServerFactory` | `JettyReactiveWebServerFactory` | | Undertow | `UndertowServletWebServerFactory` | `UndertowReactiveWebServerFactory` | | Reactor | N/A | `NettyReactiveWebServerFactory` | As a last resort, you can also declare your own `WebServerFactory` bean, which will override the one provided by Spring Boot. When you do so, auto-configured customizers are still applied on your custom factory, so use that option carefully. ### 3.10. Add a Servlet, Filter, or Listener to an Application In a servlet stack application, that is with the `spring-boot-starter-web`, there are two ways to add `Servlet`, `Filter`, `ServletContextListener`, and the other listeners supported by the Servlet API to your application: * [Add a Servlet, Filter, or Listener by Using a Spring Bean](#howto.webserver.add-servlet-filter-listener.spring-bean) * [Add Servlets, Filters, and Listeners by Using Classpath Scanning](#howto.webserver.add-servlet-filter-listener.using-scanning) #### 3.10.1. Add a Servlet, Filter, or Listener by Using a Spring Bean To add a `Servlet`, `Filter`, or servlet `*Listener` by using a Spring bean, you must provide a `@Bean` definition for it. Doing so can be very useful when you want to inject configuration or dependencies. However, you must be very careful that they do not cause eager initialization of too many other beans, because they have to be installed in the container very early in the application lifecycle. (For example, it is not a good idea to have them depend on your `DataSource` or JPA configuration.) You can work around such restrictions by initializing the beans lazily when first used instead of on initialization. In the case of filters and servlets, you can also add mappings and init parameters by adding a `FilterRegistrationBean` or a `ServletRegistrationBean` instead of or in addition to the underlying component. | | | | --- | --- | | | If no `dispatcherType` is specified on a filter registration, `REQUEST` is used. This aligns with the servlet specification’s default dispatcher type. | Like any other Spring bean, you can define the order of servlet filter beans; please make sure to check the “[web.html](web#web.servlet.embedded-container.servlets-filters-listeners.beans)” section. ##### Disable Registration of a Servlet or Filter As [described earlier](#howto.webserver.add-servlet-filter-listener.spring-bean), any `Servlet` or `Filter` beans are registered with the servlet container automatically. 
To disable registration of a particular `Filter` or `Servlet` bean, create a registration bean for it and mark it as disabled, as shown in the following example: Java ``` import org.springframework.boot.web.servlet.FilterRegistrationBean; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyFilterConfiguration { @Bean public FilterRegistrationBean<MyFilter> registration(MyFilter filter) { FilterRegistrationBean<MyFilter> registration = new FilterRegistrationBean<>(filter); registration.setEnabled(false); return registration; } } ``` Kotlin ``` import org.springframework.boot.web.servlet.FilterRegistrationBean import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyFilterConfiguration { @Bean fun registration(filter: MyFilter): FilterRegistrationBean<MyFilter> { val registration = FilterRegistrationBean(filter) registration.isEnabled = false return registration } } ``` #### 3.10.2. Add Servlets, Filters, and Listeners by Using Classpath Scanning `@WebServlet`, `@WebFilter`, and `@WebListener` annotated classes can be automatically registered with an embedded servlet container by annotating a `@Configuration` class with `@ServletComponentScan` and specifying the package(s) containing the components that you want to register. By default, `@ServletComponentScan` scans from the package of the annotated class. ### 3.11. Configure Access Logging Access logs can be configured for Tomcat, Undertow, and Jetty through their respective namespaces. For instance, the following settings log access on Tomcat with a [custom pattern](https://tomcat.apache.org/tomcat-9.0-doc/config/valve.html#Access_Logging). Properties ``` server.tomcat.basedir=my-tomcat server.tomcat.accesslog.enabled=true server.tomcat.accesslog.pattern=%t %a %r %s (%D ms) ``` Yaml ``` server: tomcat: basedir: "my-tomcat" accesslog: enabled: true pattern: "%t %a %r %s (%D ms)" ``` | | | | --- | --- | | | The default location for logs is a `logs` directory relative to the Tomcat base directory. By default, the `logs` directory is a temporary directory, so you may want to fix Tomcat’s base directory or use an absolute path for the logs. In the preceding example, the logs are available in `my-tomcat/logs` relative to the working directory of the application. | Access logging for Undertow can be configured in a similar fashion, as shown in the following example: Properties ``` server.undertow.accesslog.enabled=true server.undertow.accesslog.pattern=%t %a %r %s (%D ms) ``` Yaml ``` server: undertow: accesslog: enabled: true pattern: "%t %a %r %s (%D ms)" ``` Logs are stored in a `logs` directory relative to the working directory of the application. You can customize this location by setting the `server.undertow.accesslog.dir` property. Finally, access logging for Jetty can also be configured as follows: Properties ``` server.jetty.accesslog.enabled=true server.jetty.accesslog.filename=/var/log/jetty-access.log ``` Yaml ``` server: jetty: accesslog: enabled: true filename: "/var/log/jetty-access.log" ``` By default, logs are redirected to `System.err`. For more details, see the Jetty documentation. ### 3.12. Running Behind a Front-end Proxy Server If your application is running behind a proxy, a load-balancer or in the cloud, the request information (like the host, port, scheme…​) might change along the way. 
Your application may be running on `10.10.10.10:8080`, but HTTP clients should only see `example.org`. [RFC7239 "Forwarded Headers"](https://tools.ietf.org/html/rfc7239) defines the `Forwarded` HTTP header; proxies can use this header to provide information about the original request. You can configure your application to read those headers and automatically use that information when creating links and sending them to clients in HTTP 302 responses, JSON documents or HTML pages. There are also non-standard headers, like `X-Forwarded-Host`, `X-Forwarded-Port`, `X-Forwarded-Proto`, `X-Forwarded-Ssl`, and `X-Forwarded-Prefix`. If the proxy adds the commonly used `X-Forwarded-For` and `X-Forwarded-Proto` headers, setting `server.forward-headers-strategy` to `NATIVE` is enough to support those. With this option, the Web servers themselves natively support this feature; you can check their specific documentation to learn about specific behavior. If this is not enough, Spring Framework provides a [ForwardedHeaderFilter](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#filters-forwarded-headers). You can register it as a servlet filter in your application by setting `server.forward-headers-strategy` to `FRAMEWORK`. | | | | --- | --- | | | If you are using Tomcat and terminating SSL at the proxy, `server.tomcat.redirect-context-root` should be set to `false`. This allows the `X-Forwarded-Proto` header to be honored before any redirects are performed. | | | | | --- | --- | | | If your application runs in Cloud Foundry or Heroku, the `server.forward-headers-strategy` property defaults to `NATIVE`. In all other instances, it defaults to `NONE`. | #### 3.12.1. Customize Tomcat’s Proxy Configuration If you use Tomcat, you can additionally configure the names of the headers used to carry “forwarded” information, as shown in the following example: Properties ``` server.tomcat.remoteip.remote-ip-header=x-your-remote-ip-header server.tomcat.remoteip.protocol-header=x-your-protocol-header ``` Yaml ``` server: tomcat: remoteip: remote-ip-header: "x-your-remote-ip-header" protocol-header: "x-your-protocol-header" ``` Tomcat is also configured with a regular expression that matches internal proxies that are to be trusted. See the [`server.tomcat.remoteip.internal-proxies` entry in the appendix](application-properties#application-properties.server.server.tomcat.remoteip.internal-proxies) for its default value. You can customize the valve’s configuration by adding an entry to `application.properties`, as shown in the following example: Properties ``` server.tomcat.remoteip.internal-proxies=192\\.168\\.\\d{1,3}\\.\\d{1,3} ``` Yaml ``` server: tomcat: remoteip: internal-proxies: "192\\.168\\.\\d{1,3}\\.\\d{1,3}" ``` | | | | --- | --- | | | You can trust all proxies by setting the `internal-proxies` to empty (but do not do so in production). | You can take complete control of the configuration of Tomcat’s `RemoteIpValve` by switching the automatic one off (to do so, set `server.forward-headers-strategy=NONE`) and adding a new valve instance using a `WebServerFactoryCustomizer` bean. ### 3.13.
Enable Multiple Connectors with Tomcat You can add an `org.apache.catalina.connector.Connector` to the `TomcatServletWebServerFactory`, which can allow multiple connectors, including HTTP and HTTPS connectors, as shown in the following example: Java ``` import java.io.IOException; import java.net.URL; import org.apache.catalina.connector.Connector; import org.apache.coyote.http11.Http11NioProtocol; import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory; import org.springframework.boot.web.server.WebServerFactoryCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.util.ResourceUtils; @Configuration(proxyBeanMethods = false) public class MyTomcatConfiguration { @Bean public WebServerFactoryCustomizer<TomcatServletWebServerFactory> sslConnectorCustomizer() { return (tomcat) -> tomcat.addAdditionalTomcatConnectors(createSslConnector()); } private Connector createSslConnector() { Connector connector = new Connector("org.apache.coyote.http11.Http11NioProtocol"); Http11NioProtocol protocol = (Http11NioProtocol) connector.getProtocolHandler(); try { URL keystore = ResourceUtils.getURL("keystore"); URL truststore = ResourceUtils.getURL("truststore"); connector.setScheme("https"); connector.setSecure(true); connector.setPort(8443); protocol.setSSLEnabled(true); protocol.setKeystoreFile(keystore.toString()); protocol.setKeystorePass("changeit"); protocol.setTruststoreFile(truststore.toString()); protocol.setTruststorePass("changeit"); protocol.setKeyAlias("apitester"); return connector; } catch (IOException ex) { throw new IllegalStateException("Fail to create ssl connector", ex); } } } ``` Kotlin ``` import org.apache.catalina.connector.Connector import org.apache.coyote.http11.Http11NioProtocol import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory import org.springframework.boot.web.server.WebServerFactoryCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.util.ResourceUtils import java.io.IOException @Configuration(proxyBeanMethods = false) class MyTomcatConfiguration { @Bean fun sslConnectorCustomizer(): WebServerFactoryCustomizer<TomcatServletWebServerFactory> { return WebServerFactoryCustomizer { tomcat: TomcatServletWebServerFactory -> tomcat.addAdditionalTomcatConnectors( createSslConnector() ) } } private fun createSslConnector(): Connector { val connector = Connector("org.apache.coyote.http11.Http11NioProtocol") val protocol = connector.protocolHandler as Http11NioProtocol return try { val keystore = ResourceUtils.getURL("keystore") val truststore = ResourceUtils.getURL("truststore") connector.scheme = "https" connector.secure = true connector.port = 8443 protocol.isSSLEnabled = true protocol.keystoreFile = keystore.toString() protocol.keystorePass = "changeit" protocol.truststoreFile = truststore.toString() protocol.truststorePass = "changeit" protocol.keyAlias = "apitester" connector } catch (ex: IOException) { throw IllegalStateException("Fail to create ssl connector", ex) } } } ``` ### 3.14. 
Use Tomcat’s LegacyCookieProcessor By default, the embedded Tomcat used by Spring Boot does not support "Version 0" of the Cookie format, so you may see the following error: ``` java.lang.IllegalArgumentException: An invalid character [32] was present in the Cookie value ``` If at all possible, you should consider updating your code to only store values compliant with later Cookie specifications. If, however, you cannot change the way that cookies are written, you can instead configure Tomcat to use a `LegacyCookieProcessor`. To switch to the `LegacyCookieProcessor`, use a `WebServerFactoryCustomizer` bean that adds a `TomcatContextCustomizer`, as shown in the following example: Java ``` import org.apache.tomcat.util.http.LegacyCookieProcessor; import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory; import org.springframework.boot.web.server.WebServerFactoryCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyLegacyCookieProcessorConfiguration { @Bean public WebServerFactoryCustomizer<TomcatServletWebServerFactory> cookieProcessorCustomizer() { return (factory) -> factory .addContextCustomizers((context) -> context.setCookieProcessor(new LegacyCookieProcessor())); } } ``` Kotlin ``` import org.apache.catalina.Context import org.apache.tomcat.util.http.LegacyCookieProcessor import org.springframework.boot.web.embedded.tomcat.TomcatContextCustomizer import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory import org.springframework.boot.web.server.WebServerFactoryCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyLegacyCookieProcessorConfiguration { @Bean fun cookieProcessorCustomizer(): WebServerFactoryCustomizer<TomcatServletWebServerFactory> { return WebServerFactoryCustomizer { factory: TomcatServletWebServerFactory -> factory .addContextCustomizers(TomcatContextCustomizer { context: Context -> context.cookieProcessor = LegacyCookieProcessor() }) } } } ``` ### 3.15. Enable Tomcat’s MBean Registry Embedded Tomcat’s MBean registry is disabled by default. This minimizes Tomcat’s memory footprint. If you want to use Tomcat’s MBeans, for example so that they can be used by Micrometer to expose metrics, you must use the `server.tomcat.mbeanregistry.enabled` property to do so, as shown in the following example: Properties ``` server.tomcat.mbeanregistry.enabled=true ``` Yaml ``` server: tomcat: mbeanregistry: enabled: true ``` ### 3.16.
Enable Multiple Listeners with Undertow Add an `UndertowBuilderCustomizer` to the `UndertowServletWebServerFactory` and add a listener to the `Builder`, as shown in the following example: Java ``` import io.undertow.Undertow.Builder; import org.springframework.boot.web.embedded.undertow.UndertowServletWebServerFactory; import org.springframework.boot.web.server.WebServerFactoryCustomizer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyUndertowConfiguration { @Bean public WebServerFactoryCustomizer<UndertowServletWebServerFactory> undertowListenerCustomizer() { return (factory) -> factory.addBuilderCustomizers(this::addHttpListener); } private Builder addHttpListener(Builder builder) { return builder.addHttpListener(8080, "0.0.0.0"); } } ``` Kotlin ``` import io.undertow.Undertow import org.springframework.boot.web.embedded.undertow.UndertowBuilderCustomizer import org.springframework.boot.web.embedded.undertow.UndertowServletWebServerFactory import org.springframework.boot.web.server.WebServerFactoryCustomizer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyUndertowConfiguration { @Bean fun undertowListenerCustomizer(): WebServerFactoryCustomizer<UndertowServletWebServerFactory> { return WebServerFactoryCustomizer { factory: UndertowServletWebServerFactory -> factory.addBuilderCustomizers( UndertowBuilderCustomizer { builder: Undertow.Builder -> addHttpListener(builder) }) } } private fun addHttpListener(builder: Undertow.Builder): Undertow.Builder { return builder.addHttpListener(8080, "0.0.0.0") } } ``` ### 3.17. Create WebSocket Endpoints Using @ServerEndpoint If you want to use `@ServerEndpoint` in a Spring Boot application that uses an embedded container, you must declare a single `ServerEndpointExporter` `@Bean`, as shown in the following example: Java ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.web.socket.server.standard.ServerEndpointExporter; @Configuration(proxyBeanMethods = false) public class MyWebSocketConfiguration { @Bean public ServerEndpointExporter serverEndpointExporter() { return new ServerEndpointExporter(); } } ``` Kotlin ``` import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.web.socket.server.standard.ServerEndpointExporter @Configuration(proxyBeanMethods = false) class MyWebSocketConfiguration { @Bean fun serverEndpointExporter(): ServerEndpointExporter { return ServerEndpointExporter() } } ``` The bean shown in the preceding example registers any `@ServerEndpoint` annotated beans with the underlying WebSocket container. When deployed to a standalone servlet container, this role is performed by a servlet container initializer, and the `ServerEndpointExporter` bean is not required. 4. Spring MVC -------------- Spring Boot has a number of starters that include Spring MVC. Note that some starters include a dependency on Spring MVC rather than include it directly. This section answers common questions about Spring MVC and Spring Boot. ### 4.1.
Write a JSON REST Service Any Spring `@RestController` in a Spring Boot application should render JSON response by default as long as Jackson2 is on the classpath, as shown in the following example: Java ``` import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RestController; @RestController public class MyController { @RequestMapping("/thing") public MyThing thing() { return new MyThing(); } } ``` Kotlin ``` import org.springframework.web.bind.annotation.RequestMapping import org.springframework.web.bind.annotation.RestController @RestController class MyController { @RequestMapping("/thing") fun thing(): MyThing { return MyThing() } } ``` As long as `MyThing` can be serialized by Jackson2 (true for a normal POJO or Groovy object), then `[localhost:8080/thing](http://localhost:8080/thing)` serves a JSON representation of it by default. Note that, in a browser, you might sometimes see XML responses, because browsers tend to send accept headers that prefer XML. ### 4.2. Write an XML REST Service If you have the Jackson XML extension (`jackson-dataformat-xml`) on the classpath, you can use it to render XML responses. The previous example that we used for JSON would work. To use the Jackson XML renderer, add the following dependency to your project: ``` <dependency> <groupId>com.fasterxml.jackson.dataformat</groupId> <artifactId>jackson-dataformat-xml</artifactId> </dependency> ``` If Jackson’s XML extension is not available and JAXB is available, XML can be rendered with the additional requirement of having `MyThing` annotated as `@XmlRootElement`, as shown in the following example: Java ``` import javax.xml.bind.annotation.XmlRootElement; @XmlRootElement public class MyThing { private String name; // getters/setters ... public String getName() { return this.name; } public void setName(String name) { this.name = name; } } ``` Kotlin ``` import javax.xml.bind.annotation.XmlRootElement @XmlRootElement class MyThing { var name: String? = null } ``` JAXB is only available out of the box with Java 8. If you use a more recent Java generation, add the following dependency to your project: ``` <dependency> <groupId>org.glassfish.jaxb</groupId> <artifactId>jaxb-runtime</artifactId> </dependency> ``` | | | | --- | --- | | | To get the server to render XML instead of JSON, you might have to send an `Accept: text/xml` header (or use a browser). | ### 4.3. Customize the Jackson ObjectMapper Spring MVC (client and server side) uses `HttpMessageConverters` to negotiate content conversion in an HTTP exchange. If Jackson is on the classpath, you already get the default converter(s) provided by `Jackson2ObjectMapperBuilder`, an instance of which is auto-configured for you. The `ObjectMapper` (or `XmlMapper` for Jackson XML converter) instance (created by default) has the following customized properties: * `MapperFeature.DEFAULT_VIEW_INCLUSION` is disabled * `DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES` is disabled * `SerializationFeature.WRITE_DATES_AS_TIMESTAMPS` is disabled Spring Boot also has some features to make it easier to customize this behavior. You can configure the `ObjectMapper` and `XmlMapper` instances by using the environment. Jackson provides an extensive suite of on/off features that can be used to configure various aspects of its processing. 
These features are described in six enums (in Jackson) that map onto properties in the environment: | Enum | Property | Values | | --- | --- | --- | | `com.fasterxml.jackson.databind.DeserializationFeature` | `spring.jackson.deserialization.<feature_name>` | `true`, `false` | | `com.fasterxml.jackson.core.JsonGenerator.Feature` | `spring.jackson.generator.<feature_name>` | `true`, `false` | | `com.fasterxml.jackson.databind.MapperFeature` | `spring.jackson.mapper.<feature_name>` | `true`, `false` | | `com.fasterxml.jackson.core.JsonParser.Feature` | `spring.jackson.parser.<feature_name>` | `true`, `false` | | `com.fasterxml.jackson.databind.SerializationFeature` | `spring.jackson.serialization.<feature_name>` | `true`, `false` | | `com.fasterxml.jackson.annotation.JsonInclude.Include` | `spring.jackson.default-property-inclusion` | `always`, `non_null`, `non_absent`, `non_default`, `non_empty` | For example, to enable pretty print, set `spring.jackson.serialization.indent_output=true`. Note that, thanks to the use of [relaxed binding](features#features.external-config.typesafe-configuration-properties.relaxed-binding), the case of `indent_output` does not have to match the case of the corresponding enum constant, which is `INDENT_OUTPUT`. This environment-based configuration is applied to the auto-configured `Jackson2ObjectMapperBuilder` bean and applies to any mappers created by using the builder, including the auto-configured `ObjectMapper` bean. The context’s `Jackson2ObjectMapperBuilder` can be customized by one or more `Jackson2ObjectMapperBuilderCustomizer` beans. Such customizer beans can be ordered (Boot’s own customizer has an order of 0), letting additional customization be applied both before and after Boot’s customization. Any beans of type `com.fasterxml.jackson.databind.Module` are automatically registered with the auto-configured `Jackson2ObjectMapperBuilder` and are applied to any `ObjectMapper` instances that it creates. This provides a global mechanism for contributing custom modules when you add new features to your application. If you want to replace the default `ObjectMapper` completely, either define a `@Bean` of that type and mark it as `@Primary` or, if you prefer the builder-based approach, define a `Jackson2ObjectMapperBuilder` `@Bean`. Note that, in either case, doing so disables all auto-configuration of the `ObjectMapper`. If you provide any `@Beans` of type `MappingJackson2HttpMessageConverter`, they replace the default value in the MVC configuration. Also, a convenience bean of type `HttpMessageConverters` is provided (and is always available if you use the default MVC configuration). It has some useful methods to access the default and user-enhanced message converters. See the “[Customize the @ResponseBody Rendering](#howto.spring-mvc.customize-responsebody-rendering)” section and the [`WebMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/WebMvcAutoConfiguration.java) source code for more details. ### 4.4. Customize the @ResponseBody Rendering Spring uses `HttpMessageConverters` to render `@ResponseBody` (or responses from `@RestController`). You can contribute additional converters by adding beans of the appropriate type in a Spring Boot context. 
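For example, the following minimal sketch exposes an `HttpMessageConverters` bean that contributes an additional converter. Here `ByteArrayHttpMessageConverter` merely stands in for your own converter implementation, and the configuration class name is illustrative: Java

```java
import org.springframework.boot.autoconfigure.http.HttpMessageConverters;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.ByteArrayHttpMessageConverter;
import org.springframework.http.converter.HttpMessageConverter;

@Configuration(proxyBeanMethods = false)
public class MyConverterConfiguration {

	@Bean
	public HttpMessageConverters customConverters() {
		// ByteArrayHttpMessageConverter stands in for your own converter here;
		// converters passed to HttpMessageConverters are combined with the defaults.
		HttpMessageConverter<?> additional = new ByteArrayHttpMessageConverter();
		return new HttpMessageConverters(additional);
	}

}
```

Converters contributed this way are combined with the default converters rather than replacing the whole list.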
If a bean you add is of a type that would have been included by default anyway (such as `MappingJackson2HttpMessageConverter` for JSON conversions), it replaces the default value. A convenience bean of type `HttpMessageConverters` is provided and is always available if you use the default MVC configuration. It has some useful methods to access the default and user-enhanced message converters (For example, it can be useful if you want to manually inject them into a custom `RestTemplate`). As in normal MVC usage, any `WebMvcConfigurer` beans that you provide can also contribute converters by overriding the `configureMessageConverters` method. However, unlike with normal MVC, you can supply only additional converters that you need (because Spring Boot uses the same mechanism to contribute its defaults). Finally, if you opt out of the Spring Boot default MVC configuration by providing your own `@EnableWebMvc` configuration, you can take control completely and do everything manually by using `getMessageConverters` from `WebMvcConfigurationSupport`. See the [`WebMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/WebMvcAutoConfiguration.java) source code for more details. ### 4.5. Handling Multipart File Uploads Spring Boot embraces the servlet 3 `javax.servlet.http.Part` API to support uploading files. By default, Spring Boot configures Spring MVC with a maximum size of 1MB per file and a maximum of 10MB of file data in a single request. You may override these values, the location to which intermediate data is stored (for example, to the `/tmp` directory), and the threshold past which data is flushed to disk by using the properties exposed in the `MultipartProperties` class. For example, if you want to specify that files be unlimited, set the `spring.servlet.multipart.max-file-size` property to `-1`. The multipart support is helpful when you want to receive multipart encoded file data as a `@RequestParam`-annotated parameter of type `MultipartFile` in a Spring MVC controller handler method. See the [`MultipartAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/MultipartAutoConfiguration.java) source for more details. | | | | --- | --- | | | It is recommended to use the container’s built-in support for multipart uploads rather than introducing an additional dependency such as Apache Commons File Upload. | ### 4.6. Switch Off the Spring MVC DispatcherServlet By default, all content is served from the root of your application (`/`). If you would rather map to a different path, you can configure one as follows: Properties ``` spring.mvc.servlet.path=/mypath ``` Yaml ``` spring: mvc: servlet: path: "/mypath" ``` If you have additional servlets you can declare a `@Bean` of type `Servlet` or `ServletRegistrationBean` for each and Spring Boot will register them transparently to the container. Because servlets are registered that way, they can be mapped to a sub-context of the `DispatcherServlet` without invoking it. Configuring the `DispatcherServlet` yourself is unusual but if you really need to do it, a `@Bean` of type `DispatcherServletPath` must be provided as well to provide the path of your custom `DispatcherServlet`. ### 4.7. 
Switch off the Default MVC Configuration The easiest way to take complete control over MVC configuration is to provide your own `@Configuration` with the `@EnableWebMvc` annotation. Doing so leaves all MVC configuration in your hands. ### 4.8. Customize ViewResolvers A `ViewResolver` is a core component of Spring MVC, translating view names in `@Controller` to actual `View` implementations. Note that `ViewResolvers` are mainly used in UI applications, rather than REST-style services (a `View` is not used to render a `@ResponseBody`). There are many implementations of `ViewResolver` to choose from, and Spring on its own is not opinionated about which ones you should use. Spring Boot, on the other hand, installs one or two for you, depending on what it finds on the classpath and in the application context. The `DispatcherServlet` uses all the resolvers it finds in the application context, trying each one in turn until it gets a result. If you add your own, you have to be aware of the order and in which position your resolver is added. `WebMvcAutoConfiguration` adds the following `ViewResolvers` to your context: * An `InternalResourceViewResolver` named ‘defaultViewResolver’. This one locates physical resources that can be rendered by using the `DefaultServlet` (including static resources and JSP pages, if you use those). It applies a prefix and a suffix to the view name and then looks for a physical resource with that path in the servlet context (the defaults are both empty but are accessible for external configuration through `spring.mvc.view.prefix` and `spring.mvc.view.suffix`). You can override it by providing a bean of the same type. * A `BeanNameViewResolver` named ‘beanNameViewResolver’. This is a useful member of the view resolver chain and picks up any beans with the same name as the `View` being resolved. It should not be necessary to override or replace it. * A `ContentNegotiatingViewResolver` named ‘viewResolver’ is added only if there **are** actually beans of type `View` present. This is a composite resolver, delegating to all the others and attempting to find a match to the ‘Accept’ HTTP header sent by the client. There is a useful [blog about `ContentNegotiatingViewResolver`](https://spring.io/blog/2013/06/03/content-negotiation-using-views) that you might like to study to learn more, and you might also look at the source code for detail. You can switch off the auto-configured `ContentNegotiatingViewResolver` by defining a bean named ‘viewResolver’. * If you use Thymeleaf, you also have a `ThymeleafViewResolver` named ‘thymeleafViewResolver’. It looks for resources by surrounding the view name with a prefix and suffix. The prefix is `spring.thymeleaf.prefix`, and the suffix is `spring.thymeleaf.suffix`. The values of the prefix and suffix default to ‘classpath:/templates/’ and ‘.html’, respectively. You can override `ThymeleafViewResolver` by providing a bean of the same name. * If you use FreeMarker, you also have a `FreeMarkerViewResolver` named ‘freeMarkerViewResolver’. It looks for resources in a loader path (which is externalized to `spring.freemarker.templateLoaderPath` and has a default value of ‘classpath:/templates/’) by surrounding the view name with a prefix and a suffix. The prefix is externalized to `spring.freemarker.prefix`, and the suffix is externalized to `spring.freemarker.suffix`. The default values of the prefix and suffix are empty and ‘.ftlh’, respectively. You can override `FreeMarkerViewResolver` by providing a bean of the same name. 
* If you use Groovy templates (actually, if `groovy-templates` is on your classpath), you also have a `GroovyMarkupViewResolver` named ‘groovyMarkupViewResolver’. It looks for resources in a loader path by surrounding the view name with a prefix and suffix (externalized to `spring.groovy.template.prefix` and `spring.groovy.template.suffix`). The prefix and suffix have default values of ‘classpath:/templates/’ and ‘.tpl’, respectively. You can override `GroovyMarkupViewResolver` by providing a bean of the same name. * If you use Mustache, you also have a `MustacheViewResolver` named ‘mustacheViewResolver’. It looks for resources by surrounding the view name with a prefix and suffix. The prefix is `spring.mustache.prefix`, and the suffix is `spring.mustache.suffix`. The values of the prefix and suffix default to ‘classpath:/templates/’ and ‘.mustache’, respectively. You can override `MustacheViewResolver` by providing a bean of the same name. For more detail, see the following sections: * [`WebMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/WebMvcAutoConfiguration.java) * [`ThymeleafAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/thymeleaf/ThymeleafAutoConfiguration.java) * [`FreeMarkerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/freemarker/FreeMarkerAutoConfiguration.java) * [`GroovyTemplateAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/groovy/template/GroovyTemplateAutoConfiguration.java) 5. Jersey ---------- ### 5.1. Secure Jersey endpoints with Spring Security Spring Security can be used to secure a Jersey-based web application in much the same way as it can be used to secure a Spring MVC-based web application. However, if you want to use Spring Security’s method-level security with Jersey, you must configure Jersey to use `setStatus(int)` rather than `sendError(int)`. This prevents Jersey from committing the response before Spring Security has had an opportunity to report an authentication or authorization failure to the client. The `jersey.config.server.response.setStatusOverSendError` property must be set to `true` on the application’s `ResourceConfig` bean, as shown in the following example: Java ``` import java.util.Collections; import org.glassfish.jersey.server.ResourceConfig; import org.springframework.stereotype.Component; @Component public class JerseySetStatusOverSendErrorConfig extends ResourceConfig { public JerseySetStatusOverSendErrorConfig() { register(Endpoint.class); setProperties(Collections.singletonMap("jersey.config.server.response.setStatusOverSendError", true)); } } ``` Kotlin ``` import org.glassfish.jersey.server.ResourceConfig import org.springframework.stereotype.Component import java.util.Collections @Component class JerseySetStatusOverSendErrorConfig : ResourceConfig() { init { register(Endpoint::class.java) setProperties(Collections.singletonMap("jersey.config.server.response.setStatusOverSendError", true)) } } ``` ### 5.2.
Use Jersey Alongside Another Web Framework To use Jersey alongside another web framework, such as Spring MVC, it should be configured so that it will allow the other framework to handle requests that it cannot handle. First, configure Jersey to use a filter rather than a servlet by configuring the `spring.jersey.type` application property with a value of `filter`. Second, configure your `ResourceConfig` to forward requests that would have resulted in a 404, as shown in the following example. Java ``` import org.glassfish.jersey.server.ResourceConfig; import org.glassfish.jersey.servlet.ServletProperties; import org.springframework.stereotype.Component; @Component public class JerseyConfig extends ResourceConfig { public JerseyConfig() { register(Endpoint.class); property(ServletProperties.FILTER_FORWARD_ON_404, true); } } ``` Kotlin ``` import org.glassfish.jersey.server.ResourceConfig import org.glassfish.jersey.servlet.ServletProperties import org.springframework.stereotype.Component @Component class JerseyConfig : ResourceConfig() { init { register(Endpoint::class.java) property(ServletProperties.FILTER_FORWARD_ON_404, true) } } ``` 6. HTTP Clients ---------------- Spring Boot offers a number of starters that work with HTTP clients. This section answers questions related to using them. ### 6.1. Configure RestTemplate to Use a Proxy As described in [io.html](io#io.rest-client.resttemplate.customization), you can use a `RestTemplateCustomizer` with `RestTemplateBuilder` to build a customized `RestTemplate`. This is the recommended approach for creating a `RestTemplate` configured to use a proxy. The exact details of the proxy configuration depend on the underlying client request factory that is being used. ### 6.2. Configure the TcpClient used by a Reactor Netty-based WebClient When Reactor Netty is on the classpath, a Reactor Netty-based `WebClient` is auto-configured. To customize the client’s handling of network connections, provide a `ClientHttpConnector` bean.
The following example configures a 60 second connect timeout and adds a `ReadTimeoutHandler`: Java ``` import io.netty.channel.ChannelOption; import io.netty.handler.timeout.ReadTimeoutHandler; import reactor.netty.http.client.HttpClient; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.http.client.reactive.ClientHttpConnector; import org.springframework.http.client.reactive.ReactorClientHttpConnector; import org.springframework.http.client.reactive.ReactorResourceFactory; @Configuration(proxyBeanMethods = false) public class MyReactorNettyClientConfiguration { @Bean ClientHttpConnector clientHttpConnector(ReactorResourceFactory resourceFactory) { HttpClient httpClient = HttpClient.create(resourceFactory.getConnectionProvider()) .runOn(resourceFactory.getLoopResources()) .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 60000) .doOnConnected((connection) -> connection.addHandlerLast(new ReadTimeoutHandler(60))); return new ReactorClientHttpConnector(httpClient); } } ``` Kotlin ``` import io.netty.channel.ChannelOption import io.netty.handler.timeout.ReadTimeoutHandler import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.http.client.reactive.ClientHttpConnector import org.springframework.http.client.reactive.ReactorClientHttpConnector import org.springframework.http.client.reactive.ReactorResourceFactory import reactor.netty.http.client.HttpClient @Configuration(proxyBeanMethods = false) class MyReactorNettyClientConfiguration { @Bean fun clientHttpConnector(resourceFactory: ReactorResourceFactory): ClientHttpConnector { val httpClient = HttpClient.create(resourceFactory.connectionProvider) .runOn(resourceFactory.loopResources) .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 60000) .doOnConnected { connection -> connection.addHandlerLast(ReadTimeoutHandler(60)) } return ReactorClientHttpConnector(httpClient) } } ``` | | | | --- | --- | | | Note the use of `ReactorResourceFactory` for the connection provider and event loop resources. This ensures efficient sharing of resources for the server receiving requests and the client making requests. | 7. Logging ----------- Spring Boot has no mandatory logging dependency, except for the Commons Logging API, which is typically provided by Spring Framework’s `spring-jcl` module. To use [Logback](https://logback.qos.ch), you need to include it and `spring-jcl` on the classpath. The recommended way to do that is through the starters, which all depend on `spring-boot-starter-logging`. For a web application, you need only `spring-boot-starter-web`, since it depends transitively on the logging starter. If you use Maven, the following dependency adds logging for you: ``` <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-web</artifactId> </dependency> ``` Spring Boot has a `LoggingSystem` abstraction that attempts to configure logging based on the content of the classpath. If Logback is available, it is the first choice.
If the only change you need to make to logging is to set the levels of various loggers, you can do so in `application.properties` by using the "logging.level" prefix, as shown in the following example: Properties ``` logging.level.org.springframework.web=debug logging.level.org.hibernate=error ``` Yaml ``` logging: level: org.springframework.web: "debug" org.hibernate: "error" ``` You can also set the location of a file to which to write the log (in addition to the console) by using `logging.file.name`. To configure the more fine-grained settings of a logging system, you need to use the native configuration format supported by the `LoggingSystem` in question. By default, Spring Boot picks up the native configuration from its default location for the system (such as `classpath:logback.xml` for Logback), but you can set the location of the config file by using the `logging.config` property. ### 7.1. Configure Logback for Logging If you need to apply customizations to logback beyond those that can be achieved with `application.properties`, you will need to add a standard logback configuration file. You can add a `logback.xml` file to the root of your classpath for logback to find. You can also use `logback-spring.xml` if you want to use the [Spring Boot Logback extensions](features#features.logging.logback-extensions). | | | | --- | --- | | | The Logback documentation has a [dedicated section that covers configuration](https://logback.qos.ch/manual/configuration.html) in some detail. | Spring Boot provides a number of logback configurations that can be `included` from your own configuration. These includes are designed to allow certain common Spring Boot conventions to be re-applied. The following files are provided under `org/springframework/boot/logging/logback/`: * `defaults.xml` - Provides conversion rules, pattern properties and common logger configurations. * `console-appender.xml` - Adds a `ConsoleAppender` using the `CONSOLE_LOG_PATTERN`. * `file-appender.xml` - Adds a `RollingFileAppender` using the `FILE_LOG_PATTERN` and `ROLLING_FILE_NAME_PATTERN` with appropriate settings. In addition, a legacy `base.xml` file is provided for compatibility with earlier versions of Spring Boot. A typical custom `logback.xml` file would look something like this: ``` <?xml version="1.0" encoding="UTF-8"?> <configuration> <include resource="org/springframework/boot/logging/logback/defaults.xml"/> <include resource="org/springframework/boot/logging/logback/console-appender.xml" /> <root level="INFO"> <appender-ref ref="CONSOLE" /> </root> <logger name="org.springframework.web" level="DEBUG"/> </configuration> ``` Your logback configuration file can also make use of System properties that the `LoggingSystem` takes care of creating for you: * `${PID}`: The current process ID. * `${LOG_FILE}`: Whether `logging.file.name` was set in Boot’s external configuration. * `${LOG_PATH}`: Whether `logging.file.path` (representing a directory for log files to live in) was set in Boot’s external configuration. * `${LOG_EXCEPTION_CONVERSION_WORD}`: Whether `logging.exception-conversion-word` was set in Boot’s external configuration. * `${ROLLING_FILE_NAME_PATTERN}`: Whether `logging.pattern.rolling-file-name` was set in Boot’s external configuration. Spring Boot also provides some nice ANSI color terminal output on a console (but not in a log file) by using a custom Logback converter. See the `CONSOLE_LOG_PATTERN` in the `defaults.xml` configuration for an example.
If Groovy is on the classpath, you should be able to configure Logback with `logback.groovy` as well. If present, this setting is given preference. | | | | --- | --- | | | Spring extensions are not supported with Groovy configuration. Any `logback-spring.groovy` files will not be detected. | #### 7.1.1. Configure Logback for File-only Output If you want to disable console logging and write output only to a file, you need a custom `logback-spring.xml` that imports `file-appender.xml` but not `console-appender.xml`, as shown in the following example: ``` <?xml version="1.0" encoding="UTF-8"?> <configuration> <include resource="org/springframework/boot/logging/logback/defaults.xml" /> <property name="LOG_FILE" value="${LOG_FILE:-${LOG_PATH:-${LOG_TEMP:-${java.io.tmpdir:-/tmp}}/}spring.log}"/> <include resource="org/springframework/boot/logging/logback/file-appender.xml" /> <root level="INFO"> <appender-ref ref="FILE" /> </root> </configuration> ``` You also need to add `logging.file.name` to your `application.properties` or `application.yaml`, as shown in the following example: Properties ``` logging.file.name=myapplication.log ``` Yaml ``` logging: file: name: "myapplication.log" ``` ### 7.2. Configure Log4j for Logging Spring Boot supports [Log4j 2](https://logging.apache.org/log4j/2.x/) for logging configuration if it is on the classpath. If you use the starters for assembling dependencies, you have to exclude Logback and then include log4j 2 instead. If you do not use the starters, you need to provide (at least) `spring-jcl` in addition to Log4j 2. The recommended path is through the starters, even though it requires some jiggling. The following example shows how to set up the starters in Maven: ``` <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-web</artifactId> </dependency> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter</artifactId> <exclusions> <exclusion> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-logging</artifactId> </exclusion> </exclusions> </dependency> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-log4j2</artifactId> </dependency> ``` Gradle provides a few different ways to set up the starters. One way is to use a [module replacement](https://docs.gradle.org/current/userguide/resolution_rules.html#sec:module_replacement). To do so, declare a dependency on the Log4j 2 starter and tell Gradle that any occurrences of the default logging starter should be replaced by the Log4j 2 starter, as shown in the following example: ``` dependencies { implementation "org.springframework.boot:spring-boot-starter-log4j2" modules { module("org.springframework.boot:spring-boot-starter-logging") { replacedBy("org.springframework.boot:spring-boot-starter-log4j2", "Use Log4j2 instead of Logback") } } } ``` | | | | --- | --- | | | The Log4j starters gather together the dependencies for common logging requirements (such as having Tomcat use `java.util.logging` but configuring the output using Log4j 2). | | | | | --- | --- | | | To ensure that debug logging performed using `java.util.logging` is routed into Log4j 2, configure its [JDK logging adapter](https://logging.apache.org/log4j/2.x/log4j-jul/index.html) by setting the `java.util.logging.manager` system property to `org.apache.logging.log4j.jul.LogManager`. | #### 7.2.1. 
Use YAML or JSON to Configure Log4j 2 In addition to its default XML configuration format, Log4j 2 also supports YAML and JSON configuration files. To configure Log4j 2 to use an alternative configuration file format, add the appropriate dependencies to the classpath and name your configuration files to match your chosen file format, as shown in the following example: | Format | Dependencies | File names | | --- | --- | --- | | YAML | `com.fasterxml.jackson.core:jackson-databind` + `com.fasterxml.jackson.dataformat:jackson-dataformat-yaml` | `log4j2.yaml` + `log4j2.yml` | | JSON | `com.fasterxml.jackson.core:jackson-databind` | `log4j2.json` + `log4j2.jsn` | #### 7.2.2. Use Composite Configuration to Configure Log4j 2 Log4j 2 has support for combining multiple configuration files into a single composite configuration. To use this support in Spring Boot, configure `logging.log4j2.config.override` with the locations of one or more secondary configuration files. The secondary configuration files will be merged with the primary configuration, whether the primary’s source is Spring Boot’s defaults, a standard location such as `log4j.xml`, or the location configured by the `logging.config` property. 8. Data Access --------------- Spring Boot includes a number of starters for working with data sources. This section answers questions related to doing so. ### 8.1. Configure a Custom DataSource To configure your own `DataSource`, define a `@Bean` of that type in your configuration. Spring Boot reuses your `DataSource` anywhere one is required, including database initialization. If you need to externalize some settings, you can bind your `DataSource` to the environment (see “[features.html](features#features.external-config.typesafe-configuration-properties.third-party-configuration)”). The following example shows how to define a data source in a bean: Java ``` import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyDataSourceConfiguration { @Bean @ConfigurationProperties(prefix = "app.datasource") public SomeDataSource dataSource() { return new SomeDataSource(); } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyDataSourceConfiguration { @Bean @ConfigurationProperties(prefix = "app.datasource") fun dataSource(): SomeDataSource { return SomeDataSource() } } ``` The following example shows how to define a data source by setting properties: Properties ``` app.datasource.url=jdbc:h2:mem:mydb app.datasource.username=sa app.datasource.pool-size=30 ``` Yaml ``` app: datasource: url: "jdbc:h2:mem:mydb" username: "sa" pool-size: 30 ``` Assuming that `SomeDataSource` has regular JavaBean properties for the URL, the username, and the pool size, these settings are bound automatically before the `DataSource` is made available to other components. Spring Boot also provides a utility builder class, called `DataSourceBuilder`, that can be used to create one of the standard data sources (if it is on the classpath). The builder can detect the one to use based on what is available on the classpath. It also auto-detects the driver based on the JDBC URL. 
The following example shows how to create a data source by using a `DataSourceBuilder`: Java ``` import javax.sql.DataSource; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.jdbc.DataSourceBuilder; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyDataSourceConfiguration { @Bean @ConfigurationProperties("app.datasource") public DataSource dataSource() { return DataSourceBuilder.create().build(); } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.jdbc.DataSourceBuilder import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import javax.sql.DataSource @Configuration(proxyBeanMethods = false) class MyDataSourceConfiguration { @Bean @ConfigurationProperties("app.datasource") fun dataSource(): DataSource { return DataSourceBuilder.create().build() } } ``` To run an app with that `DataSource`, all you need is the connection information. Pool-specific settings can also be provided. Check the implementation that is going to be used at runtime for more details. The following example shows how to define a JDBC data source by setting properties: Properties ``` app.datasource.url=jdbc:mysql://localhost/test app.datasource.username=dbuser app.datasource.password=dbpass app.datasource.pool-size=30 ``` Yaml ``` app: datasource: url: "jdbc:mysql://localhost/test" username: "dbuser" password: "dbpass" pool-size: 30 ``` However, there is a catch. Because the actual type of the connection pool is not exposed, no keys are generated in the metadata for your custom `DataSource` and no completion is available in your IDE (because the `DataSource` interface exposes no properties). Also, if you happen to have Hikari on the classpath, this basic setup does not work, because Hikari has no `url` property (but does have a `jdbcUrl` property). In that case, you must rewrite your configuration as follows: Properties ``` app.datasource.jdbc-url=jdbc:mysql://localhost/test app.datasource.username=dbuser app.datasource.password=dbpass app.datasource.pool-size=30 ``` Yaml ``` app: datasource: jdbc-url: "jdbc:mysql://localhost/test" username: "dbuser" password: "dbpass" pool-size: 30 ``` You can fix that by forcing the connection pool to use and return a dedicated implementation rather than `DataSource`. You cannot change the implementation at runtime, but the list of options will be explicit. 
The following example shows how to create a `HikariDataSource` with `DataSourceBuilder`: Java ``` import com.zaxxer.hikari.HikariDataSource; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.jdbc.DataSourceBuilder; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyDataSourceConfiguration { @Bean @ConfigurationProperties("app.datasource") public HikariDataSource dataSource() { return DataSourceBuilder.create().type(HikariDataSource.class).build(); } } ``` Kotlin ``` import com.zaxxer.hikari.HikariDataSource import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.jdbc.DataSourceBuilder import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyDataSourceConfiguration { @Bean @ConfigurationProperties("app.datasource") fun dataSource(): HikariDataSource { return DataSourceBuilder.create().type(HikariDataSource::class.java).build() } } ``` You can even go further by leveraging what `DataSourceProperties` does for you — that is, by providing a default embedded database with a sensible username and password if no URL is provided. You can easily initialize a `DataSourceBuilder` from the state of any `DataSourceProperties` object, so you could also inject the DataSource that Spring Boot creates automatically. However, that would split your configuration into two namespaces: `url`, `username`, `password`, `type`, and `driver` on `spring.datasource` and the rest on your custom namespace (`app.datasource`). To avoid that, you can redefine a custom `DataSourceProperties` on your custom namespace, as shown in the following example: Java ``` import com.zaxxer.hikari.HikariDataSource; import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Primary; @Configuration(proxyBeanMethods = false) public class MyDataSourceConfiguration { @Bean @Primary @ConfigurationProperties("app.datasource") public DataSourceProperties dataSourceProperties() { return new DataSourceProperties(); } @Bean @ConfigurationProperties("app.datasource.configuration") public HikariDataSource dataSource(DataSourceProperties properties) { return properties.initializeDataSourceBuilder().type(HikariDataSource.class).build(); } } ``` Kotlin ``` import com.zaxxer.hikari.HikariDataSource import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.context.annotation.Primary @Configuration(proxyBeanMethods = false) class MyDataSourceConfiguration { @Bean @Primary @ConfigurationProperties("app.datasource") fun dataSourceProperties(): DataSourceProperties { return DataSourceProperties() } @Bean @ConfigurationProperties("app.datasource.configuration") fun dataSource(properties: DataSourceProperties): HikariDataSource { return properties.initializeDataSourceBuilder().type(HikariDataSource::class.java).build() } } ``` This setup puts you *in sync* with what Spring
Boot does for you by default, except that a dedicated connection pool is chosen (in code) and its settings are exposed in the `app.datasource.configuration` sub namespace. Because `DataSourceProperties` is taking care of the `url`/`jdbcUrl` translation for you, you can configure it as follows: Properties ``` app.datasource.url=jdbc:mysql://localhost/test app.datasource.username=dbuser app.datasource.password=dbpass app.datasource.configuration.maximum-pool-size=30 ``` Yaml ``` app: datasource: url: "jdbc:mysql://localhost/test" username: "dbuser" password: "dbpass" configuration: maximum-pool-size: 30 ``` | | | | --- | --- | | | Spring Boot will expose Hikari-specific settings to `spring.datasource.hikari`. This example uses a more generic `configuration` sub namespace as the example does not support multiple datasource implementations. | | | | | --- | --- | | | Because your custom configuration chooses to go with Hikari, `app.datasource.type` has no effect. In practice, the builder is initialized with whatever value you might set there and then overridden by the call to `.type()`. | See “[data.html](data#data.sql.datasource)” in the “Spring Boot features” section and the [`DataSourceAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/DataSourceAutoConfiguration.java) class for more details. ### 8.2. Configure Two DataSources If you need to configure multiple data sources, you can apply the same tricks that are described in the previous section. You must, however, mark one of the `DataSource` instances as `@Primary`, because various auto-configurations down the road expect to be able to get one by type. If you create your own `DataSource`, the auto-configuration backs off. 
In the following example, we provide the *exact* same feature set as the auto-configuration provides on the primary data source: Java ``` import com.zaxxer.hikari.HikariDataSource; import org.apache.commons.dbcp2.BasicDataSource; import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.jdbc.DataSourceBuilder; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Primary; @Configuration(proxyBeanMethods = false) public class MyDataSourcesConfiguration { @Bean @Primary @ConfigurationProperties("app.datasource.first") public DataSourceProperties firstDataSourceProperties() { return new DataSourceProperties(); } @Bean @Primary @ConfigurationProperties("app.datasource.first.configuration") public HikariDataSource firstDataSource(DataSourceProperties firstDataSourceProperties) { return firstDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build(); } @Bean @ConfigurationProperties("app.datasource.second") public BasicDataSource secondDataSource() { return DataSourceBuilder.create().type(BasicDataSource.class).build(); } } ``` Kotlin ``` import com.zaxxer.hikari.HikariDataSource import org.apache.commons.dbcp2.BasicDataSource import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.jdbc.DataSourceBuilder import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.context.annotation.Primary @Configuration(proxyBeanMethods = false) class MyDataSourcesConfiguration { @Bean @Primary @ConfigurationProperties("app.datasource.first") fun firstDataSourceProperties(): DataSourceProperties { return DataSourceProperties() } @Bean @Primary @ConfigurationProperties("app.datasource.first.configuration") fun firstDataSource(firstDataSourceProperties: DataSourceProperties): HikariDataSource { return firstDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource::class.java).build() } @Bean @ConfigurationProperties("app.datasource.second") fun secondDataSource(): BasicDataSource { return DataSourceBuilder.create().type(BasicDataSource::class.java).build() } } ``` | | | | --- | --- | | | `firstDataSourceProperties` has to be flagged as `@Primary` so that the database initializer feature uses your copy (if you use the initializer). | Both data sources are also bound for advanced customizations. 
For instance, you could configure them as follows: Properties ``` app.datasource.first.url=jdbc:mysql://localhost/first app.datasource.first.username=dbuser app.datasource.first.password=dbpass app.datasource.first.configuration.maximum-pool-size=30 app.datasource.second.url=jdbc:mysql://localhost/second app.datasource.second.username=dbuser app.datasource.second.password=dbpass app.datasource.second.max-total=30 ``` Yaml ``` app: datasource: first: url: "jdbc:mysql://localhost/first" username: "dbuser" password: "dbpass" configuration: maximum-pool-size: 30 second: url: "jdbc:mysql://localhost/second" username: "dbuser" password: "dbpass" max-total: 30 ``` You can apply the same concept to the secondary `DataSource` as well, as shown in the following example: Java ``` import com.zaxxer.hikari.HikariDataSource; import org.apache.commons.dbcp2.BasicDataSource; import org.springframework.beans.factory.annotation.Qualifier; import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Primary; @Configuration(proxyBeanMethods = false) public class MyCompleteDataSourcesConfiguration { @Bean @Primary @ConfigurationProperties("app.datasource.first") public DataSourceProperties firstDataSourceProperties() { return new DataSourceProperties(); } @Bean @Primary @ConfigurationProperties("app.datasource.first.configuration") public HikariDataSource firstDataSource(DataSourceProperties firstDataSourceProperties) { return firstDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build(); } @Bean @ConfigurationProperties("app.datasource.second") public DataSourceProperties secondDataSourceProperties() { return new DataSourceProperties(); } @Bean @ConfigurationProperties("app.datasource.second.configuration") public BasicDataSource secondDataSource( @Qualifier("secondDataSourceProperties") DataSourceProperties secondDataSourceProperties) { return secondDataSourceProperties.initializeDataSourceBuilder().type(BasicDataSource.class).build(); } } ``` Kotlin ``` import com.zaxxer.hikari.HikariDataSource import org.apache.commons.dbcp2.BasicDataSource import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.context.annotation.Primary @Configuration(proxyBeanMethods = false) class MyCompleteDataSourcesConfiguration { @Bean @Primary @ConfigurationProperties("app.datasource.first") fun firstDataSourceProperties(): DataSourceProperties { return DataSourceProperties() } @Bean @Primary @ConfigurationProperties("app.datasource.first.configuration") fun firstDataSource(firstDataSourceProperties: DataSourceProperties): HikariDataSource { return firstDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource::class.java).build() } @Bean @ConfigurationProperties("app.datasource.second") fun secondDataSourceProperties(): DataSourceProperties { return DataSourceProperties() } @Bean @ConfigurationProperties("app.datasource.second.configuration") fun secondDataSource(secondDataSourceProperties: DataSourceProperties): BasicDataSource { return 
secondDataSourceProperties.initializeDataSourceBuilder().type(BasicDataSource::class.java).build() } } ``` The preceding example configures two data sources on custom namespaces with the same logic as Spring Boot would use in auto-configuration. Note that each `configuration` sub namespace provides advanced settings based on the chosen implementation. ### 8.3. Use Spring Data Repositories Spring Data can create implementations of `@Repository` interfaces of various flavors. Spring Boot handles all of that for you, as long as those `@Repositories` are included in the same package (or a sub-package) of your `@EnableAutoConfiguration` class. For many applications, all you need is to put the right Spring Data dependencies on your classpath. There is a `spring-boot-starter-data-jpa` for JPA, `spring-boot-starter-data-mongodb` for Mongodb, and various other starters for supported technologies. To get started, create some repository interfaces to handle your `@Entity` objects. Spring Boot tries to guess the location of your `@Repository` definitions, based on the `@EnableAutoConfiguration` it finds. To get more control, use the `@EnableJpaRepositories` annotation (from Spring Data JPA). For more about Spring Data, see the [Spring Data project page](https://spring.io/projects/spring-data). ### 8.4. Separate @Entity Definitions from Spring Configuration Spring Boot tries to guess the location of your `@Entity` definitions, based on the `@EnableAutoConfiguration` it finds. To get more control, you can use the `@EntityScan` annotation, as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.EnableAutoConfiguration; import org.springframework.boot.autoconfigure.domain.EntityScan; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) @EnableAutoConfiguration @EntityScan(basePackageClasses = City.class) public class MyApplication { // ... } ``` Kotlin ``` import org.springframework.boot.autoconfigure.EnableAutoConfiguration import org.springframework.boot.autoconfigure.domain.EntityScan import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) @EnableAutoConfiguration @EntityScan(basePackageClasses = [City::class]) class MyApplication { // ... } ``` ### 8.5. Configure JPA Properties Spring Data JPA already provides some vendor-independent configuration options (such as those for SQL logging), and Spring Boot exposes those options and a few more for Hibernate as external configuration properties. Some of them are automatically detected according to the context so you should not have to set them. The `spring.jpa.hibernate.ddl-auto` is a special case, because, depending on runtime conditions, it has different defaults. If an embedded database is used and no schema manager (such as Liquibase or Flyway) is handling the `DataSource`, it defaults to `create-drop`. In all other cases, it defaults to `none`. The dialect to use is detected by the JPA provider. If you prefer to set the dialect yourself, set the `spring.jpa.database-platform` property. 
The most common options to set are shown in the following example: Properties ``` spring.jpa.hibernate.naming.physical-strategy=com.example.MyPhysicalNamingStrategy spring.jpa.show-sql=true ``` Yaml ``` spring: jpa: hibernate: naming: physical-strategy: "com.example.MyPhysicalNamingStrategy" show-sql: true ``` In addition, all properties in `spring.jpa.properties.*` are passed through as normal JPA properties (with the prefix stripped) when the local `EntityManagerFactory` is created. | | | | --- | --- | | | You need to ensure that names defined under `spring.jpa.properties.*` exactly match those expected by your JPA provider. Spring Boot will not attempt any kind of relaxed binding for these entries. For example, if you want to configure Hibernate’s batch size you must use `spring.jpa.properties.hibernate.jdbc.batch_size`. If you use other forms, such as `batchSize` or `batch-size`, Hibernate will not apply the setting. | | | | | --- | --- | | | If you need to apply advanced customization to Hibernate properties, consider registering a `HibernatePropertiesCustomizer` bean that will be invoked prior to creating the `EntityManagerFactory`. This takes precedence to anything that is applied by the auto-configuration. | ### 8.6. Configure Hibernate Naming Strategy Hibernate uses [two different naming strategies](https://docs.jboss.org/hibernate/orm/5.4/userguide/html_single/Hibernate_User_Guide.html#naming) to map names from the object model to the corresponding database names. The fully qualified class name of the physical and the implicit strategy implementations can be configured by setting the `spring.jpa.hibernate.naming.physical-strategy` and `spring.jpa.hibernate.naming.implicit-strategy` properties, respectively. Alternatively, if `ImplicitNamingStrategy` or `PhysicalNamingStrategy` beans are available in the application context, Hibernate will be automatically configured to use them. By default, Spring Boot configures the physical naming strategy with `CamelCaseToUnderscoresNamingStrategy`. Using this strategy, all dots are replaced by underscores and camel casing is replaced by underscores as well. Additionally, by default, all table names are generated in lower case. For example, a `TelephoneNumber` entity is mapped to the `telephone_number` table. 
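As noted above, the naming strategies can also be supplied as beans instead of properties. The following is a minimal sketch of that approach, assuming you want Hibernate's JPA-compliant implicit naming rules (the configuration class name and the chosen strategy are illustrative only):

Java

```
import org.hibernate.boot.model.naming.ImplicitNamingStrategy;
import org.hibernate.boot.model.naming.ImplicitNamingStrategyJpaCompliantImpl;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration(proxyBeanMethods = false)
public class MyImplicitNamingStrategyConfiguration {

    @Bean
    public ImplicitNamingStrategy implicitNamingStrategy() {
        // Any ImplicitNamingStrategy bean in the context is picked up automatically
        return new ImplicitNamingStrategyJpaCompliantImpl();
    }

}
```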
If your schema requires mixed-case identifiers, define a custom `CamelCaseToUnderscoresNamingStrategy` bean, as shown in the following example: Java ``` import org.hibernate.boot.model.naming.CamelCaseToUnderscoresNamingStrategy; import org.hibernate.engine.jdbc.env.spi.JdbcEnvironment; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyHibernateConfiguration { @Bean public CamelCaseToUnderscoresNamingStrategy caseSensitivePhysicalNamingStrategy() { return new CamelCaseToUnderscoresNamingStrategy() { @Override protected boolean isCaseInsensitive(JdbcEnvironment jdbcEnvironment) { return false; } }; } } ``` Kotlin ``` import org.hibernate.boot.model.naming.CamelCaseToUnderscoresNamingStrategy import org.hibernate.engine.jdbc.env.spi.JdbcEnvironment import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyHibernateConfiguration { @Bean fun caseSensitivePhysicalNamingStrategy(): CamelCaseToUnderscoresNamingStrategy { return object : CamelCaseToUnderscoresNamingStrategy() { override fun isCaseInsensitive(jdbcEnvironment: JdbcEnvironment): Boolean { return false } } } } ``` If you prefer to use Hibernate 5’s default instead, set the following property: ``` spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl ``` Alternatively, you can configure the following bean: Java ``` import org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) class MyHibernateConfiguration { @Bean PhysicalNamingStrategyStandardImpl caseSensitivePhysicalNamingStrategy() { return new PhysicalNamingStrategyStandardImpl(); } } ``` Kotlin ``` import org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) internal class MyHibernateConfiguration { @Bean fun caseSensitivePhysicalNamingStrategy(): PhysicalNamingStrategyStandardImpl { return PhysicalNamingStrategyStandardImpl() } } ``` See [`HibernateJpaAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaAutoConfiguration.java) and [`JpaBaseConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/orm/jpa/JpaBaseConfiguration.java) for more details. ### 8.7. Configure Hibernate Second-Level Caching Hibernate [second-level cache](https://docs.jboss.org/hibernate/orm/5.4/userguide/html_single/Hibernate_User_Guide.html#caching) can be configured for a range of cache providers. Rather than configuring Hibernate to lookup the cache provider again, it is better to provide the one that is available in the context whenever possible. To do this with JCache, first make sure that `org.hibernate:hibernate-jcache` is available on the classpath. 
Then, add a `HibernatePropertiesCustomizer` bean as shown in the following example: Java ``` import org.hibernate.cache.jcache.ConfigSettings; import org.springframework.boot.autoconfigure.orm.jpa.HibernatePropertiesCustomizer; import org.springframework.cache.jcache.JCacheCacheManager; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyHibernateSecondLevelCacheConfiguration { @Bean public HibernatePropertiesCustomizer hibernateSecondLevelCacheCustomizer(JCacheCacheManager cacheManager) { return (properties) -> properties.put(ConfigSettings.CACHE\_MANAGER, cacheManager.getCacheManager()); } } ``` Kotlin ``` import org.hibernate.cache.jcache.ConfigSettings import org.springframework.boot.autoconfigure.orm.jpa.HibernatePropertiesCustomizer import org.springframework.cache.jcache.JCacheCacheManager import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyHibernateSecondLevelCacheConfiguration { @Bean fun hibernateSecondLevelCacheCustomizer(cacheManager: JCacheCacheManager): HibernatePropertiesCustomizer { return HibernatePropertiesCustomizer { properties -> properties[ConfigSettings.CACHE\_MANAGER] = cacheManager.cacheManager } } } ``` This customizer will configure Hibernate to use the same `CacheManager` as the one that the application uses. It is also possible to use separate `CacheManager` instances. For details, see [the Hibernate user guide](https://docs.jboss.org/hibernate/orm/5.4/userguide/html_single/Hibernate_User_Guide.html#caching-provider-jcache). ### 8.8. Use Dependency Injection in Hibernate Components By default, Spring Boot registers a `BeanContainer` implementation that uses the `BeanFactory` so that converters and entity listeners can use regular dependency injection. You can disable or tune this behavior by registering a `HibernatePropertiesCustomizer` that removes or changes the `hibernate.resource.beans.container` property. ### 8.9. Use a Custom EntityManagerFactory To take full control of the configuration of the `EntityManagerFactory`, you need to add a `@Bean` named ‘entityManagerFactory’. Spring Boot auto-configuration switches off its entity manager in the presence of a bean of that type. ### 8.10. Using Multiple EntityManagerFactories If you need to use JPA against multiple data sources, you likely need one `EntityManagerFactory` per data source. The `LocalContainerEntityManagerFactoryBean` from Spring ORM allows you to configure an `EntityManagerFactory` for your needs. 
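In its simplest form this can be a plain Spring ORM bean definition. The following is a minimal sketch, assuming a `secondDataSource` bean and a `com.example.customer` entity package (both names are illustrative only):

Java

```
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

@Configuration(proxyBeanMethods = false)
public class MySecondEntityManagerFactoryConfiguration {

    @Bean
    public LocalContainerEntityManagerFactoryBean secondEntityManagerFactory(DataSource secondDataSource) {
        LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
        factoryBean.setDataSource(secondDataSource);
        factoryBean.setPackagesToScan("com.example.customer"); // assumed entity package
        factoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        factoryBean.setPersistenceUnitName("secondDs");
        return factoryBean;
    }

}
```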
You can also reuse `JpaProperties` to bind settings for each `EntityManagerFactory`, as shown in the following example:

Java

```
import javax.sql.DataSource;

import org.springframework.boot.autoconfigure.orm.jpa.JpaProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaVendorAdapter;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

@Configuration(proxyBeanMethods = false)
public class MyEntityManagerFactoryConfiguration {

    @Bean
    @ConfigurationProperties("app.jpa.first")
    public JpaProperties firstJpaProperties() {
        return new JpaProperties();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean firstEntityManagerFactory(DataSource firstDataSource,
            JpaProperties firstJpaProperties) {
        EntityManagerFactoryBuilder builder = createEntityManagerFactoryBuilder(firstJpaProperties);
        return builder.dataSource(firstDataSource).packages(Order.class).persistenceUnit("firstDs").build();
    }

    private EntityManagerFactoryBuilder createEntityManagerFactoryBuilder(JpaProperties jpaProperties) {
        JpaVendorAdapter jpaVendorAdapter = createJpaVendorAdapter(jpaProperties);
        return new EntityManagerFactoryBuilder(jpaVendorAdapter, jpaProperties.getProperties(), null);
    }

    private JpaVendorAdapter createJpaVendorAdapter(JpaProperties jpaProperties) {
        // ... map JPA properties as needed
        return new HibernateJpaVendorAdapter();
    }

}
```

Kotlin

```
import org.springframework.boot.autoconfigure.orm.jpa.JpaProperties
import org.springframework.boot.context.properties.ConfigurationProperties
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.orm.jpa.JpaVendorAdapter
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter
import javax.sql.DataSource

@Configuration(proxyBeanMethods = false)
class MyEntityManagerFactoryConfiguration {

    @Bean
    @ConfigurationProperties("app.jpa.first")
    fun firstJpaProperties(): JpaProperties {
        return JpaProperties()
    }

    @Bean
    fun firstEntityManagerFactory(
        firstDataSource: DataSource?,
        firstJpaProperties: JpaProperties
    ): LocalContainerEntityManagerFactoryBean {
        val builder = createEntityManagerFactoryBuilder(firstJpaProperties)
        return builder.dataSource(firstDataSource).packages(Order::class.java).persistenceUnit("firstDs").build()
    }

    private fun createEntityManagerFactoryBuilder(jpaProperties: JpaProperties): EntityManagerFactoryBuilder {
        val jpaVendorAdapter = createJpaVendorAdapter(jpaProperties)
        return EntityManagerFactoryBuilder(jpaVendorAdapter, jpaProperties.properties, null)
    }

    private fun createJpaVendorAdapter(jpaProperties: JpaProperties): JpaVendorAdapter {
        // ... map JPA properties as needed
        return HibernateJpaVendorAdapter()
    }

}
```

The example above creates an `EntityManagerFactory` using a `DataSource` bean named `firstDataSource`. It scans entities located in the same package as `Order`. It is possible to map additional JPA properties using the `app.jpa.first` namespace.
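Each `EntityManagerFactory` also needs its own transaction manager, as described below. A minimal sketch for the factory defined above might look like the following (the bean names are assumptions):

Java

```
import javax.persistence.EntityManagerFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration(proxyBeanMethods = false)
public class MyTransactionManagerConfiguration {

    @Bean
    public PlatformTransactionManager firstTransactionManager(EntityManagerFactory firstEntityManagerFactory) {
        // Scopes JPA transactions to the first persistence unit
        return new JpaTransactionManager(firstEntityManagerFactory);
    }

}
```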
| | | | --- | --- | | | When you create a bean for `LocalContainerEntityManagerFactoryBean` yourself, any customization that was applied during the creation of the auto-configured `LocalContainerEntityManagerFactoryBean` is lost. For example, in case of Hibernate, any properties under the `spring.jpa.hibernate` prefix will not be automatically applied to your `LocalContainerEntityManagerFactoryBean`. If you were relying on these properties for configuring things like the naming strategy or the DDL mode, you will need to explicitly configure that when creating the `LocalContainerEntityManagerFactoryBean` bean. | You should provide a similar configuration for any additional data sources for which you need JPA access. To complete the picture, you need to configure a `JpaTransactionManager` for each `EntityManagerFactory` as well. Alternatively, you might be able to use a JTA transaction manager that spans both. If you use Spring Data, you need to configure `@EnableJpaRepositories` accordingly, as shown in the following examples: Java ``` import org.springframework.context.annotation.Configuration; import org.springframework.data.jpa.repository.config.EnableJpaRepositories; @Configuration(proxyBeanMethods = false) @EnableJpaRepositories(basePackageClasses = Order.class, entityManagerFactoryRef = "firstEntityManagerFactory") public class OrderConfiguration { } ``` Kotlin ``` import org.springframework.context.annotation.Configuration import org.springframework.data.jpa.repository.config.EnableJpaRepositories @Configuration(proxyBeanMethods = false) @EnableJpaRepositories(basePackageClasses = [Order::class], entityManagerFactoryRef = "firstEntityManagerFactory") class OrderConfiguration ``` Java ``` import org.springframework.context.annotation.Configuration; import org.springframework.data.jpa.repository.config.EnableJpaRepositories; @Configuration(proxyBeanMethods = false) @EnableJpaRepositories(basePackageClasses = Customer.class, entityManagerFactoryRef = "secondEntityManagerFactory") public class CustomerConfiguration { } ``` Kotlin ``` import org.springframework.context.annotation.Configuration import org.springframework.data.jpa.repository.config.EnableJpaRepositories @Configuration(proxyBeanMethods = false) @EnableJpaRepositories(basePackageClasses = [Customer::class], entityManagerFactoryRef = "secondEntityManagerFactory") class CustomerConfiguration ``` ### 8.11. Use a Traditional persistence.xml File Spring Boot will not search for or use a `META-INF/persistence.xml` by default. If you prefer to use a traditional `persistence.xml`, you need to define your own `@Bean` of type `LocalEntityManagerFactoryBean` (with an ID of ‘entityManagerFactory’) and set the persistence unit name there. See [`JpaBaseConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/orm/jpa/JpaBaseConfiguration.java) for the default settings. ### 8.12. Use Spring Data JPA and Mongo Repositories Spring Data JPA and Spring Data Mongo can both automatically create `Repository` implementations for you. If they are both present on the classpath, you might have to do some extra configuration to tell Spring Boot which repositories to create. The most explicit way to do that is to use the standard Spring Data `@EnableJpaRepositories` and `@EnableMongoRepositories` annotations and provide the location of your `Repository` interfaces. 
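For example, a configuration along the following lines enables each repository type for a separate package (a sketch; the package names are assumptions for illustration):

Java

```
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration(proxyBeanMethods = false)
@EnableJpaRepositories(basePackages = "com.example.jpa")
@EnableMongoRepositories(basePackages = "com.example.mongo")
public class MyRepositoriesConfiguration {

}
```

With such a configuration, only interfaces under `com.example.jpa` are backed by JPA and only those under `com.example.mongo` by MongoDB.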
There are also flags (`spring.data.*.repositories.enabled` and `spring.data.*.repositories.type`) that you can use to switch the auto-configured repositories on and off in external configuration. Doing so is useful, for instance, in case you want to switch off the Mongo repositories and still use the auto-configured `MongoTemplate`. The same obstacle and the same features exist for other auto-configured Spring Data repository types (Elasticsearch, Solr, and others). To work with them, change the names of the annotations and flags accordingly. ### 8.13. Customize Spring Data’s Web Support Spring Data provides web support that simplifies the use of Spring Data repositories in a web application. Spring Boot provides properties in the `spring.data.web` namespace for customizing its configuration. Note that if you are using Spring Data REST, you must use the properties in the `spring.data.rest` namespace instead. ### 8.14. Expose Spring Data Repositories as REST Endpoint Spring Data REST can expose the `Repository` implementations as REST endpoints for you, provided Spring MVC has been enabled for the application. Spring Boot exposes a set of useful properties (from the `spring.data.rest` namespace) that customize the [`RepositoryRestConfiguration`](https://docs.spring.io/spring-data/rest/docs/3.7.0/api/org/springframework/data/rest/core/config/RepositoryRestConfiguration.html). If you need to provide additional customization, you should use a [`RepositoryRestConfigurer`](https://docs.spring.io/spring-data/rest/docs/3.7.0/api/org/springframework/data/rest/webmvc/config/RepositoryRestConfigurer.html) bean. | | | | --- | --- | | | If you do not specify any order on your custom `RepositoryRestConfigurer`, it runs after the one Spring Boot uses internally. If you need to specify an order, make sure it is higher than 0. | ### 8.15. Configure a Component that is Used by JPA If you want to configure a component that JPA uses, then you need to ensure that the component is initialized before JPA. When the component is auto-configured, Spring Boot takes care of this for you. For example, when Flyway is auto-configured, Hibernate is configured to depend upon Flyway so that Flyway has a chance to initialize the database before Hibernate tries to use it. If you are configuring a component yourself, you can use an `EntityManagerFactoryDependsOnPostProcessor` subclass as a convenient way of setting up the necessary dependencies. For example, if you use Hibernate Search with Elasticsearch as its index manager, any `EntityManagerFactory` beans must be configured to depend on the `elasticsearchClient` bean, as shown in the following example: Java ``` import javax.persistence.EntityManagerFactory; import org.springframework.boot.autoconfigure.orm.jpa.EntityManagerFactoryDependsOnPostProcessor; import org.springframework.stereotype.Component; /\*\* \* {@link EntityManagerFactoryDependsOnPostProcessor} that ensures that \* {@link EntityManagerFactory} beans depend on the {@code elasticsearchClient} bean. 
\*/ @Component public class ElasticsearchEntityManagerFactoryDependsOnPostProcessor extends EntityManagerFactoryDependsOnPostProcessor { public ElasticsearchEntityManagerFactoryDependsOnPostProcessor() { super("elasticsearchClient"); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.orm.jpa.EntityManagerFactoryDependsOnPostProcessor import org.springframework.stereotype.Component @Component class ElasticsearchEntityManagerFactoryDependsOnPostProcessor : EntityManagerFactoryDependsOnPostProcessor("elasticsearchClient") ``` ### 8.16. Configure jOOQ with Two DataSources If you need to use jOOQ with multiple data sources, you should create your own `DSLContext` for each one. See [JooqAutoConfiguration](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jooq/JooqAutoConfiguration.java) for more details. | | | | --- | --- | | | In particular, `JooqExceptionTranslator` and `SpringTransactionProvider` can be reused to provide similar features to what the auto-configuration does with a single `DataSource`. | 9. Database Initialization --------------------------- An SQL database can be initialized in different ways depending on what your stack is. Of course, you can also do it manually, provided the database is a separate process. It is recommended to use a single mechanism for schema generation. ### 9.1. Initialize a Database Using JPA JPA has features for DDL generation, and these can be set up to run on startup against the database. This is controlled through two external properties: * `spring.jpa.generate-ddl` (boolean) switches the feature on and off and is vendor independent. * `spring.jpa.hibernate.ddl-auto` (enum) is a Hibernate feature that controls the behavior in a more fine-grained way. This feature is described in more detail later in this guide. ### 9.2. Initialize a Database Using Hibernate You can set `spring.jpa.hibernate.ddl-auto` explicitly and the standard Hibernate property values are `none`, `validate`, `update`, `create`, and `create-drop`. Spring Boot chooses a default value for you based on whether it thinks your database is embedded. It defaults to `create-drop` if no schema manager has been detected or `none` in all other cases. An embedded database is detected by looking at the `Connection` type and JDBC url. `hsqldb`, `h2`, and `derby` are candidates, and others are not. Be careful when switching from in-memory to a ‘real’ database that you do not make assumptions about the existence of the tables and data in the new platform. You either have to set `ddl-auto` explicitly or use one of the other mechanisms to initialize the database. | | | | --- | --- | | | You can output the schema creation by enabling the `org.hibernate.SQL` logger. This is done for you automatically if you enable the [debug mode](features#features.logging.console-output). | In addition, a file named `import.sql` in the root of the classpath is executed on startup if Hibernate creates the schema from scratch (that is, if the `ddl-auto` property is set to `create` or `create-drop`). This can be useful for demos and for testing if you are careful but is probably not something you want to be on the classpath in production. It is a Hibernate feature (and has nothing to do with Spring). ### 9.3. Initialize a Database Using Basic SQL Scripts Spring Boot can automatically create the schema (DDL scripts) of your JDBC `DataSource` or R2DBC `ConnectionFactory` and initialize it (DML scripts). 
It loads SQL from the standard root classpath locations: `schema.sql` and `data.sql`, respectively. In addition, Spring Boot processes the `schema-${platform}.sql` and `data-${platform}.sql` files (if present), where `platform` is the value of `spring.sql.init.platform`. This allows you to switch to database-specific scripts if necessary. For example, you might choose to set it to the vendor name of the database (`hsqldb`, `h2`, `oracle`, `mysql`, `postgresql`, and so on). By default, SQL database initialization is only performed when using an embedded in-memory database. To always initialize an SQL database, irrespective of its type, set `spring.sql.init.mode` to `always`. Similarly, to disable initialization, set `spring.sql.init.mode` to `never`. By default, Spring Boot enables the fail-fast feature of its script-based database initializer. This means that, if the scripts cause exceptions, the application fails to start. You can tune that behavior by setting `spring.sql.init.continue-on-error`. Script-based `DataSource` initialization is performed, by default, before any JPA `EntityManagerFactory` beans are created. `schema.sql` can be used to create the schema for JPA-managed entities and `data.sql` can be used to populate it. While we do not recommend using multiple data source initialization technologies, if you want script-based `DataSource` initialization to be able to build upon the schema creation performed by Hibernate, set `spring.jpa.defer-datasource-initialization` to `true`. This will defer data source initialization until after any `EntityManagerFactory` beans have been created and initialized. `schema.sql` can then be used to make additions to any schema creation performed by Hibernate and `data.sql` can be used to populate it. If you are using a [Higher-level Database Migration Tool](#howto.data-initialization.migration-tool), like Flyway or Liquibase, you should use them alone to create and initialize the schema. Using the basic `schema.sql` and `data.sql` scripts alongside Flyway or Liquibase is not recommended and support will be removed in a future release. ### 9.4. Initialize a Spring Batch Database If you use Spring Batch, it comes pre-packaged with SQL initialization scripts for most popular database platforms. Spring Boot can detect your database type and execute those scripts on startup. If you use an embedded database, this happens by default. You can also enable it for any database type, as shown in the following example: Properties ``` spring.batch.jdbc.initialize-schema=always ``` Yaml ``` spring: batch: jdbc: initialize-schema: "always" ``` You can also switch off the initialization explicitly by setting `spring.batch.jdbc.initialize-schema` to `never`. ### 9.5. Use a Higher-level Database Migration Tool Spring Boot supports two higher-level migration tools: [Flyway](https://flywaydb.org/) and [Liquibase](https://www.liquibase.org/). #### 9.5.1. Execute Flyway Database Migrations on Startup To automatically run Flyway database migrations on startup, add the `org.flywaydb:flyway-core` to your classpath. Typically, migrations are scripts in the form `V<VERSION>__<NAME>.sql` (with `<VERSION>` an underscore-separated version, such as ‘1’ or ‘2\_1’). By default, they are in a directory called `classpath:db/migration`, but you can modify that location by setting `spring.flyway.locations`. This is a comma-separated list of one or more `classpath:` or `filesystem:` locations. 
For example, the following configuration would search for scripts in both the default classpath location and the `/opt/migration` directory: Properties ``` spring.flyway.locations=classpath:db/migration,filesystem:/opt/migration ``` Yaml ``` spring: flyway: locations: "classpath:db/migration,filesystem:/opt/migration" ``` You can also add a special `{vendor}` placeholder to use vendor-specific scripts. Assume the following: Properties ``` spring.flyway.locations=classpath:db/migration/{vendor} ``` Yaml ``` spring: flyway: locations: "classpath:db/migration/{vendor}" ``` Rather than using `db/migration`, the preceding configuration sets the directory to use according to the type of the database (such as `db/migration/mysql` for MySQL). The list of supported databases is available in [`DatabaseDriver`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/jdbc/DatabaseDriver.java). Migrations can also be written in Java. Flyway will be auto-configured with any beans that implement `JavaMigration`. [`FlywayProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/flyway/FlywayProperties.java) provides most of Flyway’s settings and a small set of additional properties that can be used to disable the migrations or switch off the location checking. If you need more control over the configuration, consider registering a `FlywayConfigurationCustomizer` bean. Spring Boot calls `Flyway.migrate()` to perform the database migration. If you would like more control, provide a `@Bean` that implements [`FlywayMigrationStrategy`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/flyway/FlywayMigrationStrategy.java). Flyway supports SQL and Java [callbacks](https://flywaydb.org/documentation/concepts/callbacks). To use SQL-based callbacks, place the callback scripts in the `classpath:db/migration` directory. To use Java-based callbacks, create one or more beans that implement `Callback`. Any such beans are automatically registered with `Flyway`. They can be ordered by using `@Order` or by implementing `Ordered`. Beans that implement the deprecated `FlywayCallback` interface can also be detected, however they cannot be used alongside `Callback` beans. By default, Flyway autowires the (`@Primary`) `DataSource` in your context and uses that for migrations. If you like to use a different `DataSource`, you can create one and mark its `@Bean` as `@FlywayDataSource`. If you do so and want two data sources, remember to create another one and mark it as `@Primary`. Alternatively, you can use Flyway’s native `DataSource` by setting `spring.flyway.[url,user,password]` in external properties. Setting either `spring.flyway.url` or `spring.flyway.user` is sufficient to cause Flyway to use its own `DataSource`. If any of the three properties has not been set, the value of its equivalent `spring.datasource` property will be used. You can also use Flyway to provide data for specific scenarios. For example, you can place test-specific migrations in `src/test/resources` and they are run only when your application starts for testing. Also, you can use profile-specific configuration to customize `spring.flyway.locations` so that certain migrations run only when a particular profile is active. 
For example, in `application-dev.properties`, you might specify the following setting:

Properties

```
spring.flyway.locations=classpath:/db/migration,classpath:/dev/db/migration
```

Yaml

```
spring:
  flyway:
    locations: "classpath:/db/migration,classpath:/dev/db/migration"
```

With that setup, migrations in `dev/db/migration` run only when the `dev` profile is active.

#### 9.5.2. Execute Liquibase Database Migrations on Startup

To automatically run Liquibase database migrations on startup, add `org.liquibase:liquibase-core` to your classpath.

| | |
| --- | --- |
| | When you add `org.liquibase:liquibase-core` to your classpath, database migrations run by default both during application startup and before your tests run. This behavior can be customized by using the `spring.liquibase.enabled` property, setting different values in the `main` and `test` configurations. It is not possible to use two different ways to initialize the database (for example Liquibase for application startup, JPA for test runs). |

By default, the master change log is read from `db/changelog/db.changelog-master.yaml`, but you can change the location by setting `spring.liquibase.change-log`. In addition to YAML, Liquibase also supports JSON, XML, and SQL change log formats.

By default, Liquibase autowires the (`@Primary`) `DataSource` in your context and uses that for migrations. If you need to use a different `DataSource`, you can create one and mark its `@Bean` as `@LiquibaseDataSource`. If you do so and you want two data sources, remember to create another one and mark it as `@Primary`. Alternatively, you can use Liquibase’s native `DataSource` by setting `spring.liquibase.[driver-class-name,url,user,password]` in external properties. Setting either `spring.liquibase.url` or `spring.liquibase.user` is sufficient to cause Liquibase to use its own `DataSource`. If any of the three properties has not been set, the value of its equivalent `spring.datasource` property will be used.

See [`LiquibaseProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/liquibase/LiquibaseProperties.java) for details about available settings such as contexts, the default schema, and others.

### 9.6. Depend Upon an Initialized Database

Database initialization is performed while the application is starting up as part of application context refresh. To allow an initialized database to be accessed during startup, beans that act as database initializers and beans that require that database to have been initialized are detected automatically. Beans whose initialization depends upon the database having been initialized are configured to depend upon those that initialize it. If, during startup, your application tries to access the database and it has not been initialized, you can configure additional detection of beans that initialize the database and require the database to have been initialized.

#### 9.6.1. Detect a Database Initializer

Spring Boot will automatically detect beans of the following types that initialize an SQL database:

* `DataSourceScriptDatabaseInitializer`
* `EntityManagerFactory`
* `Flyway`
* `FlywayMigrationInitializer`
* `R2dbcScriptDatabaseInitializer`
* `SpringLiquibase`

If you are using a third-party starter for a database initialization library, it may provide a detector such that beans of other types are also detected automatically.
To have other beans be detected, register an implementation of `DatabaseInitializerDetector` in `META-INF/spring-factories`.

#### 9.6.2. Detect a Bean That Depends On Database Initialization

Spring Boot will automatically detect beans of the following types that depend upon database initialization:

* `AbstractEntityManagerFactoryBean` (unless `spring.jpa.defer-datasource-initialization` is set to `true`)
* `DSLContext` (jOOQ)
* `EntityManagerFactory` (unless `spring.jpa.defer-datasource-initialization` is set to `true`)
* `JdbcOperations`
* `NamedParameterJdbcOperations`

If you are using a third-party starter data access library, it may provide a detector such that beans of other types are also detected automatically. To have other beans be detected, register an implementation of `DependsOnDatabaseInitializationDetector` in `META-INF/spring-factories`. Alternatively, annotate the bean’s class or its `@Bean` method with `@DependsOnDatabaseInitialization`.

10. Messaging
--------------

Spring Boot offers a number of starters to support messaging. This section answers questions that arise from using messaging with Spring Boot.

### 10.1. Disable Transacted JMS Session

If your JMS broker does not support transacted sessions, you have to disable the support of transactions altogether. If you create your own `JmsListenerContainerFactory`, there is nothing to do, since, by default, it cannot be transacted. If you want to use the `DefaultJmsListenerContainerFactoryConfigurer` to reuse Spring Boot’s default, you can disable transacted sessions, as follows:

Java

```
import javax.jms.ConnectionFactory;

import org.springframework.boot.autoconfigure.jms.DefaultJmsListenerContainerFactoryConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;

@Configuration(proxyBeanMethods = false)
public class MyJmsConfiguration {

    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory(ConnectionFactory connectionFactory,
            DefaultJmsListenerContainerFactoryConfigurer configurer) {
        DefaultJmsListenerContainerFactory listenerFactory = new DefaultJmsListenerContainerFactory();
        configurer.configure(listenerFactory, connectionFactory);
        listenerFactory.setTransactionManager(null);
        listenerFactory.setSessionTransacted(false);
        return listenerFactory;
    }

}
```

Kotlin

```
import org.springframework.boot.autoconfigure.jms.DefaultJmsListenerContainerFactoryConfigurer
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.jms.config.DefaultJmsListenerContainerFactory
import javax.jms.ConnectionFactory

@Configuration(proxyBeanMethods = false)
class MyJmsConfiguration {

    @Bean
    fun jmsListenerContainerFactory(connectionFactory: ConnectionFactory?,
            configurer: DefaultJmsListenerContainerFactoryConfigurer): DefaultJmsListenerContainerFactory {
        val listenerFactory = DefaultJmsListenerContainerFactory()
        configurer.configure(listenerFactory, connectionFactory)
        listenerFactory.setTransactionManager(null)
        listenerFactory.setSessionTransacted(false)
        return listenerFactory
    }

}
```

The preceding example overrides the default factory, and it should be applied to any other factory that your application defines, if any.

11. Batch Applications
-----------------------

A number of questions often arise when people use Spring Batch from within a Spring Boot application. This section addresses those questions.
### 11.1. Specifying a Batch Data Source By default, batch applications require a `DataSource` to store job details. Spring Batch expects a single `DataSource` by default. To have it use a `DataSource` other than the application’s main `DataSource`, declare a `DataSource` bean, annotating its `@Bean` method with `@BatchDataSource`. If you do so and want two data sources, remember to mark the other one `@Primary`. To take greater control, implement `BatchConfigurer`. See [The Javadoc of `@EnableBatchProcessing`](https://docs.spring.io/spring-batch/docs/4.3.6/api/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html) for more details. For more info about Spring Batch, see the [Spring Batch project page](https://spring.io/projects/spring-batch). ### 11.2. Running Spring Batch Jobs on Startup Spring Batch auto-configuration is enabled by adding `@EnableBatchProcessing` to one of your `@Configuration` classes. By default, it executes **all** `Jobs` in the application context on startup (see [`JobLauncherApplicationRunner`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/batch/JobLauncherApplicationRunner.java) for details). You can narrow down to a specific job or jobs by specifying `spring.batch.job.names` (which takes a comma-separated list of job name patterns). See [BatchAutoConfiguration](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/batch/BatchAutoConfiguration.java) and [@EnableBatchProcessing](https://docs.spring.io/spring-batch/docs/4.3.6/api/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html) for more details. ### 11.3. Running from the Command Line Spring Boot converts any command line argument starting with `--` to a property to add to the `Environment`, see [accessing command line properties](features#features.external-config.command-line-args). This should not be used to pass arguments to batch jobs. To specify batch arguments on the command line, use the regular format (that is without `--`), as shown in the following example: ``` $ java -jar myapp.jar someParameter=someValue anotherParameter=anotherValue ``` If you specify a property of the `Environment` on the command line, it is ignored by the job. Consider the following command: ``` $ java -jar myapp.jar --server.port=7070 someParameter=someValue ``` This provides only one argument to the batch job: `someParameter=someValue`. ### 11.4. Storing the Job Repository Spring Batch requires a data store for the `Job` repository. If you use Spring Boot, you must use an actual database. Note that it can be an in-memory database, see [Configuring a Job Repository](https://docs.spring.io/spring-batch/docs/4.3.6/reference/html/job.html#configuringJobRepository). 12. Actuator ------------- Spring Boot includes the Spring Boot Actuator. This section answers questions that often arise from its use. ### 12.1. Change the HTTP Port or Address of the Actuator Endpoints In a standalone application, the Actuator HTTP port defaults to the same as the main HTTP port. To make the application listen on a different port, set the external property: `management.server.port`. 
To listen on a completely different network address (such as when you have an internal network for management and an external one for user applications), you can also set `management.server.address` to a valid IP address to which the server is able to bind. For more detail, see the [`ManagementServerProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-actuator-autoconfigure/src/main/java/org/springframework/boot/actuate/autoconfigure/web/server/ManagementServerProperties.java) source code and “[actuator.html](actuator#actuator.monitoring.customizing-management-server-port)” in the “Production-ready features” section. ### 12.2. Customize the ‘whitelabel’ Error Page Spring Boot installs a ‘whitelabel’ error page that you see in a browser client if you encounter a server error (machine clients consuming JSON and other media types should see a sensible response with the right error code). | | | | --- | --- | | | Set `server.error.whitelabel.enabled=false` to switch the default error page off. Doing so restores the default of the servlet container that you are using. Note that Spring Boot still tries to resolve the error view, so you should probably add your own error page rather than disabling it completely. | Overriding the error page with your own depends on the templating technology that you use. For example, if you use Thymeleaf, you can add an `error.html` template. If you use FreeMarker, you can add an `error.ftlh` template. In general, you need a `View` that resolves with a name of `error` or a `@Controller` that handles the `/error` path. Unless you replaced some of the default configuration, you should find a `BeanNameViewResolver` in your `ApplicationContext`, so a `@Bean` named `error` would be one way of doing that. See [`ErrorMvcAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/web/servlet/error/ErrorMvcAutoConfiguration.java) for more options. See also the section on “[Error Handling](web#web.servlet.spring-mvc.error-handling)” for details of how to register handlers in the servlet container. ### 12.3. Sanitize Sensitive Values Information returned by the `env` and `configprops` endpoints can be somewhat sensitive so keys matching certain patterns are sanitized by default (that is their values are replaced by `******`). Spring Boot uses sensible defaults for such keys: any key ending with the word "password", "secret", "key", "token", "vcap\_services", "sun.java.command" is entirely sanitized. Additionally, any key that holds the word `credentials` (configured as a regular expression, that is `.*credentials.*`) as part of the key is also entirely sanitized. Furthermore, Spring Boot sanitizes the sensitive portion of URI-like values for keys with one of the following endings: * `address` * `addresses` * `uri` * `uris` * `url` * `urls` The sensitive portion of the URI is identified using the format `<scheme>://<username>:<password>@<host>:<port>/`. For example, for the property `myclient.uri=http://user1:password1@localhost:8081`, the resulting sanitized value is `http://user1:******@localhost:8081`. #### 12.3.1. Customizing Sanitization Sanitization can be customized in two different ways. The default patterns used by the `env` and `configprops` endpoints can be replaced using `management.endpoint.env.keys-to-sanitize` and `management.endpoint.configprops.keys-to-sanitize` respectively. 
Alternatively, additional patterns can be configured using `management.endpoint.env.additional-keys-to-sanitize` and `management.endpoint.configprops.additional-keys-to-sanitize`. To take more control over the sanitization, define a `SanitizingFunction` bean. The `SanitizableData` with which the function is called provides access to the key and value as well as the `PropertySource` from which they came. This allows you to, for example, sanitize every value that comes from a particular property source. Each `SanitizingFunction` is called in order until a function changes the value of the sanitizable data. If no function changes its value, the built-in key-based sanitization is performed. ### 12.4. Map Health Indicators to Micrometer Metrics Spring Boot health indicators return a `Status` type to indicate the overall system health. If you want to monitor or alert on levels of health for a particular application, you can export these statuses as metrics with Micrometer. By default, the status codes “UP”, “DOWN”, “OUT\_OF\_SERVICE” and “UNKNOWN” are used by Spring Boot. To export these, you will need to convert these states to some set of numbers so that they can be used with a Micrometer `Gauge`. The following example shows one way to write such an exporter: Java ``` import io.micrometer.core.instrument.Gauge; import io.micrometer.core.instrument.MeterRegistry; import org.springframework.boot.actuate.health.HealthEndpoint; import org.springframework.boot.actuate.health.Status; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyHealthMetricsExportConfiguration { public MyHealthMetricsExportConfiguration(MeterRegistry registry, HealthEndpoint healthEndpoint) { // This example presumes common tags (such as the app) are applied elsewhere Gauge.builder("health", healthEndpoint, this::getStatusCode).strongReference(true).register(registry); } private int getStatusCode(HealthEndpoint health) { Status status = health.health().getStatus(); if (Status.UP.equals(status)) { return 3; } if (Status.OUT\_OF\_SERVICE.equals(status)) { return 2; } if (Status.DOWN.equals(status)) { return 1; } return 0; } } ``` Kotlin ``` import io.micrometer.core.instrument.Gauge import io.micrometer.core.instrument.MeterRegistry import org.springframework.boot.actuate.health.HealthEndpoint import org.springframework.boot.actuate.health.Status import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyHealthMetricsExportConfiguration(registry: MeterRegistry, healthEndpoint: HealthEndpoint) { init { // This example presumes common tags (such as the app) are applied elsewhere Gauge.builder("health", healthEndpoint) { health -> getStatusCode(health).toDouble() }.strongReference(true).register(registry) } private fun getStatusCode(health: HealthEndpoint): Int { val status = health.health().status if (Status.UP == status) { return 3 } if (Status.OUT\_OF\_SERVICE == status) { return 2 } if (Status.DOWN == status) { return 1 } return 0 } } ``` 13. Security ------------- This section addresses questions about security when working with Spring Boot, including questions that arise from using Spring Security with Spring Boot. For more about Spring Security, see the [Spring Security project page](https://spring.io/projects/spring-security). ### 13.1. 
Switch off the Spring Boot Security Configuration If you define a `@Configuration` with a `WebSecurityConfigurerAdapter` or a `SecurityFilterChain` bean in your application, it switches off the default webapp security settings in Spring Boot. ### 13.2. Change the UserDetailsService and Add User Accounts If you provide a `@Bean` of type `AuthenticationManager`, `AuthenticationProvider`, or `UserDetailsService`, the default `@Bean` for `InMemoryUserDetailsManager` is not created. This means you have the full feature set of Spring Security available (such as [various authentication options](https://docs.spring.io/spring-security/reference/5.7.1/servlet/authentication/index.html)). The easiest way to add user accounts is to provide your own `UserDetailsService` bean. ### 13.3. Enable HTTPS When Running behind a Proxy Server Ensuring that all your main endpoints are only available over HTTPS is an important chore for any application. If you use Tomcat as a servlet container, then Spring Boot adds Tomcat’s own `RemoteIpValve` automatically if it detects some environment settings, and you should be able to rely on the `HttpServletRequest` to report whether it is secure or not (even downstream of a proxy server that handles the real SSL termination). The standard behavior is determined by the presence or absence of certain request headers (`x-forwarded-for` and `x-forwarded-proto`), whose names are conventional, so it should work with most front-end proxies. You can switch on the valve by adding some entries to `application.properties`, as shown in the following example: Properties ``` server.tomcat.remoteip.remote-ip-header=x-forwarded-for server.tomcat.remoteip.protocol-header=x-forwarded-proto ``` Yaml ``` server: tomcat: remoteip: remote-ip-header: "x-forwarded-for" protocol-header: "x-forwarded-proto" ``` (The presence of either of those properties switches on the valve. Alternatively, you can add the `RemoteIpValve` by customizing the `TomcatServletWebServerFactory` using a `WebServerFactoryCustomizer` bean.) To configure Spring Security to require a secure channel for all (or some) requests, consider adding your own `SecurityFilterChain` bean that adds the following `HttpSecurity` configuration: Java ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Configuration public class MySecurityConfig { @Bean public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception { // Customize the application security ... http.requiresChannel().anyRequest().requiresSecure(); return http.build(); } } ``` Kotlin ``` import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.security.config.annotation.web.builders.HttpSecurity import org.springframework.security.web.SecurityFilterChain @Configuration class MySecurityConfig { @Bean fun securityFilterChain(http: HttpSecurity): SecurityFilterChain { // Customize the application security ... http.requiresChannel().anyRequest().requiresSecure() return http.build() } } ``` 14. Hot Swapping ----------------- Spring Boot supports hot swapping. This section answers questions about how it works. ### 14.1. Reload Static Content There are several options for hot reloading. 
The recommended approach is to use [`spring-boot-devtools`](using#using.devtools), as it provides additional development-time features, such as support for fast application restarts and LiveReload as well as sensible development-time configuration (such as template caching). Devtools works by monitoring the classpath for changes. This means that static resource changes must be "built" for the change to take effect. By default, this happens automatically in Eclipse when you save your changes. In IntelliJ IDEA, the Make Project command triggers the necessary build. Due to the [default restart exclusions](using#using.devtools.restart.excluding-resources), changes to static resources do not trigger a restart of your application. They do, however, trigger a live reload. Alternatively, running in an IDE (especially with debugging on) is a good way to do development (all modern IDEs allow reloading of static resources and usually also allow hot-swapping of Java class changes). Finally, the [Maven and Gradle plugins](build-tool-plugins#build-tool-plugins) can be configured (see the `addResources` property) to support running from the command line with reloading of static files directly from source. You can use that with an external css/js compiler process if you are writing that code with higher-level tools. ### 14.2. Reload Templates without Restarting the Container Most of the templating technologies supported by Spring Boot include a configuration option to disable caching (described later in this document). If you use the `spring-boot-devtools` module, these properties are [automatically configured](using#using.devtools.property-defaults) for you at development time. #### 14.2.1. Thymeleaf Templates If you use Thymeleaf, set `spring.thymeleaf.cache` to `false`. See [`ThymeleafAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/thymeleaf/ThymeleafAutoConfiguration.java) for other Thymeleaf customization options. #### 14.2.2. FreeMarker Templates If you use FreeMarker, set `spring.freemarker.cache` to `false`. See [`FreeMarkerAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/freemarker/FreeMarkerAutoConfiguration.java) for other FreeMarker customization options. #### 14.2.3. Groovy Templates If you use Groovy templates, set `spring.groovy.template.cache` to `false`. See [`GroovyTemplateAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/groovy/template/GroovyTemplateAutoConfiguration.java) for other Groovy customization options. ### 14.3. Fast Application Restarts The `spring-boot-devtools` module includes support for automatic application restarts. While not as fast as technologies such as [JRebel](https://www.jrebel.com/products/jrebel) it is usually significantly faster than a “cold start”. You should probably give it a try before investigating some of the more complex reload options discussed later in this document. For more details, see the [using.html](using#using.devtools) section. ### 14.4. Reload Java Classes without Restarting the Container Many modern IDEs (Eclipse, IDEA, and others) support hot swapping of bytecode. 
Consequently, if you make a change that does not affect class or method signatures, it should reload cleanly with no side effects. 15. Testing ------------ Spring Boot includes a number of testing utilities and support classes as well as a dedicated starter that provides common test dependencies. This section answers common questions about testing. ### 15.1. Testing With Spring Security Spring Security provides support for running tests as a specific user. For example, the test in the snippet below will run with an authenticated user that has the `ADMIN` role. Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest; import org.springframework.security.test.context.support.WithMockUser; import org.springframework.test.web.servlet.MockMvc; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; @WebMvcTest(UserController.class) class MySecurityTests { @Autowired private MockMvc mvc; @Test @WithMockUser(roles = "ADMIN") void requestProtectedUrlWithUser() throws Exception { this.mvc.perform(get("/")); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest import org.springframework.security.test.context.support.WithMockUser import org.springframework.test.web.servlet.MockMvc import org.springframework.test.web.servlet.request.MockMvcRequestBuilders @WebMvcTest(UserController::class) class MySecurityTests(@Autowired val mvc: MockMvc) { @Test @WithMockUser(roles = ["ADMIN"]) fun requestProtectedUrlWithUser() { mvc.perform(MockMvcRequestBuilders.get("/")) } } ``` Spring Security provides comprehensive integration with Spring MVC Test and this can also be used when testing controllers using the `@WebMvcTest` slice and `MockMvc`. For additional details on Spring Security’s testing support, see Spring Security’s [reference documentation](https://docs.spring.io/spring-security/reference/5.7.1/servlet/test/index.html). ### 15.2. Use Testcontainers for Integration Testing The [Testcontainers](https://www.testcontainers.org/) library provides a way to manage services running inside Docker containers. It integrates with JUnit, allowing you to write a test class that can start up a container before any of the tests run. Testcontainers is especially useful for writing integration tests that talk to a real backend service such as MySQL, MongoDB, Cassandra and others. Testcontainers can be used in a Spring Boot test as follows: Java ``` import org.junit.jupiter.api.Test; import org.testcontainers.containers.Neo4jContainer; import org.testcontainers.junit.jupiter.Container; import org.testcontainers.junit.jupiter.Testcontainers; import org.springframework.boot.test.context.SpringBootTest; @SpringBootTest @Testcontainers class MyIntegrationTests { @Container static Neo4jContainer<?> neo4j = new Neo4jContainer<>("neo4j:4.2"); @Test void myTest() { // ... } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.boot.test.context.SpringBootTest import org.testcontainers.containers.Neo4jContainer import org.testcontainers.junit.jupiter.Container import org.testcontainers.junit.jupiter.Testcontainers @SpringBootTest @Testcontainers internal class MyIntegrationTests { @Test fun myTest() { // ... 
} companion object { @Container var neo4j: Neo4jContainer<*> = Neo4jContainer<Nothing>("neo4j:4.2") } } ``` This will start up a Docker container running Neo4j (if Docker is running locally) before any of the tests are run. In most cases, you will need to configure the application using details from the running container, such as container IP or port. This can be done with a static `@DynamicPropertySource` method that allows adding dynamic property values to the Spring Environment. Java ``` import org.junit.jupiter.api.Test; import org.testcontainers.containers.Neo4jContainer; import org.testcontainers.junit.jupiter.Container; import org.testcontainers.junit.jupiter.Testcontainers; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.test.context.DynamicPropertyRegistry; import org.springframework.test.context.DynamicPropertySource; @SpringBootTest @Testcontainers class MyIntegrationTests { @Container static Neo4jContainer<?> neo4j = new Neo4jContainer<>("neo4j:4.2"); @Test void myTest() { // ... } @DynamicPropertySource static void neo4jProperties(DynamicPropertyRegistry registry) { registry.add("spring.neo4j.uri", neo4j::getBoltUrl); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.boot.test.context.SpringBootTest import org.springframework.test.context.DynamicPropertyRegistry import org.springframework.test.context.DynamicPropertySource import org.testcontainers.containers.Neo4jContainer import org.testcontainers.junit.jupiter.Container import org.testcontainers.junit.jupiter.Testcontainers @SpringBootTest @Testcontainers internal class MyIntegrationTests { @Test fun myTest() { // ... } companion object { @Container var neo4j: Neo4jContainer<*> = Neo4jContainer<Nothing>("neo4j:4.2") @DynamicPropertySource fun neo4jProperties(registry: DynamicPropertyRegistry) { registry.add("spring.neo4j.uri") { neo4j.boltUrl } } } } ``` The above configuration allows Neo4j-related beans in the application to communicate with Neo4j running inside the Testcontainers-managed Docker container. ### 15.3. Structure `@Configuration` classes for inclusion in slice tests Slice tests work by restricting Spring Framework’s component scanning to a limited set of components based on their type. For any beans that are not created via component scanning, for example, beans that are created using the `@Bean` annotation, slice tests will not be able to include/exclude them from the application context.
Consider this example: ``` import org.apache.commons.dbcp2.BasicDataSource; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.jdbc.DataSourceBuilder; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Configuration(proxyBeanMethods = false) public class MyConfiguration { @Bean public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception { http.authorizeRequests().anyRequest().authenticated(); return http.build(); } @Bean @ConfigurationProperties("app.datasource.second") public BasicDataSource secondDataSource() { return DataSourceBuilder.create().type(BasicDataSource.class).build(); } } ``` For a `@WebMvcTest` for an application with the above `@Configuration` class, you might expect to have the `SecurityFilterChain` bean in the application context so that you can test if your controller endpoints are secured properly. However, `MyConfiguration` is not picked up by @WebMvcTest’s component scanning filter because it doesn’t match any of the types specified by the filter. You can include the configuration explicitly by annotating the test class with `@Import(MyConfiguration.class)`. This will load all the beans in `MyConfiguration` including the `BasicDataSource` bean which isn’t required when testing the web tier. Splitting the configuration class into two will enable importing just the security configuration. ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.builders.HttpSecurity; import org.springframework.security.web.SecurityFilterChain; @Configuration(proxyBeanMethods = false) public class MySecurityConfiguration { @Bean public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception { http.authorizeRequests().anyRequest().authenticated(); return http.build(); } } ``` ``` import org.apache.commons.dbcp2.BasicDataSource; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.jdbc.DataSourceBuilder; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyDatasourceConfiguration { @Bean @ConfigurationProperties("app.datasource.second") public BasicDataSource secondDataSource() { return DataSourceBuilder.create().type(BasicDataSource.class).build(); } } ``` Having a single configuration class can be inefficient when beans of a certain domain need to be included in slice tests. Instead, structuring the application’s configuration as multiple granular classes with beans for a specific domain can enable importing them only for specific slice tests. 16. Build ---------- Spring Boot includes build plugins for Maven and Gradle. This section answers common questions about these plugins. ### 16.1. Generate Build Information Both the Maven plugin and the Gradle plugin allow generating build information containing the coordinates, name, and version of the project. The plugins can also be configured to add additional properties through configuration. When such a file is present, Spring Boot auto-configures a `BuildProperties` bean. 
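If your code needs these details at runtime, the auto-configured `BuildProperties` bean can be injected like any other bean. The following is a minimal sketch of one way to consume it; the component and method names are illustrative, only `BuildProperties` itself comes from Spring Boot:

```java
import org.springframework.boot.info.BuildProperties;
import org.springframework.stereotype.Component;

// Hypothetical component that exposes the generated build information.
@Component
public class MyBuildInfoDescriber {

    private final BuildProperties buildProperties;

    public MyBuildInfoDescriber(BuildProperties buildProperties) {
        this.buildProperties = buildProperties;
    }

    public String describeBuild() {
        // BuildProperties exposes values such as the group, artifact, name, version, and build time.
        return this.buildProperties.getName() + " " + this.buildProperties.getVersion();
    }

}
```

Note that the bean is only available when the generated build information file is present, which the Maven and Gradle configuration shown next produces.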
To generate build information with Maven, add an execution for the `build-info` goal, as shown in the following example: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <version>2.7.0</version> <executions> <execution> <goals> <goal>build-info</goal> </goals> </execution> </executions> </plugin> </plugins> </build> ``` | | | | --- | --- | | | See the [Spring Boot Maven Plugin documentation](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#goals-build-info) for more details. | The following example does the same with Gradle: ``` springBoot { buildInfo() } ``` | | | | --- | --- | | | See the [Spring Boot Gradle Plugin documentation](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/#integrating-with-actuator-build-info) for more details. | ### 16.2. Generate Git Information Both Maven and Gradle allow generating a `git.properties` file containing information about the state of your `git` source code repository when the project was built. For Maven users, the `spring-boot-starter-parent` POM includes a pre-configured plugin to generate a `git.properties` file. To use it, add the following declaration for the [`Git Commit Id Plugin`](https://github.com/git-commit-id/git-commit-id-maven-plugin) to your POM: ``` <build> <plugins> <plugin> <groupId>pl.project13.maven</groupId> <artifactId>git-commit-id-plugin</artifactId> </plugin> </plugins> </build> ``` Gradle users can achieve the same result by using the [`gradle-git-properties`](https://plugins.gradle.org/plugin/com.gorylenko.gradle-git-properties) plugin, as shown in the following example: ``` plugins { id "com.gorylenko.gradle-git-properties" version "2.3.2" } ``` Both the Maven and Gradle plugins allow the properties that are included in `git.properties` to be configured. | | | | --- | --- | | | The commit time in `git.properties` is expected to match the following format: `yyyy-MM-dd’T’HH:mm:ssZ`. This is the default format for both plugins listed above. Using this format lets the time be parsed into a `Date` and its format, when serialized to JSON, to be controlled by Jackson’s date serialization configuration settings. | ### 16.3. Customize Dependency Versions The `spring-boot-dependencies` POM manages the versions of common dependencies. The Spring Boot plugins for Maven and Gradle allow these managed dependency versions to be customized using build properties. | | | | --- | --- | | | Each Spring Boot release is designed and tested against this specific set of third-party dependencies. Overriding versions may cause compatibility issues. | To override dependency versions with Maven, see [this section](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#using) of the Maven plugin’s documentation. To override dependency versions in Gradle, see [this section](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/#managing-dependencies-dependency-management-plugin-customizing) of the Gradle plugin’s documentation. ### 16.4. Create an Executable JAR with Maven The `spring-boot-maven-plugin` can be used to create an executable “fat” JAR. 
If you use the `spring-boot-starter-parent` POM, you can declare the plugin and your jars are repackaged as follows: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> </plugin> </plugins> </build> ``` If you do not use the parent POM, you can still use the plugin. However, you must additionally add an `<executions>` section, as follows: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <version>{spring-boot-version}</version> <executions> <execution> <goals> <goal>repackage</goal> </goals> </execution> </executions> </plugin> </plugins> </build> ``` See the [plugin documentation](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#repackage) for full usage details. ### 16.5. Use a Spring Boot Application as a Dependency Like a war file, a Spring Boot application is not intended to be used as a dependency. If your application contains classes that you want to share with other projects, the recommended approach is to move that code into a separate module. The separate module can then be depended upon by your application and other projects. If you cannot rearrange your code as recommended above, Spring Boot’s Maven and Gradle plugins must be configured to produce a separate artifact that is suitable for use as a dependency. The executable archive cannot be used as a dependency as the [executable jar format](executable-jar#appendix.executable-jar.nested-jars.jar-structure) packages application classes in `BOOT-INF/classes`. This means that they cannot be found when the executable jar is used as a dependency. To produce the two artifacts, one that can be used as a dependency and one that is executable, a classifier must be specified. This classifier is applied to the name of the executable archive, leaving the default archive for use as a dependency. To configure a classifier of `exec` in Maven, you can use the following configuration: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <configuration> <classifier>exec</classifier> </configuration> </plugin> </plugins> </build> ``` ### 16.6. Extract Specific Libraries When an Executable Jar Runs Most nested libraries in an executable jar do not need to be unpacked in order to run. However, certain libraries can have problems. For example, JRuby includes its own nested jar support, which assumes that the `jruby-complete.jar` is always directly available as a file in its own right. To deal with any problematic libraries, you can flag that specific nested jars should be automatically unpacked when the executable jar first runs. Such nested jars are written beneath the temporary directory identified by the `java.io.tmpdir` system property. | | | | --- | --- | | | Care should be taken to ensure that your operating system is configured so that it will not delete the jars that have been unpacked to the temporary directory while the application is still running. 
| For example, to indicate that JRuby should be flagged for unpacking by using the Maven Plugin, you would add the following configuration: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <configuration> <requiresUnpack> <dependency> <groupId>org.jruby</groupId> <artifactId>jruby-complete</artifactId> </dependency> </requiresUnpack> </configuration> </plugin> </plugins> </build> ``` ### 16.7. Create a Non-executable JAR with Exclusions Often, if you have an executable and a non-executable jar as two separate build products, the executable version has additional configuration files that are not needed in a library jar. For example, the `application.yml` configuration file might be excluded from the non-executable JAR. In Maven, the executable jar must be the main artifact and you can add a classified jar for the library, as follows: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> </plugin> <plugin> <artifactId>maven-jar-plugin</artifactId> <executions> <execution> <id>lib</id> <phase>package</phase> <goals> <goal>jar</goal> </goals> <configuration> <classifier>lib</classifier> <excludes> <exclude>application.yml</exclude> </excludes> </configuration> </execution> </executions> </plugin> </plugins> </build> ``` ### 16.8. Remote Debug a Spring Boot Application Started with Maven To attach a remote debugger to a Spring Boot application that was started with Maven, you can use the `jvmArguments` property of the [maven plugin](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/). See [this example](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/#run-example-debug) for more details. ### 16.9. Build an Executable Archive from Ant without Using spring-boot-antlib To build with Ant, you need to grab dependencies, compile, and then create a jar or war archive. To make it executable, you can either use the `spring-boot-antlib` module or you can follow these instructions: 1. If you are building a jar, package the application’s classes and resources in a nested `BOOT-INF/classes` directory. If you are building a war, package the application’s classes in a nested `WEB-INF/classes` directory as usual. 2. Add the runtime dependencies in a nested `BOOT-INF/lib` directory for a jar or `WEB-INF/lib` for a war. Remember **not** to compress the entries in the archive. 3. Add the `provided` (embedded container) dependencies in a nested `BOOT-INF/lib` directory for a jar or `WEB-INF/lib-provided` for a war. Remember **not** to compress the entries in the archive. 4. Add the `spring-boot-loader` classes at the root of the archive (so that the `Main-Class` is available). 5. Use the appropriate launcher (such as `JarLauncher` for a jar file) as a `Main-Class` attribute in the manifest and specify the other properties it needs as manifest entries — principally, by setting a `Start-Class` property. 
The following example shows how to build an executable archive with Ant: ``` <target name="build" depends="compile"> <jar destfile="target/${ant.project.name}-${spring-boot.version}.jar" compress="false"> <mappedresources> <fileset dir="target/classes" /> <globmapper from="*" to="BOOT-INF/classes/*"/> </mappedresources> <mappedresources> <fileset dir="src/main/resources" erroronmissingdir="false"/> <globmapper from="*" to="BOOT-INF/classes/*"/> </mappedresources> <mappedresources> <fileset dir="${lib.dir}/runtime" /> <globmapper from="*" to="BOOT-INF/lib/*"/> </mappedresources> <zipfileset src="${lib.dir}/loader/spring-boot-loader-jar-${spring-boot.version}.jar" /> <manifest> <attribute name="Main-Class" value="org.springframework.boot.loader.JarLauncher" /> <attribute name="Start-Class" value="${start-class}" /> </manifest> </jar> </target> ``` 17. Traditional Deployment --------------------------- Spring Boot supports traditional deployment as well as more modern forms of deployment. This section answers common questions about traditional deployment. ### 17.1. Create a Deployable War File | | | | --- | --- | | | Because Spring WebFlux does not strictly depend on the servlet API and applications are deployed by default on an embedded Reactor Netty server, War deployment is not supported for WebFlux applications. | The first step in producing a deployable war file is to provide a `SpringBootServletInitializer` subclass and override its `configure` method. Doing so makes use of Spring Framework’s servlet 3.0 support and lets you configure your application when it is launched by the servlet container. Typically, you should update your application’s main class to extend `SpringBootServletInitializer`, as shown in the following example: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.builder.SpringApplicationBuilder; import org.springframework.boot.web.servlet.support.SpringBootServletInitializer; @SpringBootApplication public class MyApplication extends SpringBootServletInitializer { @Override protected SpringApplicationBuilder configure(SpringApplicationBuilder application) { return application.sources(MyApplication.class); } public static void main(String[] args) { SpringApplication.run(MyApplication.class, args); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.builder.SpringApplicationBuilder import org.springframework.boot.runApplication import org.springframework.boot.web.servlet.support.SpringBootServletInitializer @SpringBootApplication class MyApplication : SpringBootServletInitializer() { override fun configure(application: SpringApplicationBuilder): SpringApplicationBuilder { return application.sources(MyApplication::class.java) } } fun main(args: Array<String>) { runApplication<MyApplication>(*args) } ``` The next step is to update your build configuration such that your project produces a war file rather than a jar file.
If you use Maven and `spring-boot-starter-parent` (which configures Maven’s war plugin for you), all you need to do is to modify `pom.xml` to change the packaging to war, as follows: ``` <packaging>war</packaging> ``` If you use Gradle, you need to modify `build.gradle` to apply the war plugin to the project, as follows: ``` apply plugin: 'war' ``` The final step in the process is to ensure that the embedded servlet container does not interfere with the servlet container to which the war file is deployed. To do so, you need to mark the embedded servlet container dependency as being provided. If you use Maven, the following example marks the servlet container (Tomcat, in this case) as being provided: ``` <dependencies> <!-- ... --> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-tomcat</artifactId> <scope>provided</scope> </dependency> <!-- ... --> </dependencies> ``` If you use Gradle, the following example marks the servlet container (Tomcat, in this case) as being provided: ``` dependencies { // ... providedRuntime 'org.springframework.boot:spring-boot-starter-tomcat' // ... } ``` | | | | --- | --- | | | `providedRuntime` is preferred to Gradle’s `compileOnly` configuration. Among other limitations, `compileOnly` dependencies are not on the test classpath, so any web-based integration tests fail. | If you use the [Spring Boot build tools](build-tool-plugins#build-tool-plugins), marking the embedded servlet container dependency as provided produces an executable war file with the provided dependencies packaged in a `lib-provided` directory. This means that, in addition to being deployable to a servlet container, you can also run your application by using `java -jar` on the command line. ### 17.2. Convert an Existing Application to Spring Boot To convert an existing non-web Spring application to a Spring Boot application, replace the code that creates your `ApplicationContext` with calls to `SpringApplication` or `SpringApplicationBuilder`. Spring MVC web applications are generally amenable to first creating a deployable war application and then migrating it later to an executable war or jar. See the [Getting Started Guide on Converting a jar to a war](https://spring.io/guides/gs/convert-jar-to-war/). To create a deployable war by extending `SpringBootServletInitializer` (for example, in a class called `Application`) and adding the Spring Boot `@SpringBootApplication` annotation, use code similar to that shown in the following example: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.builder.SpringApplicationBuilder; import org.springframework.boot.web.servlet.support.SpringBootServletInitializer; @SpringBootApplication public class MyApplication extends SpringBootServletInitializer { @Override protected SpringApplicationBuilder configure(SpringApplicationBuilder application) { // Customize the application or call application.sources(...) to add sources // Since our example is itself a @Configuration class (via @SpringBootApplication) // we actually do not need to override this method.
return application; } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.builder.SpringApplicationBuilder import org.springframework.boot.runApplication import org.springframework.boot.web.servlet.support.SpringBootServletInitializer @SpringBootApplication class MyApplication : SpringBootServletInitializer() { override fun configure(application: SpringApplicationBuilder): SpringApplicationBuilder { // Customize the application or call application.sources(...) to add sources // Since our example is itself a @Configuration class (via @SpringBootApplication) // we actually do not need to override this method. return application } } ``` Remember that whatever you put in the `sources` is merely a Spring `ApplicationContext`. Normally, anything that already works should work here. There might be some beans you can remove later and let Spring Boot provide its own defaults for them, but it should be possible to get something working before you need to do that. Static resources can be moved to `/public` (or `/static` or `/resources` or `/META-INF/resources`) in the classpath root. The same applies to `messages.properties` (which Spring Boot automatically detects in the root of the classpath). Vanilla usage of Spring `DispatcherServlet` and Spring Security should require no further changes. If you have other features in your application (for instance, using other servlets or filters), you may need to add some configuration to your `Application` context, by replacing those elements from the `web.xml`, as follows: * A `@Bean` of type `Servlet` or `ServletRegistrationBean` installs that bean in the container as if it were a `<servlet/>` and `<servlet-mapping/>` in `web.xml` (see the sketch after this list). * A `@Bean` of type `Filter` or `FilterRegistrationBean` behaves similarly (as a `<filter/>` and `<filter-mapping/>`). * An `ApplicationContext` in an XML file can be added through an `@ImportResource` in your `Application`. Alternatively, cases where annotation configuration is heavily used already can be recreated in a few lines as `@Bean` definitions.
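As a rough sketch of the first two cases, the following configuration registers a servlet and a filter programmatically. `MyServlet`, `MyFilter`, and the URL patterns are hypothetical stand-ins for whatever was previously declared in `web.xml`:

```java
import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServlet;

import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.boot.web.servlet.ServletRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical servlet and filter standing in for the ones declared in web.xml.
class MyServlet extends HttpServlet {
}

class MyFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        chain.doFilter(request, response);
    }

}

@Configuration(proxyBeanMethods = false)
public class MyWebXmlReplacementConfiguration {

    @Bean
    public ServletRegistrationBean<MyServlet> myServlet() {
        // Equivalent to a <servlet/> and <servlet-mapping/> pair in web.xml
        return new ServletRegistrationBean<>(new MyServlet(), "/my-servlet/*");
    }

    @Bean
    public FilterRegistrationBean<MyFilter> myFilter() {
        // Equivalent to a <filter/> and <filter-mapping/> pair in web.xml
        FilterRegistrationBean<MyFilter> registration = new FilterRegistrationBean<>(new MyFilter());
        registration.addUrlPatterns("/my-servlet/*");
        return registration;
    }

}
```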
Once the war file is working, you can make it executable by adding a `main` method to your `Application`, as shown in the following example: Java ``` public static void main(String[] args) { SpringApplication.run(MyApplication.class, args); } ``` Kotlin ``` fun main(args: Array<String>) { runApplication<MyApplication>(*args) } ``` | | | | --- | --- | | | If you intend to start your application as a war or as an executable application, you need to share the customizations of the builder in a method that is both available to the `SpringBootServletInitializer` callback and in the `main` method in a class similar to the following: Java ``` import org.springframework.boot.Banner; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.builder.SpringApplicationBuilder; import org.springframework.boot.web.servlet.support.SpringBootServletInitializer; @SpringBootApplication public class MyApplication extends SpringBootServletInitializer { @Override protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) { return customizerBuilder(builder); } public static void main(String[] args) { customizerBuilder(new SpringApplicationBuilder()).run(args); } private static SpringApplicationBuilder customizerBuilder(SpringApplicationBuilder builder) { return builder.sources(MyApplication.class).bannerMode(Banner.Mode.OFF); } } ``` Kotlin ``` import org.springframework.boot.Banner import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.builder.SpringApplicationBuilder import org.springframework.boot.web.servlet.support.SpringBootServletInitializer @SpringBootApplication class MyApplication : SpringBootServletInitializer() { override fun configure(builder: SpringApplicationBuilder): SpringApplicationBuilder { return customizerBuilder(builder) } companion object { @JvmStatic fun main(args: Array<String>) { customizerBuilder(SpringApplicationBuilder()).run(*args) } private fun customizerBuilder(builder: SpringApplicationBuilder): SpringApplicationBuilder { return builder.sources(MyApplication::class.java).bannerMode(Banner.Mode.OFF) } } } ``` | Applications can fall into more than one category: * Servlet 3.0+ applications with no `web.xml`. * Applications with a `web.xml`. * Applications with a context hierarchy. * Applications without a context hierarchy. All of these should be amenable to translation, but each might require slightly different techniques. Servlet 3.0+ applications might translate pretty easily if they already use the Spring Servlet 3.0+ initializer support classes. Normally, all the code from an existing `WebApplicationInitializer` can be moved into a `SpringBootServletInitializer`. If your existing application has more than one `ApplicationContext` (for example, if it uses `AbstractDispatcherServletInitializer`) then you might be able to combine all your context sources into a single `SpringApplication`. The main complication you might encounter is if combining does not work and you need to maintain the context hierarchy. See the [entry on building a hierarchy](#howto.application.context-hierarchy) for examples. An existing parent context that contains web-specific features usually needs to be broken up so that all the `ServletContextAware` components are in the child context. Applications that are not already Spring applications might be convertible to Spring Boot applications, and the previously mentioned guidance may help. However, you may yet encounter problems.
In that case, we suggest [asking questions on Stack Overflow with a tag of `spring-boot`](https://stackoverflow.com/questions/tagged/spring-boot). ### 17.3. Deploying a WAR to WebLogic To deploy a Spring Boot application to WebLogic, you must ensure that your servlet initializer **directly** implements `WebApplicationInitializer` (even if you extend from a base class that already implements it). A typical initializer for WebLogic should resemble the following example: Java ``` import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.web.servlet.support.SpringBootServletInitializer; import org.springframework.web.WebApplicationInitializer; @SpringBootApplication public class MyApplication extends SpringBootServletInitializer implements WebApplicationInitializer { } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.web.servlet.support.SpringBootServletInitializer import org.springframework.web.WebApplicationInitializer @SpringBootApplication class MyApplication : SpringBootServletInitializer(), WebApplicationInitializer ``` If you use Logback, you also need to tell WebLogic to prefer the packaged version rather than the version that was pre-installed with the server. You can do so by adding a `WEB-INF/weblogic.xml` file with the following contents: ``` <?xml version="1.0" encoding="UTF-8"?> <wls:weblogic-web-app xmlns:wls="http://xmlns.oracle.com/weblogic/weblogic-web-app" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee https://java.sun.com/xml/ns/javaee/ejb-jar_3_0.xsd http://xmlns.oracle.com/weblogic/weblogic-web-app https://xmlns.oracle.com/weblogic/weblogic-web-app/1.4/weblogic-web-app.xsd"> <wls:container-descriptor> <wls:prefer-application-packages> <wls:package-name>org.slf4j</wls:package-name> </wls:prefer-application-packages> </wls:container-descriptor> </wls:weblogic-web-app> ```
spring_boot Deploying Spring Boot Applications Deploying Spring Boot Applications ================================== Spring Boot’s flexible packaging options provide a great deal of choice when it comes to deploying your application. You can deploy Spring Boot applications to a variety of cloud platforms, to virtual/real machines, or make them fully executable for Unix systems. This section covers some of the more common deployment scenarios. 1. Deploying to the Cloud -------------------------- Spring Boot’s executable jars are ready-made for most popular cloud PaaS (Platform-as-a-Service) providers. These providers tend to require that you “bring your own container”. They manage application processes (not Java applications specifically), so they need an intermediary layer that adapts *your* application to the *cloud’s* notion of a running process. Two popular cloud providers, Heroku and Cloud Foundry, employ a “buildpack” approach. The buildpack wraps your deployed code in whatever is needed to *start* your application. It might be a JDK and a call to `java`, an embedded web server, or a full-fledged application server. A buildpack is pluggable, but ideally you should be able to get by with as few customizations to it as possible. This reduces the footprint of functionality that is not under your control. It minimizes divergence between development and production environments. Ideally, your application, like a Spring Boot executable jar, has everything that it needs to run packaged within it. In this section, we look at what it takes to get the [application that we developed](getting-started#getting-started.first-application) in the “Getting Started” section up and running in the Cloud. ### 1.1. Cloud Foundry Cloud Foundry provides default buildpacks that come into play if no other buildpack is specified. The Cloud Foundry [Java buildpack](https://github.com/cloudfoundry/java-buildpack) has excellent support for Spring applications, including Spring Boot. You can deploy stand-alone executable jar applications as well as traditional `.war` packaged applications. Once you have built your application (by using, for example, `mvn clean package`) and have [installed the `cf` command line tool](https://docs.cloudfoundry.org/cf-cli/install-go-cli.html), deploy your application by using the `cf push` command, substituting the path to your compiled `.jar`. Be sure to have [logged in with your `cf` command line client](https://docs.cloudfoundry.org/cf-cli/getting-started.html#login) before pushing an application. The following line shows using the `cf push` command to deploy an application: ``` $ cf push acloudyspringtime -p target/demo-0.0.1-SNAPSHOT.jar ``` | | | | --- | --- | | | In the preceding example, we substitute `acloudyspringtime` for whatever value you give `cf` as the name of your application. | See the [`cf push` documentation](https://docs.cloudfoundry.org/cf-cli/getting-started.html#push) for more options. If there is a Cloud Foundry [`manifest.yml`](https://docs.cloudfoundry.org/devguide/deploy-apps/manifest.html) file present in the same directory, it is considered. At this point, `cf` starts uploading your application, producing output similar to the following example: ``` Uploading acloudyspringtime... **OK** Preparing to start acloudyspringtime... 
**OK** -----> Downloaded app package (**8.9M**) -----> Java Buildpack Version: v3.12 (offline) | https://github.com/cloudfoundry/java-buildpack.git#6f25b7e -----> Downloading Open Jdk JRE 1.8.0_121 from https://java-buildpack.cloudfoundry.org/openjdk/trusty/x86_64/openjdk-1.8.0_121.tar.gz (found in cache) Expanding Open Jdk JRE to .java-buildpack/open_jdk_jre (1.6s) -----> Downloading Open JDK Like Memory Calculator 2.0.2_RELEASE from https://java-buildpack.cloudfoundry.org/memory-calculator/trusty/x86_64/memory-calculator-2.0.2_RELEASE.tar.gz (found in cache) Memory Settings: -Xss349K -Xmx681574K -XX:MaxMetaspaceSize=104857K -Xms681574K -XX:MetaspaceSize=104857K -----> Downloading Container Certificate Trust Store 1.0.0_RELEASE from https://java-buildpack.cloudfoundry.org/container-certificate-trust-store/container-certificate-trust-store-1.0.0_RELEASE.jar (found in cache) Adding certificates to .java-buildpack/container_certificate_trust_store/truststore.jks (0.6s) -----> Downloading Spring Auto Reconfiguration 1.10.0_RELEASE from https://java-buildpack.cloudfoundry.org/auto-reconfiguration/auto-reconfiguration-1.10.0_RELEASE.jar (found in cache) Checking status of app 'acloudyspringtime'... 0 of 1 instances running (1 starting) ... 0 of 1 instances running (1 starting) ... 0 of 1 instances running (1 starting) ... 1 of 1 instances running (1 running) App started ``` Congratulations! The application is now live! Once your application is live, you can verify the status of the deployed application by using the `cf apps` command, as shown in the following example: ``` $ cf apps Getting applications in ... OK name requested state instances memory disk urls ... acloudyspringtime started 1/1 512M 1G acloudyspringtime.cfapps.io ... ``` Once Cloud Foundry acknowledges that your application has been deployed, you should be able to find the application at the URI given. In the preceding example, you could find it at `https://acloudyspringtime.cfapps.io/`. #### 1.1.1. Binding to Services By default, metadata about the running application as well as service connection information is exposed to the application as environment variables (for example: `$VCAP_SERVICES`). This architecture decision is due to Cloud Foundry’s polyglot (any language and platform can be supported as a buildpack) nature. Process-scoped environment variables are language agnostic. Environment variables do not always make for the easiest API, so Spring Boot automatically extracts them and flattens the data into properties that can be accessed through Spring’s `Environment` abstraction, as shown in the following example: Java ``` import org.springframework.context.EnvironmentAware; import org.springframework.core.env.Environment; import org.springframework.stereotype.Component; @Component public class MyBean implements EnvironmentAware { private String instanceId; @Override public void setEnvironment(Environment environment) { this.instanceId = environment.getProperty("vcap.application.instance_id"); } // ... } ``` Kotlin ``` import org.springframework.context.EnvironmentAware import org.springframework.core.env.Environment import org.springframework.stereotype.Component @Component class MyBean : EnvironmentAware { private var instanceId: String? = null override fun setEnvironment(environment: Environment) { instanceId = environment.getProperty("vcap.application.instance_id") } // ... } ``` All Cloud Foundry properties are prefixed with `vcap`.
You can use `vcap` properties to access application information (such as the public URL of the application) and service information (such as database credentials). See the [‘CloudFoundryVcapEnvironmentPostProcessor’](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/cloud/CloudFoundryVcapEnvironmentPostProcessor.html) Javadoc for complete details. | | | | --- | --- | | | The [Java CFEnv](https://github.com/pivotal-cf/java-cfenv/) project is a better fit for tasks such as configuring a DataSource. | ### 1.2. Kubernetes Spring Boot auto-detects Kubernetes deployment environments by checking the environment for `"*_SERVICE_HOST"` and `"*_SERVICE_PORT"` variables. You can override this detection with the `spring.main.cloud-platform` configuration property. Spring Boot helps you to [manage the state of your application](features#features.spring-application.application-availability) and export it with [HTTP Kubernetes Probes using Actuator](actuator#actuator.endpoints.kubernetes-probes). #### 1.2.1. Kubernetes Container Lifecycle When Kubernetes deletes an application instance, the shutdown process involves several subsystems concurrently: shutdown hooks, unregistering the service, removing the instance from the load-balancer…​ Because this shutdown processing happens in parallel (and due to the nature of distributed systems), there is a window during which traffic can be routed to a pod that has also begun its shutdown processing. You can configure a sleep execution in a preStop handler to avoid requests being routed to a pod that has already begun shutting down. This sleep should be long enough for new requests to stop being routed to the pod and its duration will vary from deployment to deployment. The preStop handler can be configured by using the PodSpec in the pod’s configuration file as follows: ``` spec: containers: - name: "example-container" image: "example-image" lifecycle: preStop: exec: command: ["sh", "-c", "sleep 10"] ``` Once the pre-stop hook has completed, SIGTERM will be sent to the container and [graceful shutdown](web#web.graceful-shutdown) will begin, allowing any remaining in-flight requests to complete. | | | | --- | --- | | | When Kubernetes sends a SIGTERM signal to the pod, it waits for a specified time called the termination grace period (the default for which is 30 seconds). If the containers are still running after the grace period, they are sent the SIGKILL signal and forcibly removed. If the pod takes longer than 30 seconds to shut down, which could be because you have increased `spring.lifecycle.timeout-per-shutdown-phase`, make sure to increase the termination grace period by setting the `terminationGracePeriodSeconds` option in the Pod YAML. | ### 1.3. Heroku Heroku is another popular PaaS platform. To customize Heroku builds, you provide a `Procfile`, which provides the incantation required to deploy an application. Heroku assigns a `port` for the Java application to use and then ensures that routing to the external URI works. You must configure your application to listen on the correct port. The following example shows the `Procfile` for our starter REST application: ``` web: java -Dserver.port=$PORT -jar target/demo-0.0.1-SNAPSHOT.jar ``` Spring Boot makes `-D` arguments available as properties accessible from a Spring `Environment` instance. The `server.port` configuration property is fed to the embedded Tomcat, Jetty, or Undertow instance, which then uses the port when it starts up. 
The `$PORT` environment variable is assigned to us by the Heroku PaaS. This should be everything you need. The most common deployment workflow for Heroku deployments is to `git push` the code to production, as shown in the following example: ``` $ git push heroku main ``` This results in output similar to the following: ``` Initializing repository, **done**. Counting objects: 95, **done**. Delta compression using up to 8 threads. Compressing objects: 100% (78/78), **done**. Writing objects: 100% (95/95), 8.66 MiB | 606.00 KiB/s, **done**. Total 95 (delta 31), reused 0 (delta 0) -----> Java app detected -----> Installing OpenJDK 1.8... **done** -----> Installing Maven 3.3.1... **done** -----> Installing settings.xml... **done** -----> Executing: mvn -B -DskipTests=true clean install [INFO] Scanning for projects... Downloading: https://repo.spring.io/... Downloaded: https://repo.spring.io/... (818 B at 1.8 KB/sec) .... Downloaded: https://s3pository.heroku.com/jvm/... (152 KB at 595.3 KB/sec) [INFO] Installing /tmp/build_0c35a5d2-a067-4abc-a232-14b1fb7a8229/target/... [INFO] Installing /tmp/build_0c35a5d2-a067-4abc-a232-14b1fb7a8229/pom.xml ... [INFO] ------------------------------------------------------------------------ [INFO] **BUILD SUCCESS** [INFO] ------------------------------------------------------------------------ [INFO] Total time: 59.358s [INFO] Finished at: Fri Mar 07 07:28:25 UTC 2014 [INFO] Final Memory: 20M/493M [INFO] ------------------------------------------------------------------------ -----> Discovering process types Procfile declares types -> **web** -----> Compressing... **done**, 70.4MB -----> Launching... **done**, v6 https://agile-sierra-1405.herokuapp.com/ **deployed to Heroku** To git@heroku.com:agile-sierra-1405.git * [new branch] main -> main ``` Your application should now be up and running on Heroku. For more details, see [Deploying Spring Boot Applications to Heroku](https://devcenter.heroku.com/articles/deploying-spring-boot-apps-to-heroku). ### 1.4. OpenShift [OpenShift](https://www.openshift.com/) has many resources describing how to deploy Spring Boot applications, including: * [Using the S2I builder](https://blog.openshift.com/using-openshift-enterprise-grade-spring-boot-deployments/) * [Architecture guide](https://access.redhat.com/documentation/en-us/reference_architectures/2017/html-single/spring_boot_microservices_on_red_hat_openshift_container_platform_3/) * [Running as a traditional web application on Wildfly](https://blog.openshift.com/using-spring-boot-on-openshift/) * [OpenShift Commons Briefing](https://blog.openshift.com/openshift-commons-briefing-96-cloud-native-applications-spring-rhoar/) ### 1.5. Amazon Web Services (AWS) Amazon Web Services offers multiple ways to install Spring Boot-based applications, either as traditional web applications (war) or as executable jar files with an embedded web server. The options include: * AWS Elastic Beanstalk * AWS Code Deploy * AWS OPS Works * AWS Cloud Formation * AWS Container Registry Each has different features and pricing models. In this document, we describe the approach of using AWS Elastic Beanstalk. #### 1.5.1. AWS Elastic Beanstalk As described in the official [Elastic Beanstalk Java guide](https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_Java.html), there are two main options to deploy a Java application. You can either use the “Tomcat Platform” or the “Java SE platform”. ##### Using the Tomcat Platform This option applies to Spring Boot projects that produce a war file.
No special configuration is required. You need only follow the official guide. ##### Using the Java SE Platform This option applies to Spring Boot projects that produce a jar file and run an embedded web container. Elastic Beanstalk environments run an nginx instance on port 80 to proxy the actual application, running on port 5000. To configure it, add the following line to your `application.properties` file: ``` server.port=5000 ``` | | | | --- | --- | | | Upload binaries instead of sources By default, Elastic Beanstalk uploads sources and compiles them in AWS. However, it is best to upload the binaries instead. To do so, add lines similar to the following to your `.elasticbeanstalk/config.yml` file: ``` deploy: artifact: target/demo-0.0.1-SNAPSHOT.jar ``` | | | | | --- | --- | | | Reduce costs by setting the environment type By default an Elastic Beanstalk environment is load balanced. The load balancer has a significant cost. To avoid that cost, set the environment type to “Single instance”, as described in [the Amazon documentation](https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/environments-create-wizard.html#environments-create-wizard-capacity). You can also create single instance environments by using the CLI and the following command: ``` eb create -s ``` | #### 1.5.2. Summary This is one of the easiest ways to get to AWS, but there are more things to cover, such as how to integrate Elastic Beanstalk into any CI / CD tool, use the Elastic Beanstalk Maven plugin instead of the CLI, and others. There is a [blog post](https://exampledriven.wordpress.com/2017/01/09/spring-boot-aws-elastic-beanstalk-example/) covering these topics more in detail. ### 1.6. CloudCaptain and Amazon Web Services [CloudCaptain](https://cloudcaptain.sh/) works by turning your Spring Boot executable jar or war into a minimal VM image that can be deployed unchanged either on VirtualBox or on AWS. CloudCaptain comes with deep integration for Spring Boot and uses the information from your Spring Boot configuration file to automatically configure ports and health check URLs. CloudCaptain leverages this information both for the images it produces as well as for all the resources it provisions (instances, security groups, elastic load balancers, and so on). Once you have created a [CloudCaptain account](https://console.cloudcaptain.sh), connected it to your AWS account, installed the latest version of the CloudCaptain Client, and ensured that the application has been built by Maven or Gradle (by using, for example, `mvn clean package`), you can deploy your Spring Boot application to AWS with a command similar to the following: ``` $ boxfuse run myapp-1.0.jar -env=prod ``` See the [`boxfuse run` documentation](https://cloudcaptain.sh/docs/commandline/run.html) for more options. If there is a [`boxfuse.conf`](https://cloudcaptain.sh/docs/commandline/#configuration) file present in the current directory, it is considered. | | | | --- | --- | | | By default, CloudCaptain activates a Spring profile named `boxfuse` on startup. If your executable jar or war contains an [`application-boxfuse.properties`](https://cloudcaptain.sh/docs/payloads/springboot.html#configuration) file, CloudCaptain bases its configuration on the properties it contains. | At this point, CloudCaptain creates an image for your application, uploads it, and configures and starts the necessary resources on AWS, resulting in output similar to the following example: ``` Fusing Image for myapp-1.0.jar ... 
Image fused in 00:06.838s (53937 K) -> axelfontaine/myapp:1.0 Creating axelfontaine/myapp ... Pushing axelfontaine/myapp:1.0 ... Verifying axelfontaine/myapp:1.0 ... Creating Elastic IP ... Mapping myapp-axelfontaine.boxfuse.io to 52.28.233.167 ... Waiting for AWS to create an AMI for axelfontaine/myapp:1.0 in eu-central-1 (this may take up to 50 seconds) ... AMI created in 00:23.557s -> ami-d23f38cf Creating security group boxfuse-sg_axelfontaine/myapp:1.0 ... Launching t2.micro instance of axelfontaine/myapp:1.0 (ami-d23f38cf) in eu-central-1 ... Instance launched in 00:30.306s -> i-92ef9f53 Waiting for AWS to boot Instance i-92ef9f53 and Payload to start at https://52.28.235.61/ ... Payload started in 00:29.266s -> https://52.28.235.61/ Remapping Elastic IP 52.28.233.167 to i-92ef9f53 ... Waiting 15s for AWS to complete Elastic IP Zero Downtime transition ... Deployment completed successfully. axelfontaine/myapp:1.0 is up and running at https://myapp-axelfontaine.boxfuse.io/ ``` Your application should now be up and running on AWS. See the blog post on [deploying Spring Boot apps on EC2](https://cloudcaptain.sh/blog/spring-boot-ec2.html) as well as the [documentation for the CloudCaptain Spring Boot integration](https://cloudcaptain.sh/docs/payloads/springboot.html) to get started with a Maven build to run the app. ### 1.7. Azure This [Getting Started guide](https://spring.io/guides/gs/spring-boot-for-azure/) walks you through deploying your Spring Boot application to either [Azure Spring Cloud](https://azure.microsoft.com/en-us/services/spring-cloud/) or [Azure App Service](https://docs.microsoft.com/en-us/azure/app-service/overview). ### 1.8. Google Cloud Google Cloud has several options that can be used to launch Spring Boot applications. The easiest to get started with is probably App Engine, but you could also find ways to run Spring Boot in a container with Container Engine or on a virtual machine with Compute Engine. To run in App Engine, you can create a project in the UI first, which sets up a unique identifier for you and also sets up HTTP routes. Add a Java app to the project and leave it empty and then use the [Google Cloud SDK](https://cloud.google.com/sdk/install) to push your Spring Boot app into that slot from the command line or CI build. App Engine Standard requires you to use WAR packaging. Follow [these steps](https://github.com/GoogleCloudPlatform/java-docs-samples/tree/master/appengine-java8/springboot-helloworld/README.md) to deploy App Engine Standard application to Google Cloud. Alternatively, App Engine Flex requires you to create an `app.yaml` file to describe the resources your app requires. Normally, you put this file in `src/main/appengine`, and it should resemble the following file: ``` service: "default" runtime: "java" env: "flex" runtime_config: jdk: "openjdk8" handlers: - url: "/.*" script: "this field is required, but ignored" manual_scaling: instances: 1 health_check: enable_health_check: false env_variables: ENCRYPT_KEY: "your_encryption_key_here" ``` You can deploy the app (for example, with a Maven plugin) by adding the project ID to the build configuration, as shown in the following example: ``` <plugin> <groupId>com.google.cloud.tools</groupId> <artifactId>appengine-maven-plugin</artifactId> <version>1.3.0</version> <configuration> <project>myproject</project> </configuration> </plugin> ``` Then deploy with `mvn appengine:deploy` (if you need to authenticate first, the build fails). 2. 
Installing Spring Boot Applications --------------------------------------- In addition to running Spring Boot applications by using `java -jar`, it is also possible to make fully executable applications for Unix systems. A fully executable jar can be executed like any other executable binary or it can be [registered with `init.d` or `systemd`](#deployment.installing.nix-services). This helps when installing and managing Spring Boot applications in common production environments. | | | | --- | --- | | | Fully executable jars work by embedding an extra script at the front of the file. Currently, some tools do not accept this format, so you may not always be able to use this technique. For example, `jar -xf` may silently fail to extract a jar or war that has been made fully executable. It is recommended that you make your jar or war fully executable only if you intend to execute it directly, rather than running it with `java -jar` or deploying it to a servlet container. | | | | | --- | --- | | | A zip64-format jar file cannot be made fully executable. Attempting to do so will result in a jar file that is reported as corrupt when executed directly or with `java -jar`. A standard-format jar file that contains one or more zip64-format nested jars can be fully executable. | To create a ‘fully executable’ jar with Maven, use the following plugin configuration: ``` <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <configuration> <executable>true</executable> </configuration> </plugin> ``` The following example shows the equivalent Gradle configuration: ``` tasks.named('bootJar') { launchScript() } ``` You can then run your application by typing `./my-application.jar` (where `my-application` is the name of your artifact). The directory containing the jar is used as your application’s working directory. ### 2.1. Supported Operating Systems The default script supports most Linux distributions and is tested on CentOS and Ubuntu. Other platforms, such as OS X and FreeBSD, require the use of a custom `embeddedLaunchScript`. ### 2.2. Unix/Linux Services A Spring Boot application can be easily started as a Unix/Linux service by using either `init.d` or `systemd`. #### 2.2.1. Installation as an init.d Service (System V) If you configured Spring Boot’s Maven or Gradle plugin to generate a [fully executable jar](#deployment.installing), and you do not use a custom `embeddedLaunchScript`, your application can be used as an `init.d` service. To do so, symlink the jar to `init.d` to support the standard `start`, `stop`, `restart`, and `status` commands. The script supports the following features: * Starts the service as the user that owns the jar file * Tracks the application’s PID by using `/var/run/<appname>/<appname>.pid` * Writes console logs to `/var/log/<appname>.log` Assuming that you have a Spring Boot application installed in `/var/myapp`, to install a Spring Boot application as an `init.d` service, create a symlink, as follows: ``` $ sudo ln -s /var/myapp/myapp.jar /etc/init.d/myapp ``` Once installed, you can start and stop the service in the usual way. For example, on a Debian-based system, you could start it with the following command: ``` $ service myapp start ``` | | | | --- | --- | | | If your application fails to start, check the log file written to `/var/log/<appname>.log` for errors. | You can also flag the application to start automatically by using your standard operating system tools.
For example, on Debian, you could use the following command: ``` $ update-rc.d myapp defaults <priority> ``` ##### Securing an init.d Service | | | | --- | --- | | | The following is a set of guidelines on how to secure a Spring Boot application that runs as an init.d service. It is not intended to be an exhaustive list of everything that should be done to harden an application and the environment in which it runs. | When executed as root, as is the case when root is being used to start an init.d service, the default executable script runs the application as the user specified in the `RUN_AS_USER` environment variable. When the environment variable is not set, the user who owns the jar file is used instead. You should never run a Spring Boot application as `root`, so `RUN_AS_USER` should never be root and your application’s jar file should never be owned by root. Instead, create a specific user to run your application and set the `RUN_AS_USER` environment variable or use `chown` to make it the owner of the jar file, as shown in the following example: ``` $ chown bootapp:bootapp your-app.jar ``` In this case, the default executable script runs the application as the `bootapp` user. | | | | --- | --- | | | To reduce the chances of the application’s user account being compromised, you should consider preventing it from using a login shell. For example, you can set the account’s shell to `/usr/sbin/nologin`. | You should also take steps to prevent the modification of your application’s jar file. Firstly, configure its permissions so that it cannot be written and can only be read or executed by its owner, as shown in the following example: ``` $ chmod 500 your-app.jar ``` Second, you should also take steps to limit the damage if your application or the account that is running it is compromised. If an attacker does gain access, they could make the jar file writable and change its contents. One way to protect against this is to make it immutable by using `chattr`, as shown in the following example: ``` $ sudo chattr +i your-app.jar ``` This will prevent any user, including root, from modifying the jar. If root is used to control the application’s service and you [use a `.conf` file](#deployment.installing.nix-services.script-customization.when-running.conf-file) to customize its startup, the `.conf` file is read and evaluated by the root user. It should be secured accordingly. Use `chmod` so that the file can only be read by the owner and use `chown` to make root the owner, as shown in the following example: ``` $ chmod 400 your-app.conf $ sudo chown root:root your-app.conf ``` #### 2.2.2. Installation as a systemd Service `systemd` is the successor of the System V init system and is now being used by many modern Linux distributions. Although you can continue to use `init.d` scripts with `systemd`, it is also possible to launch Spring Boot applications by using `systemd` ‘service’ scripts. Assuming that you have a Spring Boot application installed in `/var/myapp`, to install a Spring Boot application as a `systemd` service, create a script named `myapp.service` and place it in `/etc/systemd/system` directory. The following script offers an example: ``` [Unit] Description=myapp After=syslog.target [Service] User=myapp ExecStart=/var/myapp/myapp.jar SuccessExitStatus=143 [Install] WantedBy=multi-user.target ``` | | | | --- | --- | | | Remember to change the `Description`, `User`, and `ExecStart` fields for your application. 
| | | | | --- | --- | | | The `ExecStart` field does not declare the script action command, which means that the `run` command is used by default. | Note that, unlike when running as an `init.d` service, the user that runs the application, the PID file, and the console log file are managed by `systemd` itself and therefore must be configured by using appropriate fields in the ‘service’ script. Consult the [service unit configuration man page](https://www.freedesktop.org/software/systemd/man/systemd.service.html) for more details. To flag the application to start automatically on system boot, use the following command: ``` $ systemctl enable myapp.service ``` Run `man systemctl` for more details. #### 2.2.3. Customizing the Startup Script The default embedded startup script written by the Maven or Gradle plugin can be customized in a number of ways. For most people, using the default script along with a few customizations is usually enough. If you find you cannot customize something that you need to, use the `embeddedLaunchScript` option to write your own file entirely. ##### Customizing the Start Script When It Is Written It often makes sense to customize elements of the start script as it is written into the jar file. For example, init.d scripts can provide a “description”. Since you know the description up front (and it need not change), you may as well provide it when the jar is generated. To customize written elements, use the `embeddedLaunchScriptProperties` option of the Spring Boot Maven plugin or the [`properties` property of the Spring Boot Gradle plugin’s `launchScript`](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/#packaging-executable-configuring-launch-script). The following property substitutions are supported with the default script: | Name | Description | Gradle default | Maven default | | --- | --- | --- | --- | | `mode` | The script mode. | `auto` | `auto` | | `initInfoProvides` | The `Provides` section of “INIT INFO” | `${task.baseName}` | `${project.artifactId}` | | `initInfoRequiredStart` | `Required-Start` section of “INIT INFO”. | `$remote_fs $syslog $network` | `$remote_fs $syslog $network` | | `initInfoRequiredStop` | `Required-Stop` section of “INIT INFO”. | `$remote_fs $syslog $network` | `$remote_fs $syslog $network` | | `initInfoDefaultStart` | `Default-Start` section of “INIT INFO”. | `2 3 4 5` | `2 3 4 5` | | `initInfoDefaultStop` | `Default-Stop` section of “INIT INFO”. | `0 1 6` | `0 1 6` | | `initInfoShortDescription` | `Short-Description` section of “INIT INFO”. | Single-line version of `${project.description}` (falling back to `${task.baseName}`) | `${project.name}` | | `initInfoDescription` | `Description` section of “INIT INFO”. | `${project.description}` (falling back to `${task.baseName}`) | `${project.description}` (falling back to `${project.name}`) | | `initInfoChkconfig` | `chkconfig` section of “INIT INFO” | `2345 99 01` | `2345 99 01` | | `confFolder` | The default value for `CONF_FOLDER` | Folder containing the jar | Folder containing the jar | | `inlinedConfScript` | Reference to a file script that should be inlined in the default launch script. This can be used to set environmental variables such as `JAVA_OPTS` before any external config files are loaded | | | | `logFolder` | Default value for `LOG_FOLDER`. Only valid for an `init.d` service | | | | `logFilename` | Default value for `LOG_FILENAME`. Only valid for an `init.d` service | | | | `pidFolder` | Default value for `PID_FOLDER`. 
Only valid for an `init.d` service | | | | `pidFilename` | Default value for the name of the PID file in `PID_FOLDER`. Only valid for an `init.d` service | | | | `useStartStopDaemon` | Whether the `start-stop-daemon` command, when it is available, should be used to control the process | `true` | `true` | | `stopWaitTime` | Default value for `STOP_WAIT_TIME` in seconds. Only valid for an `init.d` service | 60 | 60 | ##### Customizing a Script When It Runs For items of the script that need to be customized *after* the jar has been written, you can use environment variables or a [config file](#deployment.installing.nix-services.script-customization.when-running.conf-file). The following environment properties are supported with the default script: | Variable | Description | | --- | --- | | `MODE` | The “mode” of operation. The default depends on the way the jar was built but is usually `auto` (meaning it tries to guess if it is an init script by checking if it is a symlink in a directory called `init.d`). You can explicitly set it to `service` so that the `stop|start|status|restart` commands work or to `run` if you want to run the script in the foreground. | | `RUN_AS_USER` | The user that will be used to run the application. When not set, the user that owns the jar file will be used. | | `USE_START_STOP_DAEMON` | Whether the `start-stop-daemon` command, when it is available, should be used to control the process. Defaults to `true`. | | `PID_FOLDER` | The root name of the pid folder (`/var/run` by default). | | `LOG_FOLDER` | The name of the folder in which to put log files (`/var/log` by default). | | `CONF_FOLDER` | The name of the folder from which to read .conf files (same folder as jar-file by default). | | `LOG_FILENAME` | The name of the log file in the `LOG_FOLDER` (`<appname>.log` by default). | | `APP_NAME` | The name of the app. If the jar is run from a symlink, the script guesses the app name. If it is not a symlink or you want to explicitly set the app name, this can be useful. | | `RUN_ARGS` | The arguments to pass to the program (the Spring Boot app). | | `JAVA_HOME` | The location of the `java` executable is discovered by using the `PATH` by default, but you can set it explicitly if there is an executable file at `$JAVA_HOME/bin/java`. | | `JAVA_OPTS` | Options that are passed to the JVM when it is launched. | | `JARFILE` | The explicit location of the jar file, in case the script is being used to launch a jar that it is not actually embedded. | | `DEBUG` | If not empty, sets the `-x` flag on the shell process, allowing you to see the logic in the script. | | `STOP_WAIT_TIME` | The time in seconds to wait when stopping the application before forcing a shutdown (`60` by default). | | | | | --- | --- | | | The `PID_FOLDER`, `LOG_FOLDER`, and `LOG_FILENAME` variables are only valid for an `init.d` service. For `systemd`, the equivalent customizations are made by using the ‘service’ script. See the [service unit configuration man page](https://www.freedesktop.org/software/systemd/man/systemd.service.html) for more details. | With the exception of `JARFILE` and `APP_NAME`, the settings listed in the preceding section can be configured by using a `.conf` file. The file is expected to be next to the jar file and have the same name but suffixed with `.conf` rather than `.jar`. 
For example, a jar named `/var/myapp/myapp.jar` uses the configuration file named `/var/myapp/myapp.conf`, as shown in the following example: myapp.conf ``` JAVA_OPTS=-Xmx1024M LOG_FOLDER=/custom/log/folder ``` | | | | --- | --- | | | If you do not like having the config file next to the jar file, you can set a `CONF_FOLDER` environment variable to customize the location of the config file. | To learn about securing this file appropriately, see [the guidelines for securing an init.d service](#deployment.installing.nix-services.init-d.securing). ### 2.3. Microsoft Windows Services A Spring Boot application can be started as a Windows service by using [`winsw`](https://github.com/kohsuke/winsw). A [separately maintained sample](https://github.com/snicoll/spring-boot-daemon) describes step-by-step how you can create a Windows service for your Spring Boot application. 3. What to Read Next --------------------- See the [Cloud Foundry](https://www.cloudfoundry.org/), [Heroku](https://www.heroku.com/), [OpenShift](https://www.openshift.com), and [Boxfuse](https://boxfuse.com) web sites for more information about the kinds of features that a PaaS can offer. These are just four of the most popular Java PaaS providers. Since Spring Boot is so amenable to cloud-based deployment, you can freely consider other providers as well. The next section goes on to cover the *[Spring Boot CLI](cli#cli)*, or you can jump ahead to read about *[build tool plugins](build-tool-plugins#build-tool-plugins)*.
spring_boot Developing with Spring Boot Developing with Spring Boot =========================== This section goes into more detail about how you should use Spring Boot. It covers topics such as build systems, auto-configuration, and how to run your applications. We also cover some Spring Boot best practices. Although there is nothing particularly special about Spring Boot (it is just another library that you can consume), there are a few recommendations that, when followed, make your development process a little easier. If you are starting out with Spring Boot, you should probably read the *[Getting Started](getting-started#getting-started)* guide before diving into this section. 1. Build Systems ----------------- It is strongly recommended that you choose a build system that supports [*dependency management*](#using.build-systems.dependency-management) and that can consume artifacts published to the “Maven Central” repository. We would recommend that you choose Maven or Gradle. It is possible to get Spring Boot to work with other build systems (Ant, for example), but they are not particularly well supported. ### 1.1. Dependency Management Each release of Spring Boot provides a curated list of dependencies that it supports. In practice, you do not need to provide a version for any of these dependencies in your build configuration, as Spring Boot manages that for you. When you upgrade Spring Boot itself, these dependencies are upgraded as well in a consistent way. | | | | --- | --- | | | You can still specify a version and override Spring Boot’s recommendations if you need to do so. | The curated list contains all the Spring modules that you can use with Spring Boot as well as a refined list of third party libraries. The list is available as a standard Bills of Materials (`spring-boot-dependencies`) that can be used with both [Maven](#using.build-systems.maven) and [Gradle](#using.build-systems.gradle). | | | | --- | --- | | | Each release of Spring Boot is associated with a base version of the Spring Framework. We **highly** recommend that you not specify its version. | ### 1.2. Maven To learn about using Spring Boot with Maven, see the documentation for Spring Boot’s Maven plugin: * Reference ([HTML](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/) and [PDF](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/pdf/spring-boot-maven-plugin-reference.pdf)) * [API](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/api/) ### 1.3. Gradle To learn about using Spring Boot with Gradle, see the documentation for Spring Boot’s Gradle plugin: * Reference ([HTML](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/) and [PDF](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/pdf/spring-boot-gradle-plugin-reference.pdf)) * [API](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/api/) ### 1.4. Ant It is possible to build a Spring Boot project using Apache Ant+Ivy. The `spring-boot-antlib` “AntLib” module is also available to help Ant create executable jars. 
To declare dependencies, a typical `ivy.xml` file looks something like the following example: ``` <ivy-module version="2.0"> <info organisation="org.springframework.boot" module="spring-boot-sample-ant" /> <configurations> <conf name="compile" description="everything needed to compile this module" /> <conf name="runtime" extends="compile" description="everything needed to run this module" /> </configurations> <dependencies> <dependency org="org.springframework.boot" name="spring-boot-starter" rev="${spring-boot.version}" conf="compile" /> </dependencies> </ivy-module> ``` A typical `build.xml` looks like the following example: ``` <project xmlns:ivy="antlib:org.apache.ivy.ant" xmlns:spring-boot="antlib:org.springframework.boot.ant" name="myapp" default="build"> <property name="spring-boot.version" value="2.7.0" /> <target name="resolve" description="--> retrieve dependencies with ivy"> <ivy:retrieve pattern="lib/[conf]/[artifact]-[type]-[revision].[ext]" /> </target> <target name="classpaths" depends="resolve"> <path id="compile.classpath"> <fileset dir="lib/compile" includes="*.jar" /> </path> </target> <target name="init" depends="classpaths"> <mkdir dir="build/classes" /> </target> <target name="compile" depends="init" description="compile"> <javac srcdir="src/main/java" destdir="build/classes" classpathref="compile.classpath" /> </target> <target name="build" depends="compile"> <spring-boot:exejar destfile="build/myapp.jar" classes="build/classes"> <spring-boot:lib> <fileset dir="lib/runtime" /> </spring-boot:lib> </spring-boot:exejar> </target> </project> ``` | | | | --- | --- | | | If you do not want to use the `spring-boot-antlib` module, see the *[howto.html](howto#howto.build.build-an-executable-archive-with-ant-without-using-spring-boot-antlib)* “How-to” . | ### 1.5. Starters Starters are a set of convenient dependency descriptors that you can include in your application. You get a one-stop shop for all the Spring and related technologies that you need without having to hunt through sample code and copy-paste loads of dependency descriptors. For example, if you want to get started using Spring and JPA for database access, include the `spring-boot-starter-data-jpa` dependency in your project. The starters contain a lot of the dependencies that you need to get a project up and running quickly and with a consistent, supported set of managed transitive dependencies. What is in a name All **official** starters follow a similar naming pattern; `spring-boot-starter-*`, where `*` is a particular type of application. This naming structure is intended to help when you need to find a starter. The Maven integration in many IDEs lets you search dependencies by name. For example, with the appropriate Eclipse or Spring Tools plugin installed, you can press `ctrl-space` in the POM editor and type “spring-boot-starter” for a complete list. As explained in the “[Creating Your Own Starter](features#features.developing-auto-configuration.custom-starter)” section, third party starters should not start with `spring-boot`, as it is reserved for official Spring Boot artifacts. Rather, a third-party starter typically starts with the name of the project. For example, a third-party starter project called `thirdpartyproject` would typically be named `thirdpartyproject-spring-boot-starter`. The following application starters are provided by Spring Boot under the `org.springframework.boot` group: Table 1. 
Spring Boot application starters | Name | Description | | --- | --- | | `spring-boot-starter` | Core starter, including auto-configuration support, logging and YAML | | `spring-boot-starter-activemq` | Starter for JMS messaging using Apache ActiveMQ | | `spring-boot-starter-amqp` | Starter for using Spring AMQP and Rabbit MQ | | `spring-boot-starter-aop` | Starter for aspect-oriented programming with Spring AOP and AspectJ | | `spring-boot-starter-artemis` | Starter for JMS messaging using Apache Artemis | | `spring-boot-starter-batch` | Starter for using Spring Batch | | `spring-boot-starter-cache` | Starter for using Spring Framework’s caching support | | `spring-boot-starter-data-cassandra` | Starter for using Cassandra distributed database and Spring Data Cassandra | | `spring-boot-starter-data-cassandra-reactive` | Starter for using Cassandra distributed database and Spring Data Cassandra Reactive | | `spring-boot-starter-data-couchbase` | Starter for using Couchbase document-oriented database and Spring Data Couchbase | | `spring-boot-starter-data-couchbase-reactive` | Starter for using Couchbase document-oriented database and Spring Data Couchbase Reactive | | `spring-boot-starter-data-elasticsearch` | Starter for using Elasticsearch search and analytics engine and Spring Data Elasticsearch | | `spring-boot-starter-data-jdbc` | Starter for using Spring Data JDBC | | `spring-boot-starter-data-jpa` | Starter for using Spring Data JPA with Hibernate | | `spring-boot-starter-data-ldap` | Starter for using Spring Data LDAP | | `spring-boot-starter-data-mongodb` | Starter for using MongoDB document-oriented database and Spring Data MongoDB | | `spring-boot-starter-data-mongodb-reactive` | Starter for using MongoDB document-oriented database and Spring Data MongoDB Reactive | | `spring-boot-starter-data-neo4j` | Starter for using Neo4j graph database and Spring Data Neo4j | | `spring-boot-starter-data-r2dbc` | Starter for using Spring Data R2DBC | | `spring-boot-starter-data-redis` | Starter for using Redis key-value data store with Spring Data Redis and the Lettuce client | | `spring-boot-starter-data-redis-reactive` | Starter for using Redis key-value data store with Spring Data Redis reactive and the Lettuce client | | `spring-boot-starter-data-rest` | Starter for exposing Spring Data repositories over REST using Spring Data REST | | `spring-boot-starter-freemarker` | Starter for building MVC web applications using FreeMarker views | | `spring-boot-starter-graphql` | Starter for building GraphQL applications with Spring GraphQL | | `spring-boot-starter-groovy-templates` | Starter for building MVC web applications using Groovy Templates views | | `spring-boot-starter-hateoas` | Starter for building hypermedia-based RESTful web application with Spring MVC and Spring HATEOAS | | `spring-boot-starter-integration` | Starter for using Spring Integration | | `spring-boot-starter-jdbc` | Starter for using JDBC with the HikariCP connection pool | | `spring-boot-starter-jersey` | Starter for building RESTful web applications using JAX-RS and Jersey. An alternative to [`spring-boot-starter-web`](#spring-boot-starter-web) | | `spring-boot-starter-jooq` | Starter for using jOOQ to access SQL databases with JDBC. 
An alternative to [`spring-boot-starter-data-jpa`](#spring-boot-starter-data-jpa) or [`spring-boot-starter-jdbc`](#spring-boot-starter-jdbc) | | `spring-boot-starter-json` | Starter for reading and writing json | | `spring-boot-starter-jta-atomikos` | Starter for JTA transactions using Atomikos | | `spring-boot-starter-mail` | Starter for using Java Mail and Spring Framework’s email sending support | | `spring-boot-starter-mustache` | Starter for building web applications using Mustache views | | `spring-boot-starter-oauth2-client` | Starter for using Spring Security’s OAuth2/OpenID Connect client features | | `spring-boot-starter-oauth2-resource-server` | Starter for using Spring Security’s OAuth2 resource server features | | `spring-boot-starter-quartz` | Starter for using the Quartz scheduler | | `spring-boot-starter-rsocket` | Starter for building RSocket clients and servers | | `spring-boot-starter-security` | Starter for using Spring Security | | `spring-boot-starter-test` | Starter for testing Spring Boot applications with libraries including JUnit Jupiter, Hamcrest and Mockito | | `spring-boot-starter-thymeleaf` | Starter for building MVC web applications using Thymeleaf views | | `spring-boot-starter-validation` | Starter for using Java Bean Validation with Hibernate Validator | | `spring-boot-starter-web` | Starter for building web, including RESTful, applications using Spring MVC. Uses Tomcat as the default embedded container | | `spring-boot-starter-web-services` | Starter for using Spring Web Services | | `spring-boot-starter-webflux` | Starter for building WebFlux applications using Spring Framework’s Reactive Web support | | `spring-boot-starter-websocket` | Starter for building WebSocket applications using Spring Framework’s WebSocket support | In addition to the application starters, the following starters can be used to add *[production ready](actuator#actuator)* features: Table 2. Spring Boot production starters | Name | Description | | --- | --- | | `spring-boot-starter-actuator` | Starter for using Spring Boot’s Actuator which provides production ready features to help you monitor and manage your application | Finally, Spring Boot also includes the following starters that can be used if you want to exclude or swap specific technical facets: Table 3. Spring Boot technical starters | Name | Description | | --- | --- | | `spring-boot-starter-jetty` | Starter for using Jetty as the embedded servlet container. An alternative to [`spring-boot-starter-tomcat`](#spring-boot-starter-tomcat) | | `spring-boot-starter-log4j2` | Starter for using Log4j2 for logging. An alternative to [`spring-boot-starter-logging`](#spring-boot-starter-logging) | | `spring-boot-starter-logging` | Starter for logging using Logback. Default logging starter | | `spring-boot-starter-reactor-netty` | Starter for using Reactor Netty as the embedded reactive HTTP server. | | `spring-boot-starter-tomcat` | Starter for using Tomcat as the embedded servlet container. Default servlet container starter used by [`spring-boot-starter-web`](#spring-boot-starter-web) | | `spring-boot-starter-undertow` | Starter for using Undertow as the embedded servlet container. An alternative to [`spring-boot-starter-tomcat`](#spring-boot-starter-tomcat) | To learn how to swap technical facets, please see the how-to documentation for [swapping web server](howto#howto.webserver.use-another) and [logging system](howto#howto.logging.log4j). 
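As a brief illustration of what such a swap can involve (the linked how-to guides are the authoritative reference), the following minimal Maven sketch excludes the default Tomcat starter from `spring-boot-starter-web` and adds `spring-boot-starter-jetty` instead:

```
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <exclusions>
        <!-- Drop the default embedded servlet container -->
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-tomcat</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- Use Jetty as the embedded servlet container instead -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jetty</artifactId>
</dependency>
```

The same effect can be achieved in Gradle by excluding the `spring-boot-starter-tomcat` module and adding the Jetty starter dependency. No versions are given here because Spring Boot’s dependency management is assumed to supply them.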
| | | | --- | --- | | | For a list of additional community contributed starters, see the [README file](https://github.com/spring-projects/spring-boot/tree/main/spring-boot-project/spring-boot-starters/README.adoc) in the `spring-boot-starters` module on GitHub. | 2. Structuring Your Code ------------------------- Spring Boot does not require any specific code layout to work. However, there are some best practices that help. ### 2.1. Using the “default” Package When a class does not include a `package` declaration, it is considered to be in the “default package”. The use of the “default package” is generally discouraged and should be avoided. It can cause particular problems for Spring Boot applications that use the `@ComponentScan`, `@ConfigurationPropertiesScan`, `@EntityScan`, or `@SpringBootApplication` annotations, since every class from every jar is read. | | | | --- | --- | | | We recommend that you follow Java’s recommended package naming conventions and use a reversed domain name (for example, `com.example.project`). | ### 2.2. Locating the Main Application Class We generally recommend that you locate your main application class in a root package above other classes. The [`@SpringBootApplication` annotation](#using.using-the-springbootapplication-annotation) is often placed on your main class, and it implicitly defines a base “search package” for certain items. For example, if you are writing a JPA application, the package of the `@SpringBootApplication` annotated class is used to search for `@Entity` items. Using a root package also allows component scanning to apply only to your project. | | | | --- | --- | | | If you do not want to use `@SpringBootApplication`, the `@EnableAutoConfiguration` and `@ComponentScan` annotations that it imports define that behavior, so you can use those instead. | The following listing shows a typical layout: ``` com +- example +- myapplication +- MyApplication.java | +- customer | +- Customer.java | +- CustomerController.java | +- CustomerService.java | +- CustomerRepository.java | +- order +- Order.java +- OrderController.java +- OrderService.java +- OrderRepository.java ``` The `MyApplication.java` file would declare the `main` method, along with the basic `@SpringBootApplication`, as follows: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; @SpringBootApplication public class MyApplication { public static void main(String[] args) { SpringApplication.run(MyApplication.class, args); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.runApplication @SpringBootApplication class MyApplication fun main(args: Array<String>) { runApplication<MyApplication>(\*args) } ``` 3. Configuration Classes ------------------------- Spring Boot favors Java-based configuration. Although it is possible to use `SpringApplication` with XML sources, we generally recommend that your primary source be a single `@Configuration` class. Usually the class that defines the `main` method is a good candidate as the primary `@Configuration`. | | | | --- | --- | | | Many Spring configuration examples have been published on the Internet that use XML configuration. If possible, always try to use the equivalent Java-based configuration. Searching for `Enable*` annotations can be a good starting point. | ### 3.1. Importing Additional Configuration Classes You need not put all your `@Configuration` into a single class. 
The `@Import` annotation can be used to import additional configuration classes. Alternatively, you can use `@ComponentScan` to automatically pick up all Spring components, including `@Configuration` classes. ### 3.2. Importing XML Configuration If you absolutely must use XML based configuration, we recommend that you still start with a `@Configuration` class. You can then use an `@ImportResource` annotation to load XML configuration files. 4. Auto-configuration ---------------------- Spring Boot auto-configuration attempts to automatically configure your Spring application based on the jar dependencies that you have added. For example, if `HSQLDB` is on your classpath, and you have not manually configured any database connection beans, then Spring Boot auto-configures an in-memory database. You need to opt-in to auto-configuration by adding the `@EnableAutoConfiguration` or `@SpringBootApplication` annotations to one of your `@Configuration` classes. | | | | --- | --- | | | You should only ever add one `@SpringBootApplication` or `@EnableAutoConfiguration` annotation. We generally recommend that you add one or the other to your primary `@Configuration` class only. | ### 4.1. Gradually Replacing Auto-configuration Auto-configuration is non-invasive. At any point, you can start to define your own configuration to replace specific parts of the auto-configuration. For example, if you add your own `DataSource` bean, the default embedded database support backs away. If you need to find out what auto-configuration is currently being applied, and why, start your application with the `--debug` switch. Doing so enables debug logs for a selection of core loggers and logs a conditions report to the console. ### 4.2. Disabling Specific Auto-configuration Classes If you find that specific auto-configuration classes that you do not want are being applied, you can use the exclude attribute of `@SpringBootApplication` to disable them, as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration; @SpringBootApplication(exclude = { DataSourceAutoConfiguration.class }) public class MyApplication { } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration @SpringBootApplication(exclude = [DataSourceAutoConfiguration::class]) class MyApplication ``` If the class is not on the classpath, you can use the `excludeName` attribute of the annotation and specify the fully qualified name instead. If you prefer to use `@EnableAutoConfiguration` rather than `@SpringBootApplication`, `exclude` and `excludeName` are also available. Finally, you can also control the list of auto-configuration classes to exclude by using the `spring.autoconfigure.exclude` property. | | | | --- | --- | | | You can define exclusions both at the annotation level and by using the property. | | | | | --- | --- | | | Even though auto-configuration classes are `public`, the only aspect of the class that is considered public API is the name of the class which can be used for disabling the auto-configuration. The actual contents of those classes, such as nested configuration classes or bean methods are for internal use only and we do not recommend using those directly. | 5. 
Spring Beans and Dependency Injection ----------------------------------------- You are free to use any of the standard Spring Framework techniques to define your beans and their injected dependencies. We generally recommend using constructor injection to wire up dependencies and `@ComponentScan` to find beans. If you structure your code as suggested above (locating your application class in a top package), you can add `@ComponentScan` without any arguments or use the `@SpringBootApplication` annotation which implicitly includes it. All of your application components (`@Component`, `@Service`, `@Repository`, `@Controller`, and others) are automatically registered as Spring Beans. The following example shows a `@Service` Bean that uses constructor injection to obtain a required `RiskAssessor` bean: Java ``` import org.springframework.stereotype.Service; @Service public class MyAccountService implements AccountService { private final RiskAssessor riskAssessor; public MyAccountService(RiskAssessor riskAssessor) { this.riskAssessor = riskAssessor; } // ... } ``` Kotlin ``` import org.springframework.stereotype.Service @Service class MyAccountService(private val riskAssessor: RiskAssessor) : AccountService ``` If a bean has more than one constructor, you will need to mark the one you want Spring to use with `@Autowired`: Java ``` import java.io.PrintStream; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Service; @Service public class MyAccountService implements AccountService { private final RiskAssessor riskAssessor; private final PrintStream out; @Autowired public MyAccountService(RiskAssessor riskAssessor) { this.riskAssessor = riskAssessor; this.out = System.out; } public MyAccountService(RiskAssessor riskAssessor, PrintStream out) { this.riskAssessor = riskAssessor; this.out = out; } // ... } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.stereotype.Service import java.io.PrintStream @Service class MyAccountService : AccountService { private val riskAssessor: RiskAssessor private val out: PrintStream @Autowired constructor(riskAssessor: RiskAssessor) { this.riskAssessor = riskAssessor out = System.out } constructor(riskAssessor: RiskAssessor, out: PrintStream) { this.riskAssessor = riskAssessor this.out = out } // ... } ``` | | | | --- | --- | | | Notice how using constructor injection lets the `riskAssessor` field be marked as `final`, indicating that it cannot be subsequently changed. | 6. Using the @SpringBootApplication Annotation ----------------------------------------------- Many Spring Boot developers like their apps to use auto-configuration, component scan and be able to define extra configuration on their "application class". A single `@SpringBootApplication` annotation can be used to enable those three features, that is: * `@EnableAutoConfiguration`: enable [Spring Boot’s auto-configuration mechanism](#using.auto-configuration) * `@ComponentScan`: enable `@Component` scan on the package where the application is located (see [the best practices](#using.structuring-your-code)) * `@SpringBootConfiguration`: enable registration of extra beans in the context or the import of additional configuration classes. An alternative to Spring’s standard `@Configuration` that aids [configuration detection](features#features.testing.spring-boot-applications.detecting-configuration) in your integration tests. 
Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; // Same as @SpringBootConfiguration @EnableAutoConfiguration @ComponentScan @SpringBootApplication public class MyApplication { public static void main(String[] args) { SpringApplication.run(MyApplication.class, args); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.runApplication // same as @SpringBootConfiguration @EnableAutoConfiguration @ComponentScan @SpringBootApplication class MyApplication fun main(args: Array<String>) { runApplication<MyApplication>(\*args) } ``` | | | | --- | --- | | | `@SpringBootApplication` also provides aliases to customize the attributes of `@EnableAutoConfiguration` and `@ComponentScan`. | | | | | --- | --- | | | None of these features are mandatory and you may choose to replace this single annotation by any of the features that it enables. For instance, you may not want to use component scan or configuration properties scan in your application: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.SpringBootConfiguration; import org.springframework.boot.autoconfigure.EnableAutoConfiguration; import org.springframework.context.annotation.Import; @SpringBootConfiguration(proxyBeanMethods = false) @EnableAutoConfiguration @Import({ SomeConfiguration.class, AnotherConfiguration.class }) public class MyApplication { public static void main(String[] args) { SpringApplication.run(MyApplication.class, args); } } ``` Kotlin ``` import org.springframework.boot.SpringBootConfiguration import org.springframework.boot.autoconfigure.EnableAutoConfiguration import org.springframework.boot.docs.using.structuringyourcode.locatingthemainclass.MyApplication import org.springframework.boot.runApplication import org.springframework.context.annotation.Import @SpringBootConfiguration(proxyBeanMethods = false) @EnableAutoConfiguration @Import(SomeConfiguration::class, AnotherConfiguration::class) class MyApplication fun main(args: Array<String>) { runApplication<MyApplication>(\*args) } ``` In this example, `MyApplication` is just like any other Spring Boot application except that `@Component`-annotated classes and `@ConfigurationProperties`-annotated classes are not detected automatically and the user-defined beans are imported explicitly (see `@Import`). | 7. Running Your Application ---------------------------- One of the biggest advantages of packaging your application as a jar and using an embedded HTTP server is that you can run your application as you would any other. The same applies to debugging Spring Boot applications. You do not need any special IDE plugins or extensions. | | | | --- | --- | | | This section only covers jar-based packaging. If you choose to package your application as a war file, see your server and IDE documentation. | ### 7.1. Running from an IDE You can run a Spring Boot application from your IDE as a Java application. However, you first need to import your project. Import steps vary depending on your IDE and build system. Most IDEs can import Maven projects directly. For example, Eclipse users can select `Import…` → `Existing Maven Projects` from the `File` menu. If you cannot directly import your project into your IDE, you may be able to generate IDE metadata by using a build plugin. 
Maven includes plugins for [Eclipse](https://maven.apache.org/plugins/maven-eclipse-plugin/) and [IDEA](https://maven.apache.org/plugins/maven-idea-plugin/). Gradle offers plugins for [various IDEs](https://docs.gradle.org/current/userguide/userguide.html). | | | | --- | --- | | | If you accidentally run a web application twice, you see a “Port already in use” error. Spring Tools users can use the `Relaunch` button rather than the `Run` button to ensure that any existing instance is closed. | ### 7.2. Running as a Packaged Application If you use the Spring Boot Maven or Gradle plugins to create an executable jar, you can run your application using `java -jar`, as shown in the following example: ``` $ java -jar target/myapplication-0.0.1-SNAPSHOT.jar ``` It is also possible to run a packaged application with remote debugging support enabled. Doing so lets you attach a debugger to your packaged application, as shown in the following example: ``` $ java -Xdebug -Xrunjdwp:server=y,transport=dt_socket,address=8000,suspend=n \ -jar target/myapplication-0.0.1-SNAPSHOT.jar ``` ### 7.3. Using the Maven Plugin The Spring Boot Maven plugin includes a `run` goal that can be used to quickly compile and run your application. Applications run in an exploded form, as they do in your IDE. The following example shows a typical Maven command to run a Spring Boot application: ``` $ mvn spring-boot:run ``` You might also want to use the `MAVEN_OPTS` operating system environment variable, as shown in the following example: ``` $ export MAVEN_OPTS=-Xmx1024m ``` ### 7.4. Using the Gradle Plugin The Spring Boot Gradle plugin also includes a `bootRun` task that can be used to run your application in an exploded form. The `bootRun` task is added whenever you apply the `org.springframework.boot` and `java` plugins and is shown in the following example: ``` $ gradle bootRun ``` You might also want to use the `JAVA_OPTS` operating system environment variable, as shown in the following example: ``` $ export JAVA_OPTS=-Xmx1024m ``` ### 7.5. Hot Swapping Since Spring Boot applications are plain Java applications, JVM hot-swapping should work out of the box. JVM hot swapping is somewhat limited with the bytecode that it can replace. For a more complete solution, [JRebel](https://www.jrebel.com/products/jrebel) can be used. The `spring-boot-devtools` module also includes support for quick application restarts. See the [Hot swapping “How-to”](howto#howto.hotswapping) for details. 8. Developer Tools ------------------- Spring Boot includes an additional set of tools that can make the application development experience a little more pleasant. The `spring-boot-devtools` module can be included in any project to provide additional development-time features. To include devtools support, add the module dependency to your build, as shown in the following listings for Maven and Gradle: Maven ``` <dependencies> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-devtools</artifactId> <optional>true</optional> </dependency> </dependencies> ``` Gradle ``` dependencies { developmentOnly("org.springframework.boot:spring-boot-devtools") } ``` | | | | --- | --- | | | Devtools might cause classloading issues, in particular in multi-module projects. [Diagnosing Classloading Issues](#using.devtools.diagnosing-classloading-issues) explains how to diagnose and solve them. | | | | | --- | --- | | | Developer tools are automatically disabled when running a fully packaged application. 
If your application is launched from `java -jar` or if it is started from a special classloader, then it is considered a “production application”. You can control this behavior by using the `spring.devtools.restart.enabled` system property. To enable devtools, irrespective of the classloader used to launch your application, set the `-Dspring.devtools.restart.enabled=true` system property. This must not be done in a production environment where running devtools is a security risk. To disable devtools, exclude the dependency or set the `-Dspring.devtools.restart.enabled=false` system property. | | | | | --- | --- | | | Flagging the dependency as optional in Maven or using the `developmentOnly` configuration in Gradle (as shown above) prevents devtools from being transitively applied to other modules that use your project. | | | | | --- | --- | | | Repackaged archives do not contain devtools by default. If you want to use a [certain remote devtools feature](#using.devtools.remote-applications), you need to include it. When using the Maven plugin, set the `excludeDevtools` property to `false`. When using the Gradle plugin, [configure the task’s classpath to include the `developmentOnly` configuration](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/#packaging-executable-configuring-including-development-only-dependencies). | ### 8.1. Diagnosing Classloading Issues As described in the [Restart vs Reload](#using.devtools.restart.restart-vs-reload) section, restart functionality is implemented by using two classloaders. For most applications, this approach works well. However, it can sometimes cause classloading issues, in particular in multi-module projects. To diagnose whether the classloading issues are indeed caused by devtools and its two classloaders, [try disabling restart](#using.devtools.restart.disable). If this solves your problems, [customize the restart classloader](#using.devtools.restart.customizing-the-classload) to include your entire project. ### 8.2. Property Defaults Several of the libraries supported by Spring Boot use caches to improve performance. For example, [template engines](web#web.servlet.spring-mvc.template-engines) cache compiled templates to avoid repeatedly parsing template files. Also, Spring MVC can add HTTP caching headers to responses when serving static resources. While caching is very beneficial in production, it can be counter-productive during development, preventing you from seeing the changes you just made in your application. For this reason, spring-boot-devtools disables the caching options by default. Cache options are usually configured by settings in your `application.properties` file. For example, Thymeleaf offers the `spring.thymeleaf.cache` property. Rather than needing to set these properties manually, the `spring-boot-devtools` module automatically applies sensible development-time configuration. 
The following table lists all the properties that are applied: | Name | Default Value | | --- | --- | | `server.error.include-binding-errors` | `always` | | `server.error.include-message` | `always` | | `server.error.include-stacktrace` | `always` | | `server.servlet.jsp.init-parameters.development` | `true` | | `server.servlet.session.persistent` | `true` | | `spring.freemarker.cache` | `false` | | `spring.graphql.graphiql.enabled` | `true` | | `spring.groovy.template.cache` | `false` | | `spring.h2.console.enabled` | `true` | | `spring.mustache.servlet.cache` | `false` | | `spring.mvc.log-resolved-exception` | `true` | | `spring.reactor.debug` | `true` | | `spring.template.provider.cache` | `false` | | `spring.thymeleaf.cache` | `false` | | `spring.web.resources.cache.period` | `0` | | `spring.web.resources.chain.cache` | `false` | | | | | --- | --- | | | If you do not want property defaults to be applied you can set `spring.devtools.add-properties` to `false` in your `application.properties`. | Because you need more information about web requests while developing Spring MVC and Spring WebFlux applications, the developer tools suggest that you enable `DEBUG` logging for the `web` logging group. This gives you information about the incoming request, which handler is processing it, the response outcome, and other details. If you wish to log all request details (including potentially sensitive information), you can turn on the `spring.mvc.log-request-details` or `spring.codec.log-request-details` configuration properties. ### 8.3. Automatic Restart Applications that use `spring-boot-devtools` automatically restart whenever files on the classpath change. This can be a useful feature when working in an IDE, as it gives a very fast feedback loop for code changes. By default, any entry on the classpath that points to a directory is monitored for changes. Note that certain resources, such as static assets and view templates, [do not need to restart the application](#using.devtools.restart.excluding-resources). Triggering a restart As DevTools monitors classpath resources, the only way to trigger a restart is to update the classpath. Whether you’re using an IDE or one of the build plugins, the modified files have to be recompiled to trigger a restart. The way in which you cause the classpath to be updated depends on the tool that you are using: * In Eclipse, saving a modified file causes the classpath to be updated and triggers a restart. * In IntelliJ IDEA, building the project (`Build → Build Project`) has the same effect. * If using a build plugin, running `mvn compile` for Maven or `gradle build` for Gradle will trigger a restart. | | | | --- | --- | | | If you are restarting with Maven or Gradle using the build plugin you must leave the `forking` set to `enabled`. If you disable forking, the isolated application classloader used by devtools will not be created and restarts will not operate properly. | | | | | --- | --- | | | Automatic restart works very well when used with LiveReload. [See the LiveReload section](#using.devtools.livereload) for details. If you use JRebel, automatic restarts are disabled in favor of dynamic class reloading. Other devtools features (such as LiveReload and property overrides) can still be used. | | | | | --- | --- | | | DevTools relies on the application context’s shutdown hook to close it during a restart. It does not work correctly if you have disabled the shutdown hook (`SpringApplication.setRegisterShutdownHook(false)`). 
| | | | | --- | --- | | | DevTools needs to customize the `ResourceLoader` used by the `ApplicationContext`. If your application provides one already, it is going to be wrapped. Direct override of the `getResource` method on the `ApplicationContext` is not supported. | | | | | --- | --- | | | Automatic restart is not supported when using AspectJ weaving. | Restart vs Reload The restart technology provided by Spring Boot works by using two classloaders. Classes that do not change (for example, those from third-party jars) are loaded into a *base* classloader. Classes that you are actively developing are loaded into a *restart* classloader. When the application is restarted, the *restart* classloader is thrown away and a new one is created. This approach means that application restarts are typically much faster than “cold starts”, since the *base* classloader is already available and populated. If you find that restarts are not quick enough for your applications or you encounter classloading issues, you could consider reloading technologies such as [JRebel](https://jrebel.com/software/jrebel/) from ZeroTurnaround. These work by rewriting classes as they are loaded to make them more amenable to reloading. #### 8.3.1. Logging changes in condition evaluation By default, each time your application restarts, a report showing the condition evaluation delta is logged. The report shows the changes to your application’s auto-configuration as you make changes such as adding or removing beans and setting configuration properties. To disable the logging of the report, set the following property: Properties ``` spring.devtools.restart.log-condition-evaluation-delta=false ``` Yaml ``` spring: devtools: restart: log-condition-evaluation-delta: false ``` #### 8.3.2. Excluding Resources Certain resources do not necessarily need to trigger a restart when they are changed. For example, Thymeleaf templates can be edited in-place. By default, changing resources in `/META-INF/maven`, `/META-INF/resources`, `/resources`, `/static`, `/public`, or `/templates` does not trigger a restart but does trigger a [live reload](#using.devtools.livereload). If you want to customize these exclusions, you can use the `spring.devtools.restart.exclude` property. For example, to exclude only `/static` and `/public` you would set the following property: Properties ``` spring.devtools.restart.exclude=static/**,public/** ``` Yaml ``` spring: devtools: restart: exclude: "static/**,public/**" ``` | | | | --- | --- | | | If you want to keep those defaults and *add* additional exclusions, use the `spring.devtools.restart.additional-exclude` property instead. | #### 8.3.3. Watching Additional Paths You may want your application to be restarted or reloaded when you make changes to files that are not on the classpath. To do so, use the `spring.devtools.restart.additional-paths` property to configure additional paths to watch for changes. You can use the `spring.devtools.restart.exclude` property [described earlier](#using.devtools.restart.excluding-resources) to control whether changes beneath the additional paths trigger a full restart or a [live reload](#using.devtools.livereload). #### 8.3.4. Disabling Restart If you do not want to use the restart feature, you can disable it by using the `spring.devtools.restart.enabled` property. In most cases, you can set this property in your `application.properties` (doing so still initializes the restart classloader, but it does not watch for file changes). 
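For example, a minimal sketch of that setting, using the same Properties/Yaml forms as the other devtools examples in this section, looks as follows:

Properties
```
spring.devtools.restart.enabled=false
```

Yaml
```
spring:
  devtools:
    restart:
      enabled: false
```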
If you need to *completely* disable restart support (for example, because it does not work with a specific library), you need to set the `spring.devtools.restart.enabled` `System` property to `false` before calling `SpringApplication.run(…​)`, as shown in the following example: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; @SpringBootApplication public class MyApplication { public static void main(String[] args) { System.setProperty("spring.devtools.restart.enabled", "false"); SpringApplication.run(MyApplication.class, args); } } ``` Kotlin ``` import org.springframework.boot.SpringApplication import org.springframework.boot.autoconfigure.SpringBootApplication @SpringBootApplication object MyApplication { @JvmStatic fun main(args: Array<String>) { System.setProperty("spring.devtools.restart.enabled", "false") SpringApplication.run(MyApplication::class.java, \*args) } } ``` #### 8.3.5. Using a Trigger File If you work with an IDE that continuously compiles changed files, you might prefer to trigger restarts only at specific times. To do so, you can use a “trigger file”, which is a special file that must be modified when you want to actually trigger a restart check. | | | | --- | --- | | | Any update to the file will trigger a check, but restart only actually occurs if Devtools has detected it has something to do. | To use a trigger file, set the `spring.devtools.restart.trigger-file` property to the name (excluding any path) of your trigger file. The trigger file must appear somewhere on your classpath. For example, if you have a project with the following structure: ``` src +- main +- resources +- .reloadtrigger ``` Then your `trigger-file` property would be: Properties ``` spring.devtools.restart.trigger-file=.reloadtrigger ``` Yaml ``` spring: devtools: restart: trigger-file: ".reloadtrigger" ``` Restarts will now only happen when the `src/main/resources/.reloadtrigger` is updated. | | | | --- | --- | | | You might want to set `spring.devtools.restart.trigger-file` as a [global setting](#using.devtools.globalsettings), so that all your projects behave in the same way. | Some IDEs have features that save you from needing to update your trigger file manually. [Spring Tools for Eclipse](https://spring.io/tools) and [IntelliJ IDEA (Ultimate Edition)](https://www.jetbrains.com/idea/) both have such support. With Spring Tools, you can use the “reload” button from the console view (as long as your `trigger-file` is named `.reloadtrigger`). For IntelliJ IDEA, you can follow the [instructions in their documentation](https://www.jetbrains.com/help/idea/spring-boot.html#application-update-policies). #### 8.3.6. Customizing the Restart Classloader As described earlier in the [Restart vs Reload](#using.devtools.restart.restart-vs-reload) section, restart functionality is implemented by using two classloaders. If this causes issues, you might need to customize what gets loaded by which classloader. By default, any open project in your IDE is loaded with the “restart” classloader, and any regular `.jar` file is loaded with the “base” classloader. The same is true if you use `mvn spring-boot:run` or `gradle bootRun`: the project containing your `@SpringBootApplication` is loaded with the “restart” classloader, and everything else with the “base” classloader. You can instruct Spring Boot to load parts of your project with a different classloader by creating a `META-INF/spring-devtools.properties` file. 
The `spring-devtools.properties` file can contain properties prefixed with `restart.exclude` and `restart.include`. The `include` elements are items that should be pulled up into the “restart” classloader, and the `exclude` elements are items that should be pushed down into the “base” classloader. The value of the property is a regex pattern that is applied to the classpath, as shown in the following example: Properties ``` restart.exclude.companycommonlibs=/mycorp-common-[\\w\\d-\\.]+\\.jar restart.include.projectcommon=/mycorp-myproj-[\\w\\d-\\.]+\\.jar ``` Yaml ``` restart: exclude: companycommonlibs: "/mycorp-common-[\\w\\d-\\.]+\\.jar" include: projectcommon: "/mycorp-myproj-[\\w\\d-\\.]+\\.jar" ``` | | | | --- | --- | | | All property keys must be unique. As long as a property starts with `restart.include.` or `restart.exclude.` it is considered. | | | | | --- | --- | | | All `META-INF/spring-devtools.properties` from the classpath are loaded. You can package files inside your project, or in the libraries that the project consumes. | #### 8.3.7. Known Limitations Restart functionality does not work well with objects that are deserialized by using a standard `ObjectInputStream`. If you need to deserialize data, you may need to use Spring’s `ConfigurableObjectInputStream` in combination with `Thread.currentThread().getContextClassLoader()`. Unfortunately, several third-party libraries deserialize without considering the context classloader. If you find such a problem, you need to request a fix with the original authors. ### 8.4. LiveReload The `spring-boot-devtools` module includes an embedded LiveReload server that can be used to trigger a browser refresh when a resource is changed. LiveReload browser extensions are freely available for Chrome, Firefox and Safari from [livereload.com](http://livereload.com/extensions/). If you do not want to start the LiveReload server when your application runs, you can set the `spring.devtools.livereload.enabled` property to `false`. | | | | --- | --- | | | You can only run one LiveReload server at a time. Before starting your application, ensure that no other LiveReload servers are running. If you start multiple applications from your IDE, only the first has LiveReload support. | | | | | --- | --- | | | To trigger LiveReload when a file changes, [Automatic Restart](#using.devtools.restart) must be enabled. | ### 8.5. Global Settings You can configure global devtools settings by adding any of the following files to the `$HOME/.config/spring-boot` directory: 1. `spring-boot-devtools.properties` 2. `spring-boot-devtools.yaml` 3. `spring-boot-devtools.yml` Any properties added to these files apply to *all* Spring Boot applications on your machine that use devtools. For example, to configure restart to always use a [trigger file](#using.devtools.restart.triggerfile), you would add the following property to your `spring-boot-devtools` file: Properties ``` spring.devtools.restart.trigger-file=.reloadtrigger ``` Yaml ``` spring: devtools: restart: trigger-file: ".reloadtrigger" ``` By default, `$HOME` is the user’s home directory. To customize this location, set the `SPRING_DEVTOOLS_HOME` environment variable or the `spring.devtools.home` system property. | | | | --- | --- | | | If devtools configuration files are not found in `$HOME/.config/spring-boot`, the root of the `$HOME` directory is searched for the presence of a `.spring-boot-devtools.properties` file. 
This allows you to share the devtools global configuration with applications that are on an older version of Spring Boot that does not support the `$HOME/.config/spring-boot` location. | | | | | --- | --- | | | Profiles are not supported in devtools properties/yaml files. Any profiles activated in `.spring-boot-devtools.properties` will not affect the loading of [profile-specific configuration files](features#features.external-config.files.profile-specific). Profile specific filenames (of the form `spring-boot-devtools-<profile>.properties`) and `spring.config.activate.on-profile` documents in both YAML and Properties files are not supported. | #### 8.5.1. Configuring File System Watcher [FileSystemWatcher](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-devtools/src/main/java/org/springframework/boot/devtools/filewatch/FileSystemWatcher.java) works by polling the class changes with a certain time interval, and then waiting for a predefined quiet period to make sure there are no more changes. Since Spring Boot relies entirely on the IDE to compile and copy files into the location from where Spring Boot can read them, you might find that there are times when certain changes are not reflected when devtools restarts the application. If you observe such problems constantly, try increasing the `spring.devtools.restart.poll-interval` and `spring.devtools.restart.quiet-period` parameters to the values that fit your development environment: Properties ``` spring.devtools.restart.poll-interval=2s spring.devtools.restart.quiet-period=1s ``` Yaml ``` spring: devtools: restart: poll-interval: "2s" quiet-period: "1s" ``` The monitored classpath directories are now polled every 2 seconds for changes, and a 1 second quiet period is maintained to make sure there are no additional class changes. ### 8.6. Remote Applications The Spring Boot developer tools are not limited to local development. You can also use several features when running applications remotely. Remote support is opt-in as enabling it can be a security risk. It should only be enabled when running on a trusted network or when secured with SSL. If neither of these options is available to you, you should not use DevTools' remote support. You should never enable support on a production deployment. To enable it, you need to make sure that `devtools` is included in the repackaged archive, as shown in the following listing: ``` <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <configuration> <excludeDevtools>false</excludeDevtools> </configuration> </plugin> </plugins> </build> ``` Then you need to set the `spring.devtools.remote.secret` property. Like any important password or secret, the value should be unique and strong such that it cannot be guessed or brute-forced. Remote devtools support is provided in two parts: a server-side endpoint that accepts connections and a client application that you run in your IDE. The server component is automatically enabled when the `spring.devtools.remote.secret` property is set. The client component must be launched manually. | | | | --- | --- | | | Remote devtools is not supported for Spring WebFlux applications. | #### 8.6.1. Running the Remote Client Application The remote client application is designed to be run from within your IDE. You need to run `org.springframework.boot.devtools.RemoteSpringApplication` with the same classpath as the remote project that you connect to. 
The application’s single required argument is the remote URL to which it connects. For example, if you are using Eclipse or Spring Tools and you have a project named `my-app` that you have deployed to Cloud Foundry, you would do the following: * Select `Run Configurations…​` from the `Run` menu. * Create a new `Java Application` “launch configuration”. * Browse for the `my-app` project. * Use `org.springframework.boot.devtools.RemoteSpringApplication` as the main class. * Add `https://myapp.cfapps.io` to the `Program arguments` (or whatever your remote URL is). A running remote client might resemble the following listing: ``` . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ ___ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | | _ \___ _ __ ___| |_ ___ \ \ \ \ \\/ ___)| |_)| | | | | || (_| []::::::[] / -_) ' \/ _ \ _/ -_) ) ) ) ) ' |____| .__|_| |_|_| |_\__, | |_|_\___|_|_|_\___/\__\___|/ / / / =========|_|==============|___/===================================/_/_/_/ :: Spring Boot Remote :: 2.7.0 2015-06-10 18:25:06.632 INFO 14938 --- [ main] o.s.b.devtools.RemoteSpringApplication : Starting RemoteSpringApplication on pwmbp with PID 14938 (/Users/pwebb/projects/spring-boot/code/spring-boot-project/spring-boot-devtools/target/classes started by pwebb in /Users/pwebb/projects/spring-boot/code) 2015-06-10 18:25:06.671 INFO 14938 --- [ main] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@2a17b7b6: startup date [Wed Jun 10 18:25:06 PDT 2015]; root of context hierarchy 2015-06-10 18:25:07.043 WARN 14938 --- [ main] o.s.b.d.r.c.RemoteClientConfiguration : The connection to http://localhost:8080 is insecure. You should use a URL starting with 'https://'. 2015-06-10 18:25:07.074 INFO 14938 --- [ main] o.s.b.d.a.OptionalLiveReloadServer : LiveReload server is running on port 35729 2015-06-10 18:25:07.130 INFO 14938 --- [ main] o.s.b.devtools.RemoteSpringApplication : Started RemoteSpringApplication in 0.74 seconds (JVM running for 1.105) ``` | | | | --- | --- | | | Because the remote client is using the same classpath as the real application it can directly read application properties. This is how the `spring.devtools.remote.secret` property is read and passed to the server for authentication. | | | | | --- | --- | | | It is always advisable to use `https://` as the connection protocol, so that traffic is encrypted and passwords cannot be intercepted. | | | | | --- | --- | | | If you need to use a proxy to access the remote application, configure the `spring.devtools.remote.proxy.host` and `spring.devtools.remote.proxy.port` properties. | #### 8.6.2. Remote Update The remote client monitors your application classpath for changes in the same way as the [local restart](#using.devtools.restart). Any updated resource is pushed to the remote application and (*if required*) triggers a restart. This can be helpful if you iterate on a feature that uses a cloud service that you do not have locally. Generally, remote updates and restarts are much quicker than a full rebuild and deploy cycle. On a slower development environment, it may happen that the quiet period is not enough, and the changes in the classes may be split into batches. The server is restarted after the first batch of class changes is uploaded. The next batch can’t be sent to the application, since the server is restarting. This is typically manifested by a warning in the `RemoteSpringApplication` logs about failing to upload some of the classes, and a consequent retry. 
But it may also lead to application code inconsistency and failure to restart after the first batch of changes is uploaded. If you observe such problems constantly, try increasing the `spring.devtools.restart.poll-interval` and `spring.devtools.restart.quiet-period` parameters to the values that fit your development environment. See the [Configuring File System Watcher](#using.devtools.globalsettings.configuring-file-system-watcher) section for configuring these properties. | | | | --- | --- | | | Files are only monitored when the remote client is running. If you change a file before starting the remote client, it is not pushed to the remote server. | 9. Packaging Your Application for Production --------------------------------------------- Executable jars can be used for production deployment. As they are self-contained, they are also ideally suited for cloud-based deployment. For additional “production ready” features, such as health, auditing, and metric REST or JMX end-points, consider adding `spring-boot-actuator`. See *[actuator.html](actuator#actuator)* for details. 10. What to Read Next ---------------------- You should now understand how you can use Spring Boot and some best practices that you should follow. You can now go on to learn about specific *[Spring Boot features](features#features)* in depth, or you could skip ahead and read about the “[production ready](actuator#actuator)” aspects of Spring Boot.
spring_boot Core Features Core Features ============= This section dives into the details of Spring Boot. Here you can learn about the key features that you may want to use and customize. If you have not already done so, you might want to read the "[getting-started.html](getting-started#getting-started)" and "[using.html](using#using)" sections, so that you have a good grounding of the basics. 1. SpringApplication --------------------- The `SpringApplication` class provides a convenient way to bootstrap a Spring application that is started from a `main()` method. In many situations, you can delegate to the static `SpringApplication.run` method, as shown in the following example: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; @SpringBootApplication public class MyApplication { public static void main(String[] args) { SpringApplication.run(MyApplication.class, args); } } ``` Kotlin ``` import org.springframework.boot.SpringApplication import org.springframework.boot.autoconfigure.SpringBootApplication @SpringBootApplication class MyApplication fun main(args: Array<String>) { SpringApplication.run(MyApplication::class.java, \*args) } ``` When your application starts, you should see something similar to the following output: ``` . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: v2.7.0 2021-02-03 10:33:25.224 INFO 17321 --- [ main] o.s.b.d.s.s.SpringApplicationExample : Starting SpringApplicationExample using Java 1.8.0_232 on mycomputer with PID 17321 (/apps/myjar.jar started by pwebb) 2021-02-03 10:33:25.226 INFO 17900 --- [ main] o.s.b.d.s.s.SpringApplicationExample : No active profile set, falling back to default profiles: default 2021-02-03 10:33:26.046 INFO 17321 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http) 2021-02-03 10:33:26.054 INFO 17900 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2021-02-03 10:33:26.055 INFO 17900 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.41] 2021-02-03 10:33:26.097 INFO 17900 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2021-02-03 10:33:26.097 INFO 17900 --- [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 821 ms 2021-02-03 10:33:26.144 INFO 17900 --- [ main] s.tomcat.SampleTomcatApplication : ServletContext initialized 2021-02-03 10:33:26.376 INFO 17900 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path '' 2021-02-03 10:33:26.384 INFO 17900 --- [ main] o.s.b.d.s.s.SpringApplicationExample : Started SampleTomcatApplication in 1.514 seconds (JVM running for 1.823) ``` By default, `INFO` logging messages are shown, including some relevant startup details, such as the user that launched the application. If you need a log level other than `INFO`, you can set it, as described in [Log Levels](#features.logging.log-levels). The application version is determined using the implementation version from the main application class’s package. Startup information logging can be turned off by setting `spring.main.log-startup-info` to `false`. 
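For example, a minimal sketch of the property described above:

Properties

```
spring.main.log-startup-info=false
```

Yaml

```
spring:
  main:
    log-startup-info: false
```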
This will also turn off logging of the application’s active profiles. | | | | --- | --- | | | To add additional logging during startup, you can override `logStartupInfo(boolean)` in a subclass of `SpringApplication`. | ### 1.1. Startup Failure If your application fails to start, registered `FailureAnalyzers` get a chance to provide a dedicated error message and a concrete action to fix the problem. For instance, if you start a web application on port `8080` and that port is already in use, you should see something similar to the following message: ``` *************************** APPLICATION FAILED TO START *************************** Description: Embedded servlet container failed to start. Port 8080 was already in use. Action: Identify and stop the process that is listening on port 8080 or configure this application to listen on another port. ``` | | | | --- | --- | | | Spring Boot provides numerous `FailureAnalyzer` implementations, and you can [add your own](howto#howto.application.failure-analyzer). | If no failure analyzers are able to handle the exception, you can still display the full conditions report to better understand what went wrong. To do so, you need to [enable the `debug` property](#features.external-config) or [enable `DEBUG` logging](#features.logging.log-levels) for `org.springframework.boot.autoconfigure.logging.ConditionEvaluationReportLoggingListener`. For instance, if you are running your application by using `java -jar`, you can enable the `debug` property as follows: ``` $ java -jar myproject-0.0.1-SNAPSHOT.jar --debug ``` ### 1.2. Lazy Initialization `SpringApplication` allows an application to be initialized lazily. When lazy initialization is enabled, beans are created as they are needed rather than during application startup. As a result, enabling lazy initialization can reduce the time that it takes your application to start. In a web application, enabling lazy initialization will result in many web-related beans not being initialized until an HTTP request is received. A downside of lazy initialization is that it can delay the discovery of a problem with the application. If a misconfigured bean is initialized lazily, a failure will no longer occur during startup and the problem will only become apparent when the bean is initialized. Care must also be taken to ensure that the JVM has sufficient memory to accommodate all of the application’s beans and not just those that are initialized during startup. For these reasons, lazy initialization is not enabled by default and it is recommended that fine-tuning of the JVM’s heap size is done before enabling lazy initialization. Lazy initialization can be enabled programmatically using the `lazyInitialization` method on `SpringApplicationBuilder` or the `setLazyInitialization` method on `SpringApplication`. Alternatively, it can be enabled using the `spring.main.lazy-initialization` property as shown in the following example: Properties ``` spring.main.lazy-initialization=true ``` Yaml ``` spring: main: lazy-initialization: true ``` | | | | --- | --- | | | If you want to disable lazy initialization for certain beans while using lazy initialization for the rest of the application, you can explicitly set their lazy attribute to false using the `@Lazy(false)` annotation. | ### 1.3. Customizing the Banner The banner that is printed on start up can be changed by adding a `banner.txt` file to your classpath or by setting the `spring.banner.location` property to the location of such a file. 
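For instance, a minimal sketch that points the banner at a different classpath resource (the `custom-banner.txt` name is illustrative) might look like this:

Properties

```
spring.banner.location=classpath:custom-banner.txt
```

Yaml

```
spring:
  banner:
    location: "classpath:custom-banner.txt"
```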
If the file has an encoding other than UTF-8, you can set `spring.banner.charset`. In addition to a text file, you can also add a `banner.gif`, `banner.jpg`, or `banner.png` image file to your classpath or set the `spring.banner.image.location` property. Images are converted into an ASCII art representation and printed above any text banner. Inside your `banner.txt` file, you can use any key available in the `Environment` as well as any of the following placeholders: Table 1. Banner variables | Variable | Description | | --- | --- | | `${application.version}` | The version number of your application, as declared in `MANIFEST.MF`. For example, `Implementation-Version: 1.0` is printed as `1.0`. | | `${application.formatted-version}` | The version number of your application, as declared in `MANIFEST.MF` and formatted for display (surrounded with brackets and prefixed with `v`). For example `(v1.0)`. | | `${spring-boot.version}` | The Spring Boot version that you are using. For example `2.7.0`. | | `${spring-boot.formatted-version}` | The Spring Boot version that you are using, formatted for display (surrounded with brackets and prefixed with `v`). For example `(v2.7.0)`. | | `${Ansi.NAME}` (or `${AnsiColor.NAME}`, `${AnsiBackground.NAME}`, `${AnsiStyle.NAME}`) | Where `NAME` is the name of an ANSI escape code. See [`AnsiPropertySource`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/ansi/AnsiPropertySource.java) for details. | | `${application.title}` | The title of your application, as declared in `MANIFEST.MF`. For example `Implementation-Title: MyApp` is printed as `MyApp`. | | | | | --- | --- | | | The `SpringApplication.setBanner(…​)` method can be used if you want to generate a banner programmatically. Use the `org.springframework.boot.Banner` interface and implement your own `printBanner()` method. | You can also use the `spring.main.banner-mode` property to determine if the banner has to be printed on `System.out` (`console`), sent to the configured logger (`log`), or not produced at all (`off`). The printed banner is registered as a singleton bean under the following name: `springBootBanner`. | | | | --- | --- | | | The `${application.version}` and `${application.formatted-version}` properties are only available if you are using Spring Boot launchers. The values will not be resolved if you are running an unpacked jar and starting it with `java -cp <classpath> <mainclass>`. This is why we recommend that you always launch unpacked jars using `java org.springframework.boot.loader.JarLauncher`. This will initialize the `application.*` banner variables before building the classpath and launching your app. | ### 1.4. Customizing SpringApplication If the `SpringApplication` defaults are not to your taste, you can instead create a local instance and customize it. 
For example, to turn off the banner, you could write: Java ``` import org.springframework.boot.Banner; import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; @SpringBootApplication public class MyApplication { public static void main(String[] args) { SpringApplication application = new SpringApplication(MyApplication.class); application.setBannerMode(Banner.Mode.OFF); application.run(args); } } ``` Kotlin ``` import org.springframework.boot.Banner import org.springframework.boot.SpringApplication import org.springframework.boot.autoconfigure.SpringBootApplication @SpringBootApplication class MyApplication fun main(args: Array<String>) { val application = SpringApplication(MyApplication::class.java) application.setBannerMode(Banner.Mode.OFF) application.run(*args) } ``` | | | | --- | --- | | | The constructor arguments passed to `SpringApplication` are configuration sources for Spring beans. In most cases, these are references to `@Configuration` classes, but they could also be direct references to `@Component` classes. | It is also possible to configure the `SpringApplication` by using an `application.properties` file. See *[Externalized Configuration](#features.external-config)* for details. For a complete list of the configuration options, see the [`SpringApplication` Javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/SpringApplication.html). ### 1.5. Fluent Builder API If you need to build an `ApplicationContext` hierarchy (multiple contexts with a parent/child relationship) or if you prefer using a “fluent” builder API, you can use the `SpringApplicationBuilder`. The `SpringApplicationBuilder` lets you chain together multiple method calls and includes `parent` and `child` methods that let you create a hierarchy, as shown in the following example: Java ``` new SpringApplicationBuilder() .sources(Parent.class) .child(Application.class) .bannerMode(Banner.Mode.OFF) .run(args); ``` Kotlin ``` SpringApplicationBuilder() .sources(Parent::class.java) .child(Application::class.java) .bannerMode(Banner.Mode.OFF) .run(*args) ``` | | | | --- | --- | | | There are some restrictions when creating an `ApplicationContext` hierarchy. For example, Web components **must** be contained within the child context, and the same `Environment` is used for both parent and child contexts. See the [`SpringApplicationBuilder` Javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/builder/SpringApplicationBuilder.html) for full details. | ### 1.6. Application Availability When deployed on platforms, applications can provide information about their availability to the platform using infrastructure such as [Kubernetes Probes](https://kubernetes.io/docs/tasks/configure-pod-container/configure-liveness-readiness-startup-probes/). Spring Boot includes out-of-the-box support for the commonly used “liveness” and “readiness” availability states. If you are using Spring Boot’s “actuator” support then these states are exposed as health endpoint groups. In addition, you can also obtain availability states by injecting the `ApplicationAvailability` interface into your own beans. #### 1.6.1. Liveness State The “Liveness” state of an application tells whether its internal state allows it to work correctly, or recover by itself if it is currently failing. A broken “Liveness” state means that the application is in a state that it cannot recover from, and the infrastructure should restart the application.
| | | | --- | --- | | | In general, the "Liveness" state should not be based on external checks, such as [Health checks](actuator#actuator.endpoints.health). If it did, a failing external system (a database, a Web API, an external cache) would trigger massive restarts and cascading failures across the platform. | The internal state of Spring Boot applications is mostly represented by the Spring `ApplicationContext`. If the application context has started successfully, Spring Boot assumes that the application is in a valid state. An application is considered live as soon as the context has been refreshed, see [Spring Boot application lifecycle and related Application Events](#features.spring-application.application-events-and-listeners). #### 1.6.2. Readiness State The “Readiness” state of an application tells whether the application is ready to handle traffic. A failing “Readiness” state tells the platform that it should not route traffic to the application for now. This typically happens during startup, while `CommandLineRunner` and `ApplicationRunner` components are being processed, or at any time if the application decides that it is too busy for additional traffic. An application is considered ready as soon as application and command-line runners have been called, see [Spring Boot application lifecycle and related Application Events](#features.spring-application.application-events-and-listeners). | | | | --- | --- | | | Tasks expected to run during startup should be executed by `CommandLineRunner` and `ApplicationRunner` components instead of using Spring component lifecycle callbacks such as `@PostConstruct`. | #### 1.6.3. Managing the Application Availability State Application components can retrieve the current availability state at any time, by injecting the `ApplicationAvailability` interface and calling methods on it. More often, applications will want to listen to state updates or update the state of the application. For example, we can export the "Readiness" state of the application to a file so that a Kubernetes "exec Probe" can look at this file: Java ``` import org.springframework.boot.availability.AvailabilityChangeEvent; import org.springframework.boot.availability.ReadinessState; import org.springframework.context.event.EventListener; import org.springframework.stereotype.Component; @Component public class MyReadinessStateExporter { @EventListener public void onStateChange(AvailabilityChangeEvent<ReadinessState> event) { switch (event.getState()) { case ACCEPTING_TRAFFIC: // create file /tmp/healthy break; case REFUSING_TRAFFIC: // remove file /tmp/healthy break; } } } ``` Kotlin ``` import org.springframework.boot.availability.AvailabilityChangeEvent import org.springframework.boot.availability.ReadinessState import org.springframework.context.event.EventListener import org.springframework.stereotype.Component @Component class MyReadinessStateExporter { @EventListener fun onStateChange(event: AvailabilityChangeEvent<ReadinessState?>) { when (event.state) { ReadinessState.ACCEPTING_TRAFFIC -> { // create file /tmp/healthy } ReadinessState.REFUSING_TRAFFIC -> { // remove file /tmp/healthy } else -> { // ...
} } } } ``` We can also update the state of the application, when the application breaks and cannot recover: Java ``` import org.springframework.boot.availability.AvailabilityChangeEvent; import org.springframework.boot.availability.LivenessState; import org.springframework.context.ApplicationEventPublisher; import org.springframework.stereotype.Component; @Component public class MyLocalCacheVerifier { private final ApplicationEventPublisher eventPublisher; public MyLocalCacheVerifier(ApplicationEventPublisher eventPublisher) { this.eventPublisher = eventPublisher; } public void checkLocalCache() { try { // ... } catch (CacheCompletelyBrokenException ex) { AvailabilityChangeEvent.publish(this.eventPublisher, ex, LivenessState.BROKEN); } } } ``` Kotlin ``` import org.springframework.boot.availability.AvailabilityChangeEvent import org.springframework.boot.availability.LivenessState import org.springframework.context.ApplicationEventPublisher import org.springframework.stereotype.Component @Component class MyLocalCacheVerifier(private val eventPublisher: ApplicationEventPublisher) { fun checkLocalCache() { try { // ... } catch (ex: CacheCompletelyBrokenException) { AvailabilityChangeEvent.publish(eventPublisher, ex, LivenessState.BROKEN) } } } ``` Spring Boot provides [Kubernetes HTTP probes for "Liveness" and "Readiness" with Actuator Health Endpoints](actuator#actuator.endpoints.kubernetes-probes). You can get more guidance about [deploying Spring Boot applications on Kubernetes in the dedicated section](deployment#deployment.cloud.kubernetes). ### 1.7. Application Events and Listeners In addition to the usual Spring Framework events, such as [`ContextRefreshedEvent`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/context/event/ContextRefreshedEvent.html), a `SpringApplication` sends some additional application events. | | | | --- | --- | | | Some events are actually triggered before the `ApplicationContext` is created, so you cannot register a listener on those as a `@Bean`. You can register them with the `SpringApplication.addListeners(…​)` method or the `SpringApplicationBuilder.listeners(…​)` method. If you want those listeners to be registered automatically, regardless of the way the application is created, you can add a `META-INF/spring.factories` file to your project and reference your listener(s) by using the `org.springframework.context.ApplicationListener` key, as shown in the following example: ``` org.springframework.context.ApplicationListener=com.example.project.MyListener ``` | Application events are sent in the following order, as your application runs: 1. An `ApplicationStartingEvent` is sent at the start of a run but before any processing, except for the registration of listeners and initializers. 2. An `ApplicationEnvironmentPreparedEvent` is sent when the `Environment` to be used in the context is known but before the context is created. 3. An `ApplicationContextInitializedEvent` is sent when the `ApplicationContext` is prepared and ApplicationContextInitializers have been called but before any bean definitions are loaded. 4. An `ApplicationPreparedEvent` is sent just before the refresh is started but after bean definitions have been loaded. 5. An `ApplicationStartedEvent` is sent after the context has been refreshed but before any application and command-line runners have been called. 6. An `AvailabilityChangeEvent` is sent right after with `LivenessState.CORRECT` to indicate that the application is considered as live. 7. 
An `ApplicationReadyEvent` is sent after any [application and command-line runners](#features.spring-application.command-line-runner) have been called. 8. An `AvailabilityChangeEvent` is sent right after with `ReadinessState.ACCEPTING_TRAFFIC` to indicate that the application is ready to service requests. 9. An `ApplicationFailedEvent` is sent if there is an exception on startup. The above list only includes `SpringApplicationEvent`s that are tied to a `SpringApplication`. In addition to these, the following events are also published after `ApplicationPreparedEvent` and before `ApplicationStartedEvent`: * A `WebServerInitializedEvent` is sent after the `WebServer` is ready. `ServletWebServerInitializedEvent` and `ReactiveWebServerInitializedEvent` are the servlet and reactive variants respectively. * A `ContextRefreshedEvent` is sent when an `ApplicationContext` is refreshed. | | | | --- | --- | | | You often need not use application events, but it can be handy to know that they exist. Internally, Spring Boot uses events to handle a variety of tasks. | | | | | --- | --- | | | Event listeners should not run potentially lengthy tasks as they execute in the same thread by default. Consider using [application and command-line runners](#features.spring-application.command-line-runner) instead. | Application events are sent by using Spring Framework’s event publishing mechanism. Part of this mechanism ensures that an event published to the listeners in a child context is also published to the listeners in any ancestor contexts. As a result of this, if your application uses a hierarchy of `SpringApplication` instances, a listener may receive multiple instances of the same type of application event. To allow your listener to distinguish between an event for its context and an event for a descendant context, it should request that its application context is injected and then compare the injected context with the context of the event. The context can be injected by implementing `ApplicationContextAware` or, if the listener is a bean, by using `@Autowired`. ### 1.8. Web Environment A `SpringApplication` attempts to create the right type of `ApplicationContext` on your behalf. The algorithm used to determine a `WebApplicationType` is the following: * If Spring MVC is present, an `AnnotationConfigServletWebServerApplicationContext` is used * If Spring MVC is not present and Spring WebFlux is present, an `AnnotationConfigReactiveWebServerApplicationContext` is used * Otherwise, `AnnotationConfigApplicationContext` is used This means that if you are using Spring MVC and the new `WebClient` from Spring WebFlux in the same application, Spring MVC will be used by default. You can override that easily by calling `setWebApplicationType(WebApplicationType)`. It is also possible to take complete control of the `ApplicationContext` type that is used by calling `setApplicationContextClass(…​)`. | | | | --- | --- | | | It is often desirable to call `setWebApplicationType(WebApplicationType.NONE)` when using `SpringApplication` within a JUnit test. | ### 1.9. Accessing Application Arguments If you need to access the application arguments that were passed to `SpringApplication.run(…​)`, you can inject a `org.springframework.boot.ApplicationArguments` bean. 
The `ApplicationArguments` interface provides access to both the raw `String[]` arguments as well as parsed `option` and `non-option` arguments, as shown in the following example: Java ``` import java.util.List; import org.springframework.boot.ApplicationArguments; import org.springframework.stereotype.Component; @Component public class MyBean { public MyBean(ApplicationArguments args) { boolean debug = args.containsOption("debug"); List<String> files = args.getNonOptionArgs(); if (debug) { System.out.println(files); } // if run with "--debug logfile.txt" prints ["logfile.txt"] } } ``` Kotlin ``` import org.springframework.boot.ApplicationArguments import org.springframework.stereotype.Component @Component class MyBean(args: ApplicationArguments) { init { val debug = args.containsOption("debug") val files = args.nonOptionArgs if (debug) { println(files) } // if run with "--debug logfile.txt" prints ["logfile.txt"] } } ``` | | | | --- | --- | | | Spring Boot also registers a `CommandLinePropertySource` with the Spring `Environment`. This lets you also inject single application arguments by using the `@Value` annotation. | ### 1.10. Using the ApplicationRunner or CommandLineRunner If you need to run some specific code once the `SpringApplication` has started, you can implement the `ApplicationRunner` or `CommandLineRunner` interfaces. Both interfaces work in the same way and offer a single `run` method, which is called just before `SpringApplication.run(…​)` completes. | | | | --- | --- | | | This contract is well suited for tasks that should run after application startup but before it starts accepting traffic. | The `CommandLineRunner` interfaces provides access to application arguments as a string array, whereas the `ApplicationRunner` uses the `ApplicationArguments` interface discussed earlier. The following example shows a `CommandLineRunner` with a `run` method: Java ``` import org.springframework.boot.CommandLineRunner; import org.springframework.stereotype.Component; @Component public class MyCommandLineRunner implements CommandLineRunner { @Override public void run(String... args) { // Do something... } } ``` Kotlin ``` import org.springframework.boot.CommandLineRunner import org.springframework.stereotype.Component @Component class MyCommandLineRunner : CommandLineRunner { override fun run(vararg args: String) { // Do something... } } ``` If several `CommandLineRunner` or `ApplicationRunner` beans are defined that must be called in a specific order, you can additionally implement the `org.springframework.core.Ordered` interface or use the `org.springframework.core.annotation.Order` annotation. ### 1.11. Application Exit Each `SpringApplication` registers a shutdown hook with the JVM to ensure that the `ApplicationContext` closes gracefully on exit. All the standard Spring lifecycle callbacks (such as the `DisposableBean` interface or the `@PreDestroy` annotation) can be used. In addition, beans may implement the `org.springframework.boot.ExitCodeGenerator` interface if they wish to return a specific exit code when `SpringApplication.exit()` is called. 
This exit code can then be passed to `System.exit()` to return it as a status code, as shown in the following example: Java ``` import org.springframework.boot.ExitCodeGenerator; import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.context.annotation.Bean; @SpringBootApplication public class MyApplication { @Bean public ExitCodeGenerator exitCodeGenerator() { return () -> 42; } public static void main(String[] args) { System.exit(SpringApplication.exit(SpringApplication.run(MyApplication.class, args))); } } ``` Kotlin ``` import org.springframework.boot.ExitCodeGenerator import org.springframework.boot.SpringApplication import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.context.annotation.Bean import kotlin.system.exitProcess @SpringBootApplication class MyApplication { @Bean fun exitCodeGenerator(): ExitCodeGenerator? { return ExitCodeGenerator { 42 } } } fun main(args: Array<String>) { exitProcess(SpringApplication.exit(SpringApplication.run(MyApplication::class.java, *args))) } ``` Also, the `ExitCodeGenerator` interface may be implemented by exceptions. When such an exception is encountered, Spring Boot returns the exit code provided by the implemented `getExitCode()` method. If there is more than one `ExitCodeGenerator`, the first non-zero exit code that is generated is used. To control the order in which the generators are called, additionally implement the `org.springframework.core.Ordered` interface or use the `org.springframework.core.annotation.Order` annotation. ### 1.12. Admin Features It is possible to enable admin-related features for the application by specifying the `spring.application.admin.enabled` property. This exposes the [`SpringApplicationAdminMXBean`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/admin/SpringApplicationAdminMXBean.java) on the platform `MBeanServer`. You could use this feature to administer your Spring Boot application remotely. This feature could also be useful for any service wrapper implementation. | | | | --- | --- | | | If you want to know on which HTTP port the application is running, get the property with a key of `local.server.port`. | ### 1.13. Application Startup tracking During the application startup, the `SpringApplication` and the `ApplicationContext` perform many tasks related to the application lifecycle, the beans lifecycle or even processing application events. With [`ApplicationStartup`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/core/metrics/ApplicationStartup.html), Spring Framework [allows you to track the application startup sequence with `StartupStep` objects](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/core.html#context-functionality-startup). This data can be collected for profiling purposes, or just to have a better understanding of an application startup process. You can choose an `ApplicationStartup` implementation when setting up the `SpringApplication` instance.
For example, to use the `BufferingApplicationStartup`, you could write: Java ``` import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.context.metrics.buffering.BufferingApplicationStartup; @SpringBootApplication public class MyApplication { public static void main(String[] args) { SpringApplication application = new SpringApplication(MyApplication.class); application.setApplicationStartup(new BufferingApplicationStartup(2048)); application.run(args); } } ``` Kotlin ``` import org.springframework.boot.SpringApplication import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.context.metrics.buffering.BufferingApplicationStartup @SpringBootApplication class MyApplication fun main(args: Array<String>) { val application = SpringApplication(MyApplication::class.java) application.applicationStartup = BufferingApplicationStartup(2048) application.run(*args) } ``` The first available implementation, `FlightRecorderApplicationStartup`, is provided by Spring Framework. It adds Spring-specific startup events to a Java Flight Recorder session and is meant for profiling applications and correlating their Spring context lifecycle with JVM events (such as allocations, GCs, class loading…). Once configured, you can record data by running the application with the Flight Recorder enabled: ``` $ java -XX:StartFlightRecording:filename=recording.jfr,duration=10s -jar demo.jar ``` Spring Boot ships with the `BufferingApplicationStartup` variant; this implementation is meant for buffering the startup steps and draining them into an external metrics system. Applications can ask for the bean of type `BufferingApplicationStartup` in any component. Spring Boot can also be configured to expose a [`startup` endpoint](https://docs.spring.io/spring-boot/docs/2.7.0/actuator-api/htmlsingle/#startup) that provides this information as a JSON document. 2. Externalized Configuration ------------------------------ Spring Boot lets you externalize your configuration so that you can work with the same application code in different environments. You can use a variety of external configuration sources, including Java properties files, YAML files, environment variables, and command-line arguments. Property values can be injected directly into your beans by using the `@Value` annotation, accessed through Spring’s `Environment` abstraction, or be [bound to structured objects](#features.external-config.typesafe-configuration-properties) through `@ConfigurationProperties`. Spring Boot uses a very particular `PropertySource` order that is designed to allow sensible overriding of values. Properties are considered in the following order (with values from lower items overriding earlier ones): 1. Default properties (specified by setting `SpringApplication.setDefaultProperties`). 2. [`@PropertySource`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/context/annotation/PropertySource.html) annotations on your `@Configuration` classes. Please note that such property sources are not added to the `Environment` until the application context is being refreshed. This is too late to configure certain properties such as `logging.*` and `spring.main.*`, which are read before refresh begins. 3. Config data (such as `application.properties` files). 4. A `RandomValuePropertySource` that has properties only in `random.*`. 5. OS environment variables. 6.
Java System properties (`System.getProperties()`). 7. JNDI attributes from `java:comp/env`. 8. `ServletContext` init parameters. 9. `ServletConfig` init parameters. 10. Properties from `SPRING_APPLICATION_JSON` (inline JSON embedded in an environment variable or system property). 11. Command line arguments. 12. `properties` attribute on your tests. Available on [`@SpringBootTest`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/test/context/SpringBootTest.html) and the [test annotations for testing a particular slice of your application](#features.testing.spring-boot-applications.autoconfigured-tests). 13. [`@TestPropertySource`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/test/context/TestPropertySource.html) annotations on your tests. 14. [Devtools global settings properties](using#using.devtools.globalsettings) in the `$HOME/.config/spring-boot` directory when devtools is active. Config data files are considered in the following order: 1. [Application properties](#features.external-config.files) packaged inside your jar (`application.properties` and YAML variants). 2. [Profile-specific application properties](#features.external-config.files.profile-specific) packaged inside your jar (`application-{profile}.properties` and YAML variants). 3. [Application properties](#features.external-config.files) outside of your packaged jar (`application.properties` and YAML variants). 4. [Profile-specific application properties](#features.external-config.files.profile-specific) outside of your packaged jar (`application-{profile}.properties` and YAML variants). | | | | --- | --- | | | It is recommended to stick with one format for your entire application. If you have configuration files with both `.properties` and `.yml` format in the same location, `.properties` takes precedence. | To provide a concrete example, suppose you develop a `@Component` that uses a `name` property, as shown in the following example: Java ``` import org.springframework.beans.factory.annotation.Value; import org.springframework.stereotype.Component; @Component public class MyBean { @Value("${name}") private String name; // ... } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Value import org.springframework.stereotype.Component @Component class MyBean { @Value("\${name}") private val name: String? = null // ... } ``` On your application classpath (for example, inside your jar) you can have an `application.properties` file that provides a sensible default property value for `name`. When running in a new environment, an `application.properties` file can be provided outside of your jar that overrides the `name`. For one-off testing, you can launch with a specific command line switch (for example, `java -jar app.jar --name="Spring"`). | | | | --- | --- | | | The `env` and `configprops` endpoints can be useful in determining why a property has a particular value. You can use these two endpoints to diagnose unexpected property values. See the "[Production ready features](actuator#actuator.endpoints)" section for details. | ### 2.1. Accessing Command Line Properties By default, `SpringApplication` converts any command line option arguments (that is, arguments starting with `--`, such as `--server.port=9000`) to a `property` and adds them to the Spring `Environment`. As mentioned previously, command line properties always take precedence over file-based property sources. 
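For example, the following illustrative invocation overrides any `server.port` value defined in `application.properties`:

```
$ java -jar myapp.jar --server.port=9000
```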
If you do not want command line properties to be added to the `Environment`, you can disable them by using `SpringApplication.setAddCommandLineProperties(false)`. ### 2.2. JSON Application Properties Environment variables and system properties often have restrictions that mean some property names cannot be used. To help with this, Spring Boot allows you to encode a block of properties into a single JSON structure. When your application starts, any `spring.application.json` or `SPRING_APPLICATION_JSON` properties will be parsed and added to the `Environment`. For example, the `SPRING_APPLICATION_JSON` property can be supplied on the command line in a UN\*X shell as an environment variable: ``` $ SPRING_APPLICATION_JSON='{"my":{"name":"test"}}' java -jar myapp.jar ``` In the preceding example, you end up with `my.name=test` in the Spring `Environment`. The same JSON can also be provided as a system property: ``` $ java -Dspring.application.json='{"my":{"name":"test"}}' -jar myapp.jar ``` Or you could supply the JSON by using a command line argument: ``` $ java -jar myapp.jar --spring.application.json='{"my":{"name":"test"}}' ``` If you are deploying to a classic Application Server, you could also use a JNDI variable named `java:comp/env/spring.application.json`. | | | | --- | --- | | | Although `null` values from the JSON will be added to the resulting property source, the `PropertySourcesPropertyResolver` treats `null` properties as missing values. This means that the JSON cannot override properties from lower order property sources with a `null` value. | ### 2.3. External Application Properties Spring Boot will automatically find and load `application.properties` and `application.yaml` files from the following locations when your application starts: 1. From the classpath 1. The classpath root 2. The classpath `/config` package 2. From the current directory 1. The current directory 2. The `/config` subdirectory in the current directory 3. Immediate child directories of the `/config` subdirectory The list is ordered by precedence (with values from lower items overriding earlier ones). Documents from the loaded files are added as `PropertySources` to the Spring `Environment`. If you do not like `application` as the configuration file name, you can switch to another file name by specifying a `spring.config.name` environment property. For example, to look for `myproject.properties` and `myproject.yaml` files you can run your application as follows: ``` $ java -jar myproject.jar --spring.config.name=myproject ``` You can also refer to an explicit location by using the `spring.config.location` environment property. This property accepts a comma-separated list of one or more locations to check. The following example shows how to specify two distinct files: ``` $ java -jar myproject.jar --spring.config.location=\ optional:classpath:/default.properties,\ optional:classpath:/override.properties ``` | | | | --- | --- | | | Use the prefix `optional:` if the [locations are optional](#features.external-config.files.optional-prefix) and you do not mind if they do not exist. | | | | | --- | --- | | | `spring.config.name`, `spring.config.location`, and `spring.config.additional-location` are used very early to determine which files have to be loaded. They must be defined as an environment property (typically an OS environment variable, a system property, or a command-line argument). | If `spring.config.location` contains directories (as opposed to files), they should end in `/`. 
At runtime they will be appended with the names generated from `spring.config.name` before being loaded. Files specified in `spring.config.location` are imported directly. | | | | --- | --- | | | Both directory and file location values are also expanded to check for [profile-specific files](#features.external-config.files.profile-specific). For example, if you have a `spring.config.location` of `classpath:myconfig.properties`, you will also find appropriate `classpath:myconfig-<profile>.properties` files are loaded. | In most situations, each `spring.config.location` item you add will reference a single file or directory. Locations are processed in the order that they are defined and later ones can override the values of earlier ones. If you have a complex location setup, and you use profile-specific configuration files, you may need to provide further hints so that Spring Boot knows how they should be grouped. A location group is a collection of locations that are all considered at the same level. For example, you might want to group all classpath locations, then all external locations. Items within a location group should be separated with `;`. See the example in the “[Profile Specific Files](#features.external-config.files.profile-specific)” section for more details. Locations configured by using `spring.config.location` replace the default locations. For example, if `spring.config.location` is configured with the value `optional:classpath:/custom-config/,optional:file:./custom-config/`, the complete set of locations considered is: 1. `optional:classpath:custom-config/` 2. `optional:file:./custom-config/` If you prefer to add additional locations, rather than replacing them, you can use `spring.config.additional-location`. Properties loaded from additional locations can override those in the default locations. For example, if `spring.config.additional-location` is configured with the value `optional:classpath:/custom-config/,optional:file:./custom-config/`, the complete set of locations considered is: 1. `optional:classpath:/;optional:classpath:/config/` 2. `optional:file:./;optional:file:./config/;optional:file:./config/*/` 3. `optional:classpath:custom-config/` 4. `optional:file:./custom-config/` This search ordering lets you specify default values in one configuration file and then selectively override those values in another. You can provide default values for your application in `application.properties` (or whatever other basename you choose with `spring.config.name`) in one of the default locations. These default values can then be overridden at runtime with a different file located in one of the custom locations. | | | | --- | --- | | | If you use environment variables rather than system properties, most operating systems disallow period-separated key names, but you can use underscores instead (for example, `SPRING_CONFIG_NAME` instead of `spring.config.name`). See [Binding from Environment Variables](#features.external-config.typesafe-configuration-properties.relaxed-binding.environment-variables) for details. | | | | | --- | --- | | | If your application runs in a servlet container or application server, then JNDI properties (in `java:comp/env`) or servlet context initialization parameters can be used instead of, or as well as, environment variables or system properties. | #### 2.3.1. Optional Locations By default, when a specified config data location does not exist, Spring Boot will throw a `ConfigDataLocationNotFoundException` and your application will not start. 
If you want to specify a location, but you do not mind if it does not always exist, you can use the `optional:` prefix. You can use this prefix with the `spring.config.location` and `spring.config.additional-location` properties, as well as with [`spring.config.import`](#features.external-config.files.importing) declarations. For example, a `spring.config.import` value of `optional:file:./myconfig.properties` allows your application to start, even if the `myconfig.properties` file is missing. If you want to ignore all `ConfigDataLocationNotFoundExceptions` and always continue to start your application, you can use the `spring.config.on-not-found` property. Set the value to `ignore` using `SpringApplication.setDefaultProperties(…​)` or with a system/environment variable. #### 2.3.2. Wildcard Locations If a config file location includes the `*` character for the last path segment, it is considered a wildcard location. Wildcards are expanded when the config is loaded so that immediate subdirectories are also checked. Wildcard locations are particularly useful in an environment such as Kubernetes when there are multiple sources of config properties. For example, if you have some Redis configuration and some MySQL configuration, you might want to keep those two pieces of configuration separate, while requiring that both those are present in an `application.properties` file. This might result in two separate `application.properties` files mounted at different locations such as `/config/redis/application.properties` and `/config/mysql/application.properties`. In such a case, having a wildcard location of `config/*/`, will result in both files being processed. By default, Spring Boot includes `config/*/` in the default search locations. It means that all subdirectories of the `/config` directory outside of your jar will be searched. You can use wildcard locations yourself with the `spring.config.location` and `spring.config.additional-location` properties. | | | | --- | --- | | | A wildcard location must contain only one `*` and end with `*/` for search locations that are directories or `*/<filename>` for search locations that are files. Locations with wildcards are sorted alphabetically based on the absolute path of the file names. | | | | | --- | --- | | | Wildcard locations only work with external directories. You cannot use a wildcard in a `classpath:` location. | #### 2.3.3. Profile Specific Files As well as `application` property files, Spring Boot will also attempt to load profile-specific files using the naming convention `application-{profile}`. For example, if your application activates a profile named `prod` and uses YAML files, then both `application.yml` and `application-prod.yml` will be considered. Profile-specific properties are loaded from the same locations as standard `application.properties`, with profile-specific files always overriding the non-specific ones. If several profiles are specified, a last-wins strategy applies. For example, if profiles `prod,live` are specified by the `spring.profiles.active` property, values in `application-prod.properties` can be overridden by those in `application-live.properties`. | | | | --- | --- | | | The last-wins strategy applies at the [location group](#features.external-config.files.location-groups) level. A `spring.config.location` of `classpath:/cfg/,classpath:/ext/` will not have the same override rules as `classpath:/cfg/;classpath:/ext/`. 
For example, continuing our `prod,live` example above, we might have the following files: ``` /cfg application-live.properties /ext application-live.properties application-prod.properties ``` When we have a `spring.config.location` of `classpath:/cfg/,classpath:/ext/` we process all `/cfg` files before all `/ext` files: 1. `/cfg/application-live.properties` 2. `/ext/application-prod.properties` 3. `/ext/application-live.properties` When we have `classpath:/cfg/;classpath:/ext/` instead (with a `;` delimiter) we process `/cfg` and `/ext` at the same level: 1. `/ext/application-prod.properties` 2. `/cfg/application-live.properties` 3. `/ext/application-live.properties` | The `Environment` has a set of default profiles (by default, `[default]`) that are used if no active profiles are set. In other words, if no profiles are explicitly activated, then properties from `application-default` are considered. | | | | --- | --- | | | Properties files are only ever loaded once. If you have already directly [imported](#features.external-config.files.importing) a profile specific property files then it will not be imported a second time. | #### 2.3.4. Importing Additional Data Application properties may import further config data from other locations using the `spring.config.import` property. Imports are processed as they are discovered, and are treated as additional documents inserted immediately below the one that declares the import. For example, you might have the following in your classpath `application.properties` file: Properties ``` spring.application.name=myapp spring.config.import=optional:file:./dev.properties ``` Yaml ``` spring: application: name: "myapp" config: import: "optional:file:./dev.properties" ``` This will trigger the import of a `dev.properties` file in current directory (if such a file exists). Values from the imported `dev.properties` will take precedence over the file that triggered the import. In the above example, the `dev.properties` could redefine `spring.application.name` to a different value. An import will only be imported once no matter how many times it is declared. The order an import is defined inside a single document within the properties/yaml file does not matter. For instance, the two examples below produce the same result: Properties ``` spring.config.import=my.properties my.property=value ``` Yaml ``` spring: config: import: "my.properties" my: property: "value" ``` Properties ``` my.property=value spring.config.import=my.properties ``` Yaml ``` my: property: "value" spring: config: import: "my.properties" ``` In both of the above examples, the values from the `my.properties` file will take precedence over the file that triggered its import. Several locations can be specified under a single `spring.config.import` key. Locations will be processed in the order that they are defined, with later imports taking precedence. | | | | --- | --- | | | When appropriate, [Profile-specific variants](#features.external-config.files.profile-specific) are also considered for import. The example above would import both `my.properties` as well as any `my-<profile>.properties` variants. | | | | | --- | --- | | | Spring Boot includes pluggable API that allows various different location addresses to be supported. By default you can import Java Properties, YAML and “[configuration trees](#features.external-config.files.configtree)”. Third-party jars can offer support for additional technologies (there is no requirement for files to be local). 
For example, you can imagine config data being from external stores such as Consul, Apache ZooKeeper or Netflix Archaius. If you want to support your own locations, see the `ConfigDataLocationResolver` and `ConfigDataLoader` classes in the `org.springframework.boot.context.config` package. | #### 2.3.5. Importing Extensionless Files Some cloud platforms cannot add a file extension to volume mounted files. To import these extensionless files, you need to give Spring Boot a hint so that it knows how to load them. You can do this by putting an extension hint in square brackets. For example, suppose you have a `/etc/config/myconfig` file that you wish to import as YAML. You can import it from your `application.properties` using the following: Properties ``` spring.config.import=file:/etc/config/myconfig[.yaml] ``` Yaml ``` spring: config: import: "file:/etc/config/myconfig[.yaml]" ``` #### 2.3.6. Using Configuration Trees When running applications on a cloud platform (such as Kubernetes) you often need to read config values that the platform supplies. It is not uncommon to use environment variables for such purposes, but this can have drawbacks, especially if the value is supposed to be kept secret. As an alternative to environment variables, many cloud platforms now allow you to map configuration into mounted data volumes. For example, Kubernetes can volume mount both [`ConfigMaps`](https://kubernetes.io/docs/tasks/configure-pod-container/configure-pod-configmap/#populate-a-volume-with-data-stored-in-a-configmap) and [`Secrets`](https://kubernetes.io/docs/concepts/configuration/secret/#using-secrets-as-files-from-a-pod). There are two common volume mount patterns that can be used: 1. A single file contains a complete set of properties (usually written as YAML). 2. Multiple files are written to a directory tree, with the filename becoming the ‘key’ and the contents becoming the ‘value’. For the first case, you can import the YAML or Properties file directly using `spring.config.import` as described [above](#features.external-config.files.importing). For the second case, you need to use the `configtree:` prefix so that Spring Boot knows it needs to expose all the files as properties. As an example, let’s imagine that Kubernetes has mounted the following volume: ``` etc/ config/ myapp/ username password ``` The contents of the `username` file would be a config value, and the contents of `password` would be a secret. To import these properties, you can add the following to your `application.properties` or `application.yaml` file: Properties ``` spring.config.import=optional:configtree:/etc/config/ ``` Yaml ``` spring: config: import: "optional:configtree:/etc/config/" ``` You can then access or inject `myapp.username` and `myapp.password` properties from the `Environment` in the usual way. | | | | --- | --- | | | The folders under the config tree form the property name. In the above example, to access the properties as `username` and `password`, you can set `spring.config.import` to `optional:configtree:/etc/config/myapp`. | | | | | --- | --- | | | Filenames with dot notation are also correctly mapped. For example, in the above example, a file named `myapp.username` in `/etc/config` would result in a `myapp.username` property in the `Environment`. | | | | | --- | --- | | | Configuration tree values can be bound to both `String` and `byte[]` types depending on the contents expected. | If you have multiple config trees to import from the same parent folder, you can use a wildcard shortcut.
Any `configtree:` location that ends with `/*/` will import all immediate children as config trees. For example, given the following volume: ``` etc/ config/ dbconfig/ db/ username password mqconfig/ mq/ username password ``` You can use `configtree:/etc/config/*/` as the import location: Properties ``` spring.config.import=optional:configtree:/etc/config/*/ ``` Yaml ``` spring: config: import: "optional:configtree:/etc/config/*/" ``` This will add `db.username`, `db.password`, `mq.username` and `mq.password` properties. | | | | --- | --- | | | Directories loaded using a wildcard are sorted alphabetically. If you need a different order, then you should list each location as a separate import | Configuration trees can also be used for Docker secrets. When a Docker swarm service is granted access to a secret, the secret gets mounted into the container. For example, if a secret named `db.password` is mounted at location `/run/secrets/`, you can make `db.password` available to the Spring environment using the following: Properties ``` spring.config.import=optional:configtree:/run/secrets/ ``` Yaml ``` spring: config: import: "optional:configtree:/run/secrets/" ``` #### 2.3.7. Property Placeholders The values in `application.properties` and `application.yml` are filtered through the existing `Environment` when they are used, so you can refer back to previously defined values (for example, from System properties or environment variables). The standard `${name}` property-placeholder syntax can be used anywhere within a value. Property placeholders can also specify a default value using a `:` to separate the default value from the property name, for example `${name:default}`. The use of placeholders with and without defaults is shown in the following example: Properties ``` app.name=MyApp app.description=${app.name} is a Spring Boot application written by ${username:Unknown} ``` Yaml ``` app: name: "MyApp" description: "${app.name} is a Spring Boot application written by ${username:Unknown}" ``` Assuming that the `username` property has not been set elsewhere, `app.description` will have the value `MyApp is a Spring Boot application written by Unknown`. | | | | --- | --- | | | You can also use this technique to create “short” variants of existing Spring Boot properties. See the *[howto.html](howto#howto.properties-and-configuration.short-command-line-arguments)* how-to for details. | #### 2.3.8. Working with Multi-Document Files Spring Boot allows you to split a single physical file into multiple logical documents which are each added independently. Documents are processed in order, from top to bottom. Later documents can override the properties defined in earlier ones. For `application.yml` files, the standard YAML multi-document syntax is used. Three consecutive hyphens represent the end of one document, and the start of the next. For example, the following file has two logical documents: ``` spring: application: name: "MyApp" --- spring: application: name: "MyCloudApp" config: activate: on-cloud-platform: "kubernetes" ``` For `application.properties` files a special `#---` comment is used to mark the document splits: ``` spring.application.name=MyApp #--- spring.application.name=MyCloudApp spring.config.activate.on-cloud-platform=kubernetes ``` | | | | --- | --- | | | Property file separators must not have any leading whitespace and must have exactly three hyphen characters. The lines immediately before and after the separator must not be comments. 
| | | | | --- | --- | | | Multi-document property files are often used in conjunction with activation properties such as `spring.config.activate.on-profile`. See the [next section](#features.external-config.files.activation-properties) for details. | | | | | --- | --- | | | Multi-document property files cannot be loaded by using the `@PropertySource` or `@TestPropertySource` annotations. | #### 2.3.9. Activation Properties It is sometimes useful to only activate a given set of properties when certain conditions are met. For example, you might have properties that are only relevant when a specific profile is active. You can conditionally activate a properties document using `spring.config.activate.*`. The following activation properties are available: Table 2. activation properties | Property | Note | | --- | --- | | `on-profile` | A profile expression that must match for the document to be active. | | `on-cloud-platform` | The `CloudPlatform` that must be detected for the document to be active. | For example, the following specifies that the second document is only active when running on Kubernetes, and only when either the “prod” or “staging” profiles are active: Properties ``` myprop=always-set #--- spring.config.activate.on-cloud-platform=kubernetes spring.config.activate.on-profile=prod | staging myotherprop=sometimes-set ``` Yaml ``` myprop: "always-set" --- spring: config: activate: on-cloud-platform: "kubernetes" on-profile: "prod | staging" myotherprop: "sometimes-set" ``` ### 2.4. Encrypting Properties Spring Boot does not provide any built in support for encrypting property values, however, it does provide the hook points necessary to modify values contained in the Spring `Environment`. The `EnvironmentPostProcessor` interface allows you to manipulate the `Environment` before the application starts. See [howto.html](howto#howto.application.customize-the-environment-or-application-context) for details. If you need a secure way to store credentials and passwords, the [Spring Cloud Vault](https://cloud.spring.io/spring-cloud-vault/) project provides support for storing externalized configuration in [HashiCorp Vault](https://www.vaultproject.io/). ### 2.5. Working with YAML [YAML](https://yaml.org) is a superset of JSON and, as such, is a convenient format for specifying hierarchical configuration data. The `SpringApplication` class automatically supports YAML as an alternative to properties whenever you have the [SnakeYAML](https://bitbucket.org/asomov/snakeyaml) library on your classpath. | | | | --- | --- | | | If you use “Starters”, SnakeYAML is automatically provided by `spring-boot-starter`. | #### 2.5.1. Mapping YAML to Properties YAML documents need to be converted from their hierarchical format to a flat structure that can be used with the Spring `Environment`. For example, consider the following YAML document: ``` environments: dev: url: "https://dev.example.com" name: "Developer Setup" prod: url: "https://another.example.com" name: "My Cool App" ``` In order to access these properties from the `Environment`, they would be flattened as follows: ``` environments.dev.url=https://dev.example.com environments.dev.name=Developer Setup environments.prod.url=https://another.example.com environments.prod.name=My Cool App ``` Likewise, YAML lists also need to be flattened. They are represented as property keys with `[index]` dereferencers. 
For example, consider the following YAML: ``` my: servers: - "dev.example.com" - "another.example.com" ``` The preceding example would be transformed into these properties: ``` my.servers[0]=dev.example.com my.servers[1]=another.example.com ``` | | | | --- | --- | | | Properties that use the `[index]` notation can be bound to Java `List` or `Set` objects using Spring Boot’s `Binder` class. For more details see the “[Type-safe Configuration Properties](#features.external-config.typesafe-configuration-properties)” section below. | | | | | --- | --- | | | YAML files cannot be loaded by using the `@PropertySource` or `@TestPropertySource` annotations. So, in the case that you need to load values that way, you need to use a properties file. | #### 2.5.2. Directly Loading YAML Spring Framework provides two convenient classes that can be used to load YAML documents. The `YamlPropertiesFactoryBean` loads YAML as `Properties` and the `YamlMapFactoryBean` loads YAML as a `Map`. You can also use the `YamlPropertySourceLoader` class if you want to load YAML as a Spring `PropertySource`. ### 2.6. Configuring Random Values The `RandomValuePropertySource` is useful for injecting random values (for example, into secrets or test cases). It can produce integers, longs, uuids, or strings, as shown in the following example: Properties ``` my.secret=${random.value} my.number=${random.int} my.bignumber=${random.long} my.uuid=${random.uuid} my.number-less-than-ten=${random.int(10)} my.number-in-range=${random.int[1024,65536]} ``` Yaml ``` my: secret: "${random.value}" number: "${random.int}" bignumber: "${random.long}" uuid: "${random.uuid}" number-less-than-ten: "${random.int(10)}" number-in-range: "${random.int[1024,65536]}" ``` The `random.int*` syntax is `OPEN value (,max) CLOSE` where the `OPEN,CLOSE` are any character and `value,max` are integers. If `max` is provided, then `value` is the minimum value and `max` is the maximum value (exclusive). ### 2.7. Configuring System Environment Properties Spring Boot supports setting a prefix for environment properties. This is useful if the system environment is shared by multiple Spring Boot applications with different configuration requirements. The prefix for system environment properties can be set directly on `SpringApplication`. For example, if you set the prefix to `input`, a property such as `remote.timeout` will also be resolved as `input.remote.timeout` in the system environment. ### 2.8. Type-safe Configuration Properties Using the `@Value("${property}")` annotation to inject configuration properties can sometimes be cumbersome, especially if you are working with multiple properties or your data is hierarchical in nature. Spring Boot provides an alternative method of working with properties that lets strongly typed beans govern and validate the configuration of your application. | | | | --- | --- | | | See also the [differences between `@Value` and type-safe configuration properties](#features.external-config.typesafe-configuration-properties.vs-value-annotation). | #### 2.8.1. 
JavaBean properties binding It is possible to bind a bean declaring standard JavaBean properties as shown in the following example: Java ``` import java.net.InetAddress; import java.util.ArrayList; import java.util.Collections; import java.util.List; import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties("my.service") public class MyProperties { private boolean enabled; private InetAddress remoteAddress; private final Security security = new Security(); // getters / setters... public boolean isEnabled() { return this.enabled; } public void setEnabled(boolean enabled) { this.enabled = enabled; } public InetAddress getRemoteAddress() { return this.remoteAddress; } public void setRemoteAddress(InetAddress remoteAddress) { this.remoteAddress = remoteAddress; } public Security getSecurity() { return this.security; } public static class Security { private String username; private String password; private List<String> roles = new ArrayList<>(Collections.singleton("USER")); // getters / setters... public String getUsername() { return this.username; } public void setUsername(String username) { this.username = username; } public String getPassword() { return this.password; } public void setPassword(String password) { this.password = password; } public List<String> getRoles() { return this.roles; } public void setRoles(List<String> roles) { this.roles = roles; } } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import java.net.InetAddress @ConfigurationProperties("my.service") class MyProperties { var isEnabled = false var remoteAddress: InetAddress? = null val security = Security() class Security { var username: String? = null var password: String? = null var roles: List<String> = ArrayList(setOf("USER")) } } ``` The preceding POJO defines the following properties: * `my.service.enabled`, with a value of `false` by default. * `my.service.remote-address`, with a type that can be coerced from `String`. * `my.service.security.username`, with a nested "security" object whose name is determined by the name of the property. In particular, the type is not used at all there and could have been `SecurityProperties`. * `my.service.security.password`. * `my.service.security.roles`, with a collection of `String` that defaults to `USER`. | | | | --- | --- | | | The properties that map to `@ConfigurationProperties` classes available in Spring Boot, which are configured through properties files, YAML files, environment variables, and other mechanisms, are public API but the accessors (getters/setters) of the class itself are not meant to be used directly. | | | | | --- | --- | | | Such arrangement relies on a default empty constructor and getters and setters are usually mandatory, since binding is through standard Java Beans property descriptors, just like in Spring MVC. A setter may be omitted in the following cases: * Maps, as long as they are initialized, need a getter but not necessarily a setter, since they can be mutated by the binder. * Collections and arrays can be accessed either through an index (typically with YAML) or by using a single comma-separated value (properties). In the latter case, a setter is mandatory. We recommend to always add a setter for such types. If you initialize a collection, make sure it is not immutable (as in the preceding example). * If nested POJO properties are initialized (like the `Security` field in the preceding example), a setter is not required. 
If you want the binder to create the instance on the fly by using its default constructor, you need a setter. Some people use Project Lombok to add getters and setters automatically. Make sure that Lombok does not generate any particular constructor for such a type, as it is used automatically by the container to instantiate the object. Finally, only standard Java Bean properties are considered and binding on static properties is not supported. | #### 2.8.2. Constructor binding The example in the previous section can be rewritten in an immutable fashion as shown in the following example: Java ``` import java.net.InetAddress; import java.util.List; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.context.properties.ConstructorBinding; import org.springframework.boot.context.properties.bind.DefaultValue; @ConstructorBinding @ConfigurationProperties("my.service") public class MyProperties { // fields... private final boolean enabled; private final InetAddress remoteAddress; private final Security security; public MyProperties(boolean enabled, InetAddress remoteAddress, Security security) { this.enabled = enabled; this.remoteAddress = remoteAddress; this.security = security; } // getters... public boolean isEnabled() { return this.enabled; } public InetAddress getRemoteAddress() { return this.remoteAddress; } public Security getSecurity() { return this.security; } public static class Security { // fields... private final String username; private final String password; private final List<String> roles; public Security(String username, String password, @DefaultValue("USER") List<String> roles) { this.username = username; this.password = password; this.roles = roles; } // getters... public String getUsername() { return this.username; } public String getPassword() { return this.password; } public List<String> getRoles() { return this.roles; } } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.context.properties.ConstructorBinding import org.springframework.boot.context.properties.bind.DefaultValue import java.net.InetAddress @ConstructorBinding @ConfigurationProperties("my.service") class MyProperties(val enabled: Boolean, val remoteAddress: InetAddress, val security: Security) { class Security(val username: String, val password: String, @param:DefaultValue("USER") val roles: List<String>) } ``` In this setup, the `@ConstructorBinding` annotation is used to indicate that constructor binding should be used. This means that the binder will expect to find a constructor with the parameters that you wish to have bound. If you are using Java 16 or later, constructor binding can be used with records. In this case, unless your record has multiple constructors, there is no need to use `@ConstructorBinding`. Nested members of a `@ConstructorBinding` class (such as `Security` in the example above) will also be bound through their constructor. Default values can be specified using `@DefaultValue` on a constructor parameter or, when using Java 16 or later, a record component. The conversion service will be applied to coerce the `String` value to the target type of a missing property. Referring to the previous example, if no properties are bound to `Security`, the `MyProperties` instance will contain a `null` value for `security`. 
If you wish to return a non-null instance of `Security` even when no properties are bound to it, you can use an empty `@DefaultValue` annotation to do so: Java ``` public MyProperties(boolean enabled, InetAddress remoteAddress, @DefaultValue Security security) { this.enabled = enabled; this.remoteAddress = remoteAddress; this.security = security; } ``` Kotlin | | | | --- | --- | | | To use constructor binding, the class must be enabled using `@EnableConfigurationProperties` or configuration property scanning. You cannot use constructor binding with beans that are created by the regular Spring mechanisms (for example `@Component` beans, beans created by using `@Bean` methods or beans loaded by using `@Import`). | | | | | --- | --- | | | If you have more than one constructor for your class, you can also use `@ConstructorBinding` directly on the constructor that should be bound. | | | | | --- | --- | | | The use of `java.util.Optional` with `@ConfigurationProperties` is not recommended as it is primarily intended for use as a return type. As such, it is not well-suited to configuration property injection. For consistency with properties of other types, if you do declare an `Optional` property and it has no value, `null` rather than an empty `Optional` will be bound. | #### 2.8.3. Enabling @ConfigurationProperties-annotated types Spring Boot provides infrastructure to bind `@ConfigurationProperties` types and register them as beans. You can either enable configuration properties on a class-by-class basis or enable configuration property scanning that works in a similar manner to component scanning. Sometimes, classes annotated with `@ConfigurationProperties` might not be suitable for scanning, for example, if you’re developing your own auto-configuration or you want to enable them conditionally. In these cases, specify the list of types to process using the `@EnableConfigurationProperties` annotation. This can be done on any `@Configuration` class, as shown in the following example: Java ``` import org.springframework.boot.context.properties.EnableConfigurationProperties; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) @EnableConfigurationProperties(SomeProperties.class) public class MyConfiguration { } ``` Kotlin ``` import org.springframework.boot.context.properties.EnableConfigurationProperties import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) @EnableConfigurationProperties(SomeProperties::class) class MyConfiguration ``` To use configuration property scanning, add the `@ConfigurationPropertiesScan` annotation to your application. Typically, it is added to the main application class that is annotated with `@SpringBootApplication` but it can be added to any `@Configuration` class. By default, scanning will occur from the package of the class that declares the annotation. 
If you want to define specific packages to scan, you can do so as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.context.properties.ConfigurationPropertiesScan; @SpringBootApplication @ConfigurationPropertiesScan({ "com.example.app", "com.example.another" }) public class MyApplication { } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.context.properties.ConfigurationPropertiesScan @SpringBootApplication @ConfigurationPropertiesScan("com.example.app", "com.example.another") class MyApplication ``` | | | | --- | --- | | | When the `@ConfigurationProperties` bean is registered using configuration property scanning or through `@EnableConfigurationProperties`, the bean has a conventional name: `<prefix>-<fqn>`, where `<prefix>` is the environment key prefix specified in the `@ConfigurationProperties` annotation and `<fqn>` is the fully qualified name of the bean. If the annotation does not provide any prefix, only the fully qualified name of the bean is used. The bean name in the example above is `com.example.app-com.example.app.SomeProperties`. | We recommend that `@ConfigurationProperties` only deal with the environment and, in particular, does not inject other beans from the context. For corner cases, setter injection can be used or any of the `*Aware` interfaces provided by the framework (such as `EnvironmentAware` if you need access to the `Environment`). If you still want to inject other beans using the constructor, the configuration properties bean must be annotated with `@Component` and use JavaBean-based property binding. #### 2.8.4. Using @ConfigurationProperties-annotated types This style of configuration works particularly well with the `SpringApplication` external YAML configuration, as shown in the following example: ``` my: service: remote-address: 192.168.1.1 security: username: "admin" roles: - "USER" - "ADMIN" ``` To work with `@ConfigurationProperties` beans, you can inject them in the same way as any other bean, as shown in the following example: Java ``` import org.springframework.stereotype.Service; @Service public class MyService { private final SomeProperties properties; public MyService(SomeProperties properties) { this.properties = properties; } public void openConnection() { Server server = new Server(this.properties.getRemoteAddress()); server.start(); // ... } // ... } ``` Kotlin ``` import org.springframework.stereotype.Service @Service class MyService(val properties: SomeProperties) { fun openConnection() { val server = Server(properties.remoteAddress) server.start() // ... } // ... } ``` | | | | --- | --- | | | Using `@ConfigurationProperties` also lets you generate metadata files that can be used by IDEs to offer auto-completion for your own keys. See the [appendix](configuration-metadata#appendix.configuration-metadata) for details. | #### 2.8.5. Third-party Configuration As well as using `@ConfigurationProperties` to annotate a class, you can also use it on public `@Bean` methods. Doing so can be particularly useful when you want to bind properties to third-party components that are outside of your control. 
To configure a bean from the `Environment` properties, add `@ConfigurationProperties` to its bean registration, as shown in the following example: Java ``` import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class ThirdPartyConfiguration { @Bean @ConfigurationProperties(prefix = "another") public AnotherComponent anotherComponent() { return new AnotherComponent(); } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class ThirdPartyConfiguration { @Bean @ConfigurationProperties(prefix = "another") fun anotherComponent(): AnotherComponent = AnotherComponent() } ``` Any JavaBean property defined with the `another` prefix is mapped onto that `AnotherComponent` bean in manner similar to the preceding `SomeProperties` example. #### 2.8.6. Relaxed Binding Spring Boot uses some relaxed rules for binding `Environment` properties to `@ConfigurationProperties` beans, so there does not need to be an exact match between the `Environment` property name and the bean property name. Common examples where this is useful include dash-separated environment properties (for example, `context-path` binds to `contextPath`), and capitalized environment properties (for example, `PORT` binds to `port`). As an example, consider the following `@ConfigurationProperties` class: Java ``` import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties(prefix = "my.main-project.person") public class MyPersonProperties { private String firstName; public String getFirstName() { return this.firstName; } public void setFirstName(String firstName) { this.firstName = firstName; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties @ConfigurationProperties(prefix = "my.main-project.person") class MyPersonProperties { var firstName: String? = null } ``` With the preceding code, the following properties names can all be used: Table 3. relaxed binding | Property | Note | | --- | --- | | `my.main-project.person.first-name` | Kebab case, which is recommended for use in `.properties` and `.yml` files. | | `my.main-project.person.firstName` | Standard camel case syntax. | | `my.main-project.person.first_name` | Underscore notation, which is an alternative format for use in `.properties` and `.yml` files. | | `MY_MAINPROJECT_PERSON_FIRSTNAME` | Upper case format, which is recommended when using system environment variables. | | | | | --- | --- | | | The `prefix` value for the annotation *must* be in kebab case (lowercase and separated by `-`, such as `my.main-project.person`). | Table 4. relaxed binding rules per property source | Property Source | Simple | List | | --- | --- | --- | | Properties Files | Camel case, kebab case, or underscore notation | Standard list syntax using `[ ]` or comma-separated values | | YAML Files | Camel case, kebab case, or underscore notation | Standard YAML list syntax or comma-separated values | | Environment Variables | Upper case format with underscore as the delimiter (see [Binding from Environment Variables](#features.external-config.typesafe-configuration-properties.relaxed-binding.environment-variables)). 
| Numeric values surrounded by underscores (see [Binding from Environment Variables](#features.external-config.typesafe-configuration-properties.relaxed-binding.environment-variables)) | | System properties | Camel case, kebab case, or underscore notation | Standard list syntax using `[ ]` or comma-separated values | | | | | --- | --- | | | We recommend that, when possible, properties are stored in lower-case kebab format, such as `my.person.first-name=Rod`. | ##### Binding Maps When binding to `Map` properties you may need to use a special bracket notation so that the original `key` value is preserved. If the key is not surrounded by `[]`, any characters that are not alpha-numeric, `-` or `.` are removed. For example, consider binding the following properties to a `Map<String,String>`: Properties ``` my.map.[/key1]=value1 my.map.[/key2]=value2 my.map./key3=value3 ``` Yaml ``` my: map: "[/key1]": "value1" "[/key2]": "value2" "/key3": "value3" ``` | | | | --- | --- | | | For YAML files, the brackets need to be surrounded by quotes for the keys to be parsed properly. | The properties above will bind to a `Map` with `/key1`, `/key2` and `key3` as the keys in the map. The slash has been removed from `key3` because it was not surrounded by square brackets. When binding to scalar values, keys with `.` in them do not need to be surrounded by `[]`. Scalar values include enums and all types in the `java.lang` package except for `Object`. Binding `a.b=c` to `Map<String, String>` will preserve the `.` in the key and return a Map with the entry `{"a.b"="c"}`. For any other types you need to use the bracket notation if your `key` contains a `.`. For example, binding `a.b=c` to `Map<String, Object>` will return a Map with the entry `{"a"={"b"="c"}}` whereas `[a.b]=c` will return a Map with the entry `{"a.b"="c"}`. ##### Binding from Environment Variables Most operating systems impose strict rules around the names that can be used for environment variables. For example, Linux shell variables can contain only letters (`a` to `z` or `A` to `Z`), numbers (`0` to `9`) or the underscore character (`_`). By convention, Unix shell variables will also have their names in UPPERCASE. Spring Boot’s relaxed binding rules are, as much as possible, designed to be compatible with these naming restrictions. To convert a property name in the canonical-form to an environment variable name you can follow these rules: * Replace dots (`.`) with underscores (`_`). * Remove any dashes (`-`). * Convert to uppercase. For example, the configuration property `spring.main.log-startup-info` would be an environment variable named `SPRING_MAIN_LOGSTARTUPINFO`. Environment variables can also be used when binding to object lists. To bind to a `List`, the element number should be surrounded with underscores in the variable name. For example, the configuration property `my.service[0].other` would use an environment variable named `MY_SERVICE_0_OTHER`. #### 2.8.7. Merging Complex Types When lists are configured in more than one place, overriding works by replacing the entire list. For example, assume a `MyPojo` object with `name` and `description` attributes that are `null` by default. 
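The `MyPojo` class itself is not shown in this section. As a point of reference only, a minimal sketch that is consistent with that description (two attributes that default to `null`, with standard getters and setters) might look like the following; the exact shape is an assumption rather than part of the original example:

```
public class MyPojo {

    // Both attributes are intentionally left uninitialized so that they default to null.
    private String name;

    private String description;

    public String getName() {
        return this.name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getDescription() {
        return this.description;
    }

    public void setDescription(String description) {
        this.description = description;
    }

}
```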
The following example exposes a list of `MyPojo` objects from `MyProperties`: Java ``` import java.util.ArrayList; import java.util.List; import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties("my") public class MyProperties { private final List<MyPojo> list = new ArrayList<>(); public List<MyPojo> getList() { return this.list; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties @ConfigurationProperties("my") class MyProperties { val list: List<MyPojo> = ArrayList() } ``` Consider the following configuration: Properties ``` my.list[0].name=my name my.list[0].description=my description #--- spring.config.activate.on-profile=dev my.list[0].name=my another name ``` Yaml ``` my: list: - name: "my name" description: "my description" --- spring: config: activate: on-profile: "dev" my: list: - name: "my another name" ``` If the `dev` profile is not active, `MyProperties.list` contains one `MyPojo` entry, as previously defined. If the `dev` profile is enabled, however, the `list` *still* contains only one entry (with a name of `my another name` and a description of `null`). This configuration *does not* add a second `MyPojo` instance to the list, and it does not merge the items. When a `List` is specified in multiple profiles, the one with the highest priority (and only that one) is used. Consider the following example: Properties ``` my.list[0].name=my name my.list[0].description=my description my.list[1].name=another name my.list[1].description=another description #--- spring.config.activate.on-profile=dev my.list[0].name=my another name ``` Yaml ``` my: list: - name: "my name" description: "my description" - name: "another name" description: "another description" --- spring: config: activate: on-profile: "dev" my: list: - name: "my another name" ``` In the preceding example, if the `dev` profile is active, `MyProperties.list` contains *one* `MyPojo` entry (with a name of `my another name` and a description of `null`). For YAML, both comma-separated lists and YAML lists can be used for completely overriding the contents of the list. For `Map` properties, you can bind with property values drawn from multiple sources. However, for the same property in multiple sources, the one with the highest priority is used. 
The following example exposes a `Map<String, MyPojo>` from `MyProperties`: Java ``` import java.util.LinkedHashMap; import java.util.Map; import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties("my") public class MyProperties { private final Map<String, MyPojo> map = new LinkedHashMap<>(); public Map<String, MyPojo> getMap() { return this.map; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties @ConfigurationProperties("my") class MyProperties { val map: Map<String, MyPojo> = LinkedHashMap() } ``` Consider the following configuration: Properties ``` my.map.key1.name=my name 1 my.map.key1.description=my description 1 #--- spring.config.activate.on-profile=dev my.map.key1.name=dev name 1 my.map.key2.name=dev name 2 my.map.key2.description=dev description 2 ``` Yaml ``` my: map: key1: name: "my name 1" description: "my description 1" --- spring: config: activate: on-profile: "dev" my: map: key1: name: "dev name 1" key2: name: "dev name 2" description: "dev description 2" ``` If the `dev` profile is not active, `MyProperties.map` contains one entry with key `key1` (with a name of `my name 1` and a description of `my description 1`). If the `dev` profile is enabled, however, `map` contains two entries with keys `key1` (with a name of `dev name 1` and a description of `my description 1`) and `key2` (with a name of `dev name 2` and a description of `dev description 2`). | | | | --- | --- | | | The preceding merging rules apply to properties from all property sources, and not just files. | #### 2.8.8. Properties Conversion Spring Boot attempts to coerce the external application properties to the right type when it binds to the `@ConfigurationProperties` beans. If you need custom type conversion, you can provide a `ConversionService` bean (with a bean named `conversionService`) or custom property editors (through a `CustomEditorConfigurer` bean) or custom `Converters` (with bean definitions annotated as `@ConfigurationPropertiesBinding`). | | | | --- | --- | | | As this bean is requested very early during the application lifecycle, make sure to limit the dependencies that your `ConversionService` is using. Typically, any dependency that you require may not be fully initialized at creation time. You may want to rename your custom `ConversionService` if it is not required for configuration keys coercion and only rely on custom converters qualified with `@ConfigurationPropertiesBinding`. | ##### Converting Durations Spring Boot has dedicated support for expressing durations. If you expose a `java.time.Duration` property, the following formats in application properties are available: * A regular `long` representation (using milliseconds as the default unit unless a `@DurationUnit` has been specified) * The standard ISO-8601 format [used by `java.time.Duration`](https://docs.oracle.com/javase/8/docs/api/java/time/Duration.html#parse-java.lang.CharSequence-) * A more readable format where the value and the unit are coupled (`10s` means 10 seconds) Consider the following example: Java ``` import java.time.Duration; import java.time.temporal.ChronoUnit; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.convert.DurationUnit; @ConfigurationProperties("my") public class MyProperties { @DurationUnit(ChronoUnit.SECONDS) private Duration sessionTimeout = Duration.ofSeconds(30); private Duration readTimeout = Duration.ofMillis(1000); // getters / setters... 
public Duration getSessionTimeout() { return this.sessionTimeout; } public void setSessionTimeout(Duration sessionTimeout) { this.sessionTimeout = sessionTimeout; } public Duration getReadTimeout() { return this.readTimeout; } public void setReadTimeout(Duration readTimeout) { this.readTimeout = readTimeout; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.convert.DurationUnit import java.time.Duration import java.time.temporal.ChronoUnit @ConfigurationProperties("my") class MyProperties { @DurationUnit(ChronoUnit.SECONDS) var sessionTimeout = Duration.ofSeconds(30) var readTimeout = Duration.ofMillis(1000) } ``` To specify a session timeout of 30 seconds, `30`, `PT30S` and `30s` are all equivalent. A read timeout of 500ms can be specified in any of the following form: `500`, `PT0.5S` and `500ms`. You can also use any of the supported units. These are: * `ns` for nanoseconds * `us` for microseconds * `ms` for milliseconds * `s` for seconds * `m` for minutes * `h` for hours * `d` for days The default unit is milliseconds and can be overridden using `@DurationUnit` as illustrated in the sample above. If you prefer to use constructor binding, the same properties can be exposed, as shown in the following example: Java ``` import java.time.Duration; import java.time.temporal.ChronoUnit; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.context.properties.ConstructorBinding; import org.springframework.boot.context.properties.bind.DefaultValue; import org.springframework.boot.convert.DurationUnit; @ConfigurationProperties("my") @ConstructorBinding public class MyProperties { // fields... private final Duration sessionTimeout; private final Duration readTimeout; public MyProperties(@DurationUnit(ChronoUnit.SECONDS) @DefaultValue("30s") Duration sessionTimeout, @DefaultValue("1000ms") Duration readTimeout) { this.sessionTimeout = sessionTimeout; this.readTimeout = readTimeout; } // getters... public Duration getSessionTimeout() { return this.sessionTimeout; } public Duration getReadTimeout() { return this.readTimeout; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.context.properties.ConstructorBinding import org.springframework.boot.context.properties.bind.DefaultValue import org.springframework.boot.convert.DurationUnit import java.time.Duration import java.time.temporal.ChronoUnit @ConfigurationProperties("my") @ConstructorBinding class MyProperties(@param:DurationUnit(ChronoUnit.SECONDS) @param:DefaultValue("30s") val sessionTimeout: Duration, @param:DefaultValue("1000ms") val readTimeout: Duration) ``` | | | | --- | --- | | | If you are upgrading a `Long` property, make sure to define the unit (using `@DurationUnit`) if it is not milliseconds. Doing so gives a transparent upgrade path while supporting a much richer format. | ##### Converting periods In addition to durations, Spring Boot can also work with `java.time.Period` type. 
The following formats can be used in application properties: * An regular `int` representation (using days as the default unit unless a `@PeriodUnit` has been specified) * The standard ISO-8601 format [used by `java.time.Period`](https://docs.oracle.com/javase/8/docs/api/java/time/Period.html#parse-java.lang.CharSequence-) * A simpler format where the value and the unit pairs are coupled (`1y3d` means 1 year and 3 days) The following units are supported with the simple format: * `y` for years * `m` for months * `w` for weeks * `d` for days | | | | --- | --- | | | The `java.time.Period` type never actually stores the number of weeks, it is a shortcut that means “7 days”. | ##### Converting Data Sizes Spring Framework has a `DataSize` value type that expresses a size in bytes. If you expose a `DataSize` property, the following formats in application properties are available: * A regular `long` representation (using bytes as the default unit unless a `@DataSizeUnit` has been specified) * A more readable format where the value and the unit are coupled (`10MB` means 10 megabytes) Consider the following example: Java ``` import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.convert.DataSizeUnit; import org.springframework.util.unit.DataSize; import org.springframework.util.unit.DataUnit; @ConfigurationProperties("my") public class MyProperties { @DataSizeUnit(DataUnit.MEGABYTES) private DataSize bufferSize = DataSize.ofMegabytes(2); private DataSize sizeThreshold = DataSize.ofBytes(512); // getters/setters... public DataSize getBufferSize() { return this.bufferSize; } public void setBufferSize(DataSize bufferSize) { this.bufferSize = bufferSize; } public DataSize getSizeThreshold() { return this.sizeThreshold; } public void setSizeThreshold(DataSize sizeThreshold) { this.sizeThreshold = sizeThreshold; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.convert.DataSizeUnit import org.springframework.util.unit.DataSize import org.springframework.util.unit.DataUnit @ConfigurationProperties("my") class MyProperties { @DataSizeUnit(DataUnit.MEGABYTES) var bufferSize = DataSize.ofMegabytes(2) var sizeThreshold = DataSize.ofBytes(512) } ``` To specify a buffer size of 10 megabytes, `10` and `10MB` are equivalent. A size threshold of 256 bytes can be specified as `256` or `256B`. You can also use any of the supported units. These are: * `B` for bytes * `KB` for kilobytes * `MB` for megabytes * `GB` for gigabytes * `TB` for terabytes The default unit is bytes and can be overridden using `@DataSizeUnit` as illustrated in the sample above. If you prefer to use constructor binding, the same properties can be exposed, as shown in the following example: Java ``` import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.context.properties.ConstructorBinding; import org.springframework.boot.context.properties.bind.DefaultValue; import org.springframework.boot.convert.DataSizeUnit; import org.springframework.util.unit.DataSize; import org.springframework.util.unit.DataUnit; @ConfigurationProperties("my") @ConstructorBinding public class MyProperties { // fields... 
private final DataSize bufferSize; private final DataSize sizeThreshold; public MyProperties(@DataSizeUnit(DataUnit.MEGABYTES) @DefaultValue("2MB") DataSize bufferSize, @DefaultValue("512B") DataSize sizeThreshold) { this.bufferSize = bufferSize; this.sizeThreshold = sizeThreshold; } // getters... public DataSize getBufferSize() { return this.bufferSize; } public DataSize getSizeThreshold() { return this.sizeThreshold; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.boot.context.properties.ConstructorBinding import org.springframework.boot.context.properties.bind.DefaultValue import org.springframework.boot.convert.DataSizeUnit import org.springframework.util.unit.DataSize import org.springframework.util.unit.DataUnit @ConfigurationProperties("my") @ConstructorBinding class MyProperties(@param:DataSizeUnit(DataUnit.MEGABYTES) @param:DefaultValue("2MB") val bufferSize: DataSize, @param:DefaultValue("512B") val sizeThreshold: DataSize) ``` | | | | --- | --- | | | If you are upgrading a `Long` property, make sure to define the unit (using `@DataSizeUnit`) if it is not bytes. Doing so gives a transparent upgrade path while supporting a much richer format. | #### 2.8.9. @ConfigurationProperties Validation Spring Boot attempts to validate `@ConfigurationProperties` classes whenever they are annotated with Spring’s `@Validated` annotation. You can use JSR-303 `javax.validation` constraint annotations directly on your configuration class. To do so, ensure that a compliant JSR-303 implementation is on your classpath and then add constraint annotations to your fields, as shown in the following example: Java ``` import java.net.InetAddress; import javax.validation.constraints.NotNull; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.validation.annotation.Validated; @ConfigurationProperties("my.service") @Validated public class MyProperties { @NotNull private InetAddress remoteAddress; // getters/setters... public InetAddress getRemoteAddress() { return this.remoteAddress; } public void setRemoteAddress(InetAddress remoteAddress) { this.remoteAddress = remoteAddress; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.validation.annotation.Validated import java.net.InetAddress import javax.validation.constraints.NotNull @ConfigurationProperties("my.service") @Validated class MyProperties { var remoteAddress: @NotNull InetAddress? = null } ``` | | | | --- | --- | | | You can also trigger validation by annotating the `@Bean` method that creates the configuration properties with `@Validated`. | To ensure that validation is always triggered for nested properties, even when no properties are found, the associated field must be annotated with `@Valid`. The following example builds on the preceding `MyProperties` example: Java ``` import java.net.InetAddress; import javax.validation.Valid; import javax.validation.constraints.NotEmpty; import javax.validation.constraints.NotNull; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.validation.annotation.Validated; @ConfigurationProperties("my.service") @Validated public class MyProperties { @NotNull private InetAddress remoteAddress; @Valid private final Security security = new Security(); // getters/setters... 
public InetAddress getRemoteAddress() { return this.remoteAddress; } public void setRemoteAddress(InetAddress remoteAddress) { this.remoteAddress = remoteAddress; } public Security getSecurity() { return this.security; } public static class Security { @NotEmpty private String username; // getters/setters... public String getUsername() { return this.username; } public void setUsername(String username) { this.username = username; } } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import org.springframework.validation.annotation.Validated import java.net.InetAddress import javax.validation.Valid import javax.validation.constraints.NotEmpty import javax.validation.constraints.NotNull @ConfigurationProperties("my.service") @Validated class MyProperties { var remoteAddress: @NotNull InetAddress? = null @Valid val security = Security() class Security { @NotEmpty var username: String? = null } } ``` You can also add a custom Spring `Validator` by creating a bean definition called `configurationPropertiesValidator`. The `@Bean` method should be declared `static`. The configuration properties validator is created very early in the application’s lifecycle, and declaring the `@Bean` method as static lets the bean be created without having to instantiate the `@Configuration` class. Doing so avoids any problems that may be caused by early instantiation. | | | | --- | --- | | | The `spring-boot-actuator` module includes an endpoint that exposes all `@ConfigurationProperties` beans. Point your web browser to `/actuator/configprops` or use the equivalent JMX endpoint. See the "[Production ready features](actuator#actuator.endpoints)" section for details. | #### 2.8.10. @ConfigurationProperties vs. @Value The `@Value` annotation is a core container feature, and it does not provide the same features as type-safe configuration properties. The following table summarizes the features that are supported by `@ConfigurationProperties` and `@Value`: | Feature | `@ConfigurationProperties` | `@Value` | | --- | --- | --- | | [Relaxed binding](#features.external-config.typesafe-configuration-properties.relaxed-binding) | Yes | Limited (see [note below](#features.external-config.typesafe-configuration-properties.vs-value-annotation.note)) | | [Meta-data support](configuration-metadata#appendix.configuration-metadata) | Yes | No | | `SpEL` evaluation | No | Yes | | | | | --- | --- | | | If you do want to use `@Value`, we recommend that you refer to property names using their canonical form (kebab-case using only lowercase letters). This will allow Spring Boot to use the same logic as it does when relaxed binding `@ConfigurationProperties`. For example, `@Value("${demo.item-price}")` will pick up `demo.item-price` and `demo.itemPrice` forms from the `application.properties` file, as well as `DEMO_ITEMPRICE` from the system environment. If you used `@Value("${demo.itemPrice}")` instead, `demo.item-price` and `DEMO_ITEMPRICE` would not be considered. | If you define a set of configuration keys for your own components, we recommend you group them in a POJO annotated with `@ConfigurationProperties`. Doing so will provide you with a structured, type-safe object that you can inject into your own beans. `SpEL` expressions from [application property files](#features.external-config.files) are not processed at the time of parsing these files and populating the environment. However, it is possible to write a `SpEL` expression in `@Value`. 
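As a hedged illustration of that last point (the bean and the `app.name` property here are invented for the example, not taken from the text above), a `SpEL` expression can be written directly inside `@Value`, optionally wrapping a `${...}` placeholder that is resolved first:

```
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class SpelValueExample {

    // "${app.name:demo}" is resolved as a normal placeholder (with a default value),
    // and the surrounding #{...} SpEL expression is then evaluated on the result.
    @Value("#{'${app.name:demo}'.toUpperCase()}")
    private String upperCaseName;

    public String getUpperCaseName() {
        return this.upperCaseName;
    }

}
```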
If the value of a property from an application property file is a `SpEL` expression, it will be evaluated when consumed through `@Value`. 3. Profiles ------------ Spring Profiles provide a way to segregate parts of your application configuration and make it be available only in certain environments. Any `@Component`, `@Configuration` or `@ConfigurationProperties` can be marked with `@Profile` to limit when it is loaded, as shown in the following example: Java ``` import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Profile; @Configuration(proxyBeanMethods = false) @Profile("production") public class ProductionConfiguration { // ... } ``` Kotlin ``` import org.springframework.context.annotation.Configuration import org.springframework.context.annotation.Profile @Configuration(proxyBeanMethods = false) @Profile("production") class ProductionConfiguration { // ... } ``` | | | | --- | --- | | | If `@ConfigurationProperties` beans are registered through `@EnableConfigurationProperties` instead of automatic scanning, the `@Profile` annotation needs to be specified on the `@Configuration` class that has the `@EnableConfigurationProperties` annotation. In the case where `@ConfigurationProperties` are scanned, `@Profile` can be specified on the `@ConfigurationProperties` class itself. | You can use a `spring.profiles.active` `Environment` property to specify which profiles are active. You can specify the property in any of the ways described earlier in this chapter. For example, you could include it in your `application.properties`, as shown in the following example: Properties ``` spring.profiles.active=dev,hsqldb ``` Yaml ``` spring: profiles: active: "dev,hsqldb" ``` You could also specify it on the command line by using the following switch: `--spring.profiles.active=dev,hsqldb`. If no profile is active, a default profile is enabled. The name of the default profile is `default` and it can be tuned using the `spring.profiles.default` `Environment` property, as shown in the following example: Properties ``` spring.profiles.default=none ``` Yaml ``` spring: profiles: default: "none" ``` `spring.profiles.active` and `spring.profiles.default` can only be used in non-profile specific documents. This means they cannot be included in [profile specific files](#features.external-config.files.profile-specific) or [documents activated](#features.external-config.files.activation-properties) by `spring.config.activate.on-profile`. For example, the second document configuration is invalid: Properties ``` # this document is valid spring.profiles.active=prod #--- # this document is invalid spring.config.activate.on-profile=prod spring.profiles.active=metrics ``` Yaml ``` # this document is valid spring: profiles: active: "prod" --- # this document is invalid spring: config: activate: on-profile: "prod" profiles: active: "metrics" ``` ### 3.1. Adding Active Profiles The `spring.profiles.active` property follows the same ordering rules as other properties: The highest `PropertySource` wins. This means that you can specify active profiles in `application.properties` and then **replace** them by using the command line switch. Sometimes, it is useful to have properties that **add** to the active profiles rather than replace them. The `spring.profiles.include` property can be used to add active profiles on top of those activated by the `spring.profiles.active` property. The `SpringApplication` entry point also has a Java API for setting additional profiles. 
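As a minimal sketch of that Java API (the `MyApplication` class is assumed here for illustration), additional profiles can be set before the application is run:

```
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication application = new SpringApplication(MyApplication.class);
        // Adds the "local" profile on top of any profiles activated elsewhere,
        // for example through spring.profiles.active.
        application.setAdditionalProfiles("local");
        application.run(args);
    }

}
```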
See the `setAdditionalProfiles()` method in [SpringApplication](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/SpringApplication.html). For example, when an application with the following properties is run, the `common` and `local` profiles will be activated even when it runs using the `--spring.profiles.active` switch: Properties ``` spring.profiles.include[0]=common spring.profiles.include[1]=local ``` Yaml ``` spring: profiles: include: - "common" - "local" ``` | | | | --- | --- | | | Similar to `spring.profiles.active`, `spring.profiles.include` can only be used in non-profile specific documents. This means it cannot be included in [profile specific files](#features.external-config.files.profile-specific) or [documents activated](#features.external-config.files.activation-properties) by `spring.config.activate.on-profile`. | Profile groups, which are described in the [next section](#features.profiles.groups), can also be used to add active profiles if a given profile is active. ### 3.2. Profile Groups Occasionally the profiles that you define and use in your application are too fine-grained and become cumbersome to use. For example, you might have `proddb` and `prodmq` profiles that you use to enable database and messaging features independently. To help with this, Spring Boot lets you define profile groups. A profile group allows you to define a logical name for a related group of profiles. For example, we can create a `production` group that consists of our `proddb` and `prodmq` profiles. Properties ``` spring.profiles.group.production[0]=proddb spring.profiles.group.production[1]=prodmq ``` Yaml ``` spring: profiles: group: production: - "proddb" - "prodmq" ``` Our application can now be started using `--spring.profiles.active=production` to activate the `production`, `proddb` and `prodmq` profiles in one hit. ### 3.3. Programmatically Setting Profiles You can programmatically set active profiles by calling `SpringApplication.setAdditionalProfiles(…)` before your application runs. It is also possible to activate profiles by using Spring’s `ConfigurableEnvironment` interface. ### 3.4. Profile-specific Configuration Files Profile-specific variants of both `application.properties` (or `application.yml`) and files referenced through `@ConfigurationProperties` are considered as files and loaded. See "[Profile Specific Files](#features.external-config.files.profile-specific)" for details. 4. Logging ----------- Spring Boot uses [Commons Logging](https://commons.apache.org/logging) for all internal logging but leaves the underlying log implementation open. Default configurations are provided for [Java Util Logging](https://docs.oracle.com/javase/8/docs/api/java/util/logging/package-summary.html), [Log4J2](https://logging.apache.org/log4j/2.x/), and [Logback](https://logback.qos.ch/). In each case, loggers are pre-configured to use console output with optional file output also available. By default, if you use the “Starters”, Logback is used for logging. Appropriate Logback routing is also included to ensure that dependent libraries that use Java Util Logging, Commons Logging, Log4J, or SLF4J all work correctly. | | | | --- | --- | | | There are a lot of logging frameworks available for Java. Do not worry if the above list seems confusing. Generally, you do not need to change your logging dependencies and the Spring Boot defaults work just fine. 
| | | | | --- | --- | | | When you deploy your application to a servlet container or application server, logging performed with the Java Util Logging API is not routed into your application’s logs. This prevents logging performed by the container or other applications that have been deployed to it from appearing in your application’s logs. | ### 4.1. Log Format The default log output from Spring Boot resembles the following example: ``` 2019-03-05 10:57:51.112 INFO 45469 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet Engine: Apache Tomcat/7.0.52 2019-03-05 10:57:51.253 INFO 45469 --- [ost-startStop-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2019-03-05 10:57:51.253 INFO 45469 --- [ost-startStop-1] o.s.web.context.ContextLoader : Root WebApplicationContext: initialization completed in 1358 ms 2019-03-05 10:57:51.698 INFO 45469 --- [ost-startStop-1] o.s.b.c.e.ServletRegistrationBean : Mapping servlet: 'dispatcherServlet' to [/] 2019-03-05 10:57:51.702 INFO 45469 --- [ost-startStop-1] o.s.b.c.embedded.FilterRegistrationBean : Mapping filter: 'hiddenHttpMethodFilter' to: [/*] ``` The following items are output: * Date and Time: Millisecond precision and easily sortable. * Log Level: `ERROR`, `WARN`, `INFO`, `DEBUG`, or `TRACE`. * Process ID. * A `---` separator to distinguish the start of actual log messages. * Thread name: Enclosed in square brackets (may be truncated for console output). * Logger name: This is usually the source class name (often abbreviated). * The log message. | | | | --- | --- | | | Logback does not have a `FATAL` level. It is mapped to `ERROR`. | ### 4.2. Console Output The default log configuration echoes messages to the console as they are written. By default, `ERROR`-level, `WARN`-level, and `INFO`-level messages are logged. You can also enable a “debug” mode by starting your application with a `--debug` flag. ``` $ java -jar myapp.jar --debug ``` | | | | --- | --- | | | You can also specify `debug=true` in your `application.properties`. | When the debug mode is enabled, a selection of core loggers (embedded container, Hibernate, and Spring Boot) are configured to output more information. Enabling the debug mode does *not* configure your application to log all messages with `DEBUG` level. Alternatively, you can enable a “trace” mode by starting your application with a `--trace` flag (or `trace=true` in your `application.properties`). Doing so enables trace logging for a selection of core loggers (embedded container, Hibernate schema generation, and the whole Spring portfolio). #### 4.2.1. Color-coded Output If your terminal supports ANSI, color output is used to aid readability. You can set `spring.output.ansi.enabled` to a [supported value](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/ansi/AnsiOutput.Enabled.html) to override the auto-detection. Color coding is configured by using the `%clr` conversion word. In its simplest form, the converter colors the output according to the log level, as shown in the following example: ``` %clr(%5p) ``` The following table describes the mapping of log levels to colors: | Level | Color | | --- | --- | | `FATAL` | Red | | `ERROR` | Red | | `WARN` | Yellow | | `INFO` | Green | | `DEBUG` | Green | | `TRACE` | Green | Alternatively, you can specify the color or style that should be used by providing it as an option to the conversion. 
For example, to make the text yellow, use the following setting:

```
%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){yellow}
```

The following colors and styles are supported:

* `blue`
* `cyan`
* `faint`
* `green`
* `magenta`
* `red`
* `yellow`

### 4.3. File Output

By default, Spring Boot logs only to the console and does not write log files. If you want to write log files in addition to the console output, you need to set a `logging.file.name` or `logging.file.path` property (for example, in your `application.properties`). The following table shows how the `logging.*` properties can be used together:

Table 5. Logging properties

| `logging.file.name` | `logging.file.path` | Example | Description |
| --- | --- | --- | --- |
| *(none)* | *(none)* | | Console only logging. |
| Specific file | *(none)* | `my.log` | Writes to the specified log file. Names can be an exact location or relative to the current directory. |
| *(none)* | Specific directory | `/var/log` | Writes `spring.log` to the specified directory. Names can be an exact location or relative to the current directory. |

Log files rotate when they reach 10 MB and, as with console output, `ERROR`-level, `WARN`-level, and `INFO`-level messages are logged by default.

| | | | --- | --- | | | Logging properties are independent of the actual logging infrastructure. As a result, specific configuration keys (such as `logback.configurationFile` for Logback) are not managed by Spring Boot. |

### 4.4. File Rotation

If you are using Logback, it is possible to fine-tune log rotation settings using your `application.properties` or `application.yaml` file. For all other logging systems, you will need to configure rotation settings directly yourself (for example, if you use Log4J2 then you could add a `log4j2.xml` or `log4j2-spring.xml` file). The following rotation policy properties are supported:

| Name | Description |
| --- | --- |
| `logging.logback.rollingpolicy.file-name-pattern` | The filename pattern used to create log archives. |
| `logging.logback.rollingpolicy.clean-history-on-start` | Whether log archive cleanup should occur when the application starts. |
| `logging.logback.rollingpolicy.max-file-size` | The maximum size of a log file before it is archived. |
| `logging.logback.rollingpolicy.total-size-cap` | The maximum total size that log archives can take before being deleted. |
| `logging.logback.rollingpolicy.max-history` | The maximum number of archive log files to keep (defaults to 7). |

### 4.5. Log Levels

All the supported logging systems can have the logger levels set in the Spring `Environment` (for example, in `application.properties`) by using `logging.level.<logger-name>=<level>` where `level` is one of TRACE, DEBUG, INFO, WARN, ERROR, FATAL, or OFF. The `root` logger can be configured by using `logging.level.root`. The following example shows potential logging settings in `application.properties`:

Properties

```
logging.level.root=warn
logging.level.org.springframework.web=debug
logging.level.org.hibernate=error
```

Yaml

```
logging:
  level:
    root: "warn"
    org.springframework.web: "debug"
    org.hibernate: "error"
```

It is also possible to set logging levels using environment variables. For example, `LOGGING_LEVEL_ORG_SPRINGFRAMEWORK_WEB=DEBUG` will set `org.springframework.web` to `DEBUG`.

| | | | --- | --- | | | The above approach will only work for package-level logging. Since relaxed binding always converts environment variables to lowercase, it is not possible to configure logging for an individual class in this way.
If you need to configure logging for a class, you can use [the `SPRING_APPLICATION_JSON`](#features.external-config.application-json) variable. | ### 4.6. Log Groups It is often useful to be able to group related loggers together so that they can all be configured at the same time. For example, you might commonly change the logging levels for *all* Tomcat related loggers, but you can not easily remember top level packages. To help with this, Spring Boot allows you to define logging groups in your Spring `Environment`. For example, here is how you could define a “tomcat” group by adding it to your `application.properties`: Properties ``` logging.group.tomcat=org.apache.catalina,org.apache.coyote,org.apache.tomcat ``` Yaml ``` logging: group: tomcat: "org.apache.catalina,org.apache.coyote,org.apache.tomcat" ``` Once defined, you can change the level for all the loggers in the group with a single line: Properties ``` logging.level.tomcat=trace ``` Yaml ``` logging: level: tomcat: "trace" ``` Spring Boot includes the following pre-defined logging groups that can be used out-of-the-box: | Name | Loggers | | --- | --- | | web | `org.springframework.core.codec`, `org.springframework.http`, `org.springframework.web`, `org.springframework.boot.actuate.endpoint.web`, `org.springframework.boot.web.servlet.ServletContextInitializerBeans` | | sql | `org.springframework.jdbc.core`, `org.hibernate.SQL`, `org.jooq.tools.LoggerListener` | ### 4.7. Using a Log Shutdown Hook In order to release logging resources when your application terminates, a shutdown hook that will trigger log system cleanup when the JVM exits is provided. This shutdown hook is registered automatically unless your application is deployed as a war file. If your application has complex context hierarchies the shutdown hook may not meet your needs. If it does not, disable the shutdown hook and investigate the options provided directly by the underlying logging system. For example, Logback offers [context selectors](http://logback.qos.ch/manual/loggingSeparation.html) which allow each Logger to be created in its own context. You can use the `logging.register-shutdown-hook` property to disable the shutdown hook. Setting it to `false` will disable the registration. You can set the property in your `application.properties` or `application.yaml` file: Properties ``` logging.register-shutdown-hook=false ``` Yaml ``` logging: register-shutdown-hook: false ``` ### 4.8. Custom Log Configuration The various logging systems can be activated by including the appropriate libraries on the classpath and can be further customized by providing a suitable configuration file in the root of the classpath or in a location specified by the following Spring `Environment` property: `logging.config`. You can force Spring Boot to use a particular logging system by using the `org.springframework.boot.logging.LoggingSystem` system property. The value should be the fully qualified class name of a `LoggingSystem` implementation. You can also disable Spring Boot’s logging configuration entirely by using a value of `none`. | | | | --- | --- | | | Since logging is initialized **before** the `ApplicationContext` is created, it is not possible to control logging from `@PropertySources` in Spring `@Configuration` files. The only way to change the logging system or disable it entirely is through System properties. 
| Depending on your logging system, the following files are loaded: | Logging System | Customization | | --- | --- | | Logback | `logback-spring.xml`, `logback-spring.groovy`, `logback.xml`, or `logback.groovy` | | Log4j2 | `log4j2-spring.xml` or `log4j2.xml` | | JDK (Java Util Logging) | `logging.properties` | | | | | --- | --- | | | When possible, we recommend that you use the `-spring` variants for your logging configuration (for example, `logback-spring.xml` rather than `logback.xml`). If you use standard configuration locations, Spring cannot completely control log initialization. | | | | | --- | --- | | | There are known classloading issues with Java Util Logging that cause problems when running from an 'executable jar'. We recommend that you avoid it when running from an 'executable jar' if at all possible. | To help with the customization, some other properties are transferred from the Spring `Environment` to System properties, as described in the following table: | Spring Environment | System Property | Comments | | --- | --- | --- | | `logging.exception-conversion-word` | `LOG_EXCEPTION_CONVERSION_WORD` | The conversion word used when logging exceptions. | | `logging.file.name` | `LOG_FILE` | If defined, it is used in the default log configuration. | | `logging.file.path` | `LOG_PATH` | If defined, it is used in the default log configuration. | | `logging.pattern.console` | `CONSOLE_LOG_PATTERN` | The log pattern to use on the console (stdout). | | `logging.pattern.dateformat` | `LOG_DATEFORMAT_PATTERN` | Appender pattern for log date format. | | `logging.charset.console` | `CONSOLE_LOG_CHARSET` | The charset to use for console logging. | | `logging.pattern.file` | `FILE_LOG_PATTERN` | The log pattern to use in a file (if `LOG_FILE` is enabled). | | `logging.charset.file` | `FILE_LOG_CHARSET` | The charset to use for file logging (if `LOG_FILE` is enabled). | | `logging.pattern.level` | `LOG_LEVEL_PATTERN` | The format to use when rendering the log level (default `%5p`). | | `PID` | `PID` | The current process ID (discovered if possible and when not already defined as an OS environment variable). | If you use Logback, the following properties are also transferred: | Spring Environment | System Property | Comments | | --- | --- | --- | | `logging.logback.rollingpolicy.file-name-pattern` | `LOGBACK_ROLLINGPOLICY_FILE_NAME_PATTERN` | Pattern for rolled-over log file names (default `${LOG_FILE}.%d{yyyy-MM-dd}.%i.gz`). | | `logging.logback.rollingpolicy.clean-history-on-start` | `LOGBACK_ROLLINGPOLICY_CLEAN_HISTORY_ON_START` | Whether to clean the archive log files on startup. | | `logging.logback.rollingpolicy.max-file-size` | `LOGBACK_ROLLINGPOLICY_MAX_FILE_SIZE` | Maximum log file size. | | `logging.logback.rollingpolicy.total-size-cap` | `LOGBACK_ROLLINGPOLICY_TOTAL_SIZE_CAP` | Total size of log backups to be kept. | | `logging.logback.rollingpolicy.max-history` | `LOGBACK_ROLLINGPOLICY_MAX_HISTORY` | Maximum number of archive log files to keep. | All the supported logging systems can consult System properties when parsing their configuration files. 
See the default configurations in `spring-boot.jar` for examples: * [Logback](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/resources/org/springframework/boot/logging/logback/defaults.xml) * [Log4j 2](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/resources/org/springframework/boot/logging/log4j2/log4j2.xml) * [Java Util logging](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/resources/org/springframework/boot/logging/java/logging-file.properties) | | | | --- | --- | | | If you want to use a placeholder in a logging property, you should use [Spring Boot’s syntax](#features.external-config.files.property-placeholders) and not the syntax of the underlying framework. Notably, if you use Logback, you should use `:` as the delimiter between a property name and its default value and not use `:-`. | | | | | --- | --- | | | You can add MDC and other ad-hoc content to log lines by overriding only the `LOG_LEVEL_PATTERN` (or `logging.pattern.level` with Logback). For example, if you use `logging.pattern.level=user:%X{user} %5p`, then the default log format contains an MDC entry for "user", if it exists, as shown in the following example. ``` 2019-08-30 12:30:04.031 user:someone INFO 22174 --- [ nio-8080-exec-0] demo.Controller Handling authenticated request ``` | ### 4.9. Logback Extensions Spring Boot includes a number of extensions to Logback that can help with advanced configuration. You can use these extensions in your `logback-spring.xml` configuration file. | | | | --- | --- | | | Because the standard `logback.xml` configuration file is loaded too early, you cannot use extensions in it. You need to either use `logback-spring.xml` or define a `logging.config` property. | | | | | --- | --- | | | The extensions cannot be used with Logback’s [configuration scanning](https://logback.qos.ch/manual/configuration.html#autoScan). If you attempt to do so, making changes to the configuration file results in an error similar to one of the following being logged: | ``` ERROR in ch.qos.logback.core.joran.spi.Interpreter@4:71 - no applicable action for [springProperty], current ElementPath is [[configuration][springProperty]] ERROR in ch.qos.logback.core.joran.spi.Interpreter@4:71 - no applicable action for [springProfile], current ElementPath is [[configuration][springProfile]] ``` #### 4.9.1. Profile-specific Configuration The `<springProfile>` tag lets you optionally include or exclude sections of configuration based on the active Spring profiles. Profile sections are supported anywhere within the `<configuration>` element. Use the `name` attribute to specify which profile accepts the configuration. The `<springProfile>` tag can contain a profile name (for example `staging`) or a profile expression. A profile expression allows for more complicated profile logic to be expressed, for example `production & (eu-central | eu-west)`. Check the [reference guide](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/core.html#beans-definition-profiles-java) for more details. 
The following listing shows three sample profiles: ``` <springProfile name="staging"> <!-- configuration to be enabled when the "staging" profile is active --> </springProfile> <springProfile name="dev | staging"> <!-- configuration to be enabled when the "dev" or "staging" profiles are active --> </springProfile> <springProfile name="!production"> <!-- configuration to be enabled when the "production" profile is not active --> </springProfile> ``` #### 4.9.2. Environment Properties The `<springProperty>` tag lets you expose properties from the Spring `Environment` for use within Logback. Doing so can be useful if you want to access values from your `application.properties` file in your Logback configuration. The tag works in a similar way to Logback’s standard `<property>` tag. However, rather than specifying a direct `value`, you specify the `source` of the property (from the `Environment`). If you need to store the property somewhere other than in `local` scope, you can use the `scope` attribute. If you need a fallback value (in case the property is not set in the `Environment`), you can use the `defaultValue` attribute. The following example shows how to expose properties for use within Logback: ``` <springProperty scope="context" name="fluentHost" source="myapp.fluentd.host" defaultValue="localhost"/> <appender name="FLUENT" class="ch.qos.logback.more.appenders.DataFluentAppender"> <remoteHost>${fluentHost}</remoteHost> ... </appender> ``` | | | | --- | --- | | | The `source` must be specified in kebab case (such as `my.property-name`). However, properties can be added to the `Environment` by using the relaxed rules. | 5. Internationalization ------------------------ Spring Boot supports localized messages so that your application can cater to users of different language preferences. By default, Spring Boot looks for the presence of a `messages` resource bundle at the root of the classpath. | | | | --- | --- | | | The auto-configuration applies when the default properties file for the configured resource bundle is available (`messages.properties` by default). If your resource bundle contains only language-specific properties files, you are required to add the default. If no properties file is found that matches any of the configured base names, there will be no auto-configured `MessageSource`. | The basename of the resource bundle as well as several other attributes can be configured using the `spring.messages` namespace, as shown in the following example: Properties ``` spring.messages.basename=messages,config.i18n.messages spring.messages.fallback-to-system-locale=false ``` Yaml ``` spring: messages: basename: "messages,config.i18n.messages" fallback-to-system-locale: false ``` | | | | --- | --- | | | `spring.messages.basename` supports comma-separated list of locations, either a package qualifier or a resource resolved from the classpath root. | See [`MessageSourceProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/context/MessageSourceProperties.java) for more supported options. 6. JSON -------- Spring Boot provides integration with three JSON mapping libraries: * Gson * Jackson * JSON-B Jackson is the preferred and default library. ### 6.1. Jackson Auto-configuration for Jackson is provided and Jackson is part of `spring-boot-starter-json`. When Jackson is on the classpath an `ObjectMapper` bean is automatically configured. 
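For illustration, the auto-configured mapper can simply be injected wherever JSON needs to be read or written, rather than creating a separate `ObjectMapper` by hand, so that any Jackson customization applied by Spring Boot is picked up automatically. The following sketch uses a hypothetical `MyJsonService` bean that is not part of Spring Boot:

Java

```
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

import org.springframework.stereotype.Service;

// Hypothetical service: it injects the auto-configured ObjectMapper instead of
// calling `new ObjectMapper()`, so spring.jackson.* properties and any
// Jackson2ObjectMapperBuilderCustomizer beans also apply to the JSON it writes.
@Service
public class MyJsonService {

    private final ObjectMapper objectMapper;

    public MyJsonService(ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    public String toJson(Object value) throws JsonProcessingException {
        return this.objectMapper.writeValueAsString(value);
    }

}
```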
Several configuration properties are provided for [customizing the configuration of the `ObjectMapper`](howto#howto.spring-mvc.customize-jackson-objectmapper). #### 6.1.1. Custom Serializers and Deserializers If you use Jackson to serialize and deserialize JSON data, you might want to write your own `JsonSerializer` and `JsonDeserializer` classes. Custom serializers are usually [registered with Jackson through a module](https://github.com/FasterXML/jackson-docs/wiki/JacksonHowToCustomSerializers), but Spring Boot provides an alternative `@JsonComponent` annotation that makes it easier to directly register Spring Beans. You can use the `@JsonComponent` annotation directly on `JsonSerializer`, `JsonDeserializer` or `KeyDeserializer` implementations. You can also use it on classes that contain serializers/deserializers as inner classes, as shown in the following example: Java ``` import java.io.IOException; import com.fasterxml.jackson.core.JsonGenerator; import com.fasterxml.jackson.core.JsonParser; import com.fasterxml.jackson.core.ObjectCodec; import com.fasterxml.jackson.databind.DeserializationContext; import com.fasterxml.jackson.databind.JsonDeserializer; import com.fasterxml.jackson.databind.JsonNode; import com.fasterxml.jackson.databind.JsonSerializer; import com.fasterxml.jackson.databind.SerializerProvider; import org.springframework.boot.jackson.JsonComponent; @JsonComponent public class MyJsonComponent { public static class Serializer extends JsonSerializer<MyObject> { @Override public void serialize(MyObject value, JsonGenerator jgen, SerializerProvider serializers) throws IOException { jgen.writeStartObject(); jgen.writeStringField("name", value.getName()); jgen.writeNumberField("age", value.getAge()); jgen.writeEndObject(); } } public static class Deserializer extends JsonDeserializer<MyObject> { @Override public MyObject deserialize(JsonParser jsonParser, DeserializationContext ctxt) throws IOException { ObjectCodec codec = jsonParser.getCodec(); JsonNode tree = codec.readTree(jsonParser); String name = tree.get("name").textValue(); int age = tree.get("age").intValue(); return new MyObject(name, age); } } } ``` Kotlin ``` import com.fasterxml.jackson.core.JsonGenerator import com.fasterxml.jackson.core.JsonParser import com.fasterxml.jackson.core.JsonProcessingException import com.fasterxml.jackson.databind.DeserializationContext import com.fasterxml.jackson.databind.JsonDeserializer import com.fasterxml.jackson.databind.JsonNode import com.fasterxml.jackson.databind.JsonSerializer import com.fasterxml.jackson.databind.SerializerProvider import org.springframework.boot.jackson.JsonComponent import java.io.IOException import kotlin.jvm.Throws @JsonComponent class MyJsonComponent { class Serializer : JsonSerializer<MyObject>() { @Throws(IOException::class) override fun serialize(value: MyObject, jgen: JsonGenerator, serializers: SerializerProvider) { jgen.writeStartObject() jgen.writeStringField("name", value.name) jgen.writeNumberField("age", value.age) jgen.writeEndObject() } } class Deserializer : JsonDeserializer<MyObject>() { @Throws(IOException::class, JsonProcessingException::class) override fun deserialize(jsonParser: JsonParser, ctxt: DeserializationContext): MyObject { val codec = jsonParser.codec val tree = codec.readTree<JsonNode>(jsonParser) val name = tree["name"].textValue() val age = tree["age"].intValue() return MyObject(name, age) } } } ``` All `@JsonComponent` beans in the `ApplicationContext` are automatically registered with Jackson. 
Because `@JsonComponent` is meta-annotated with `@Component`, the usual component-scanning rules apply. Spring Boot also provides [`JsonObjectSerializer`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/jackson/JsonObjectSerializer.java) and [`JsonObjectDeserializer`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/jackson/JsonObjectDeserializer.java) base classes that provide useful alternatives to the standard Jackson versions when serializing objects. See [`JsonObjectSerializer`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/jackson/JsonObjectSerializer.html) and [`JsonObjectDeserializer`](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/jackson/JsonObjectDeserializer.html) in the Javadoc for details. The example above can be rewritten to use `JsonObjectSerializer`/`JsonObjectDeserializer` as follows:

Java

```
import java.io.IOException;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.ObjectCodec;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.SerializerProvider;

import org.springframework.boot.jackson.JsonComponent;
import org.springframework.boot.jackson.JsonObjectDeserializer;
import org.springframework.boot.jackson.JsonObjectSerializer;

@JsonComponent
public class MyJsonComponent {

    public static class Serializer extends JsonObjectSerializer<MyObject> {

        @Override
        protected void serializeObject(MyObject value, JsonGenerator jgen, SerializerProvider provider) throws IOException {
            jgen.writeStringField("name", value.getName());
            jgen.writeNumberField("age", value.getAge());
        }

    }

    public static class Deserializer extends JsonObjectDeserializer<MyObject> {

        @Override
        protected MyObject deserializeObject(JsonParser jsonParser, DeserializationContext context, ObjectCodec codec, JsonNode tree) throws IOException {
            String name = nullSafeValue(tree.get("name"), String.class);
            int age = nullSafeValue(tree.get("age"), Integer.class);
            return new MyObject(name, age);
        }

    }

}
```

Kotlin

```
import com.fasterxml.jackson.core.JsonGenerator
import com.fasterxml.jackson.core.JsonParser
import com.fasterxml.jackson.core.ObjectCodec
import com.fasterxml.jackson.databind.DeserializationContext
import com.fasterxml.jackson.databind.JsonNode
import com.fasterxml.jackson.databind.SerializerProvider
import org.springframework.boot.jackson.JsonComponent
import org.springframework.boot.jackson.JsonObjectDeserializer
import org.springframework.boot.jackson.JsonObjectSerializer
import java.io.IOException
import kotlin.jvm.Throws

@JsonComponent
class MyJsonComponent {

    class Serializer : JsonObjectSerializer<MyObject>() {

        @Throws(IOException::class)
        override fun serializeObject(value: MyObject, jgen: JsonGenerator, provider: SerializerProvider) {
            jgen.writeStringField("name", value.name)
            jgen.writeNumberField("age", value.age)
        }

    }

    class Deserializer : JsonObjectDeserializer<MyObject>() {

        @Throws(IOException::class)
        override fun deserializeObject(jsonParser: JsonParser, context: DeserializationContext, codec: ObjectCodec, tree: JsonNode): MyObject {
            val name = nullSafeValue(tree["name"], String::class.java)
            val age = nullSafeValue(tree["age"], Int::class.java)
            return MyObject(name, age)
        }

    }

}
```

#### 6.1.2. Mixins
Jackson has support for mixins that can be used to mix additional annotations into those already declared on a target class. Spring Boot’s Jackson auto-configuration will scan your application’s packages for classes annotated with `@JsonMixin` and register them with the auto-configured `ObjectMapper`. The registration is performed by Spring Boot’s `JsonMixinModule`.

### 6.2. Gson

Auto-configuration for Gson is provided. When Gson is on the classpath a `Gson` bean is automatically configured. Several `spring.gson.*` configuration properties are provided for customizing the configuration. To take more control, one or more `GsonBuilderCustomizer` beans can be used.

### 6.3. JSON-B

Auto-configuration for JSON-B is provided. When the JSON-B API and an implementation are on the classpath a `Jsonb` bean will be automatically configured. The preferred JSON-B implementation is Apache Johnzon, for which dependency management is provided.

7. Task Execution and Scheduling
---------------------------------

In the absence of an `Executor` bean in the context, Spring Boot auto-configures a `ThreadPoolTaskExecutor` with sensible defaults that can be automatically associated with asynchronous task execution (`@EnableAsync`) and Spring MVC asynchronous request processing.

| | | | --- | --- | | | If you have defined a custom `Executor` in the context, regular task execution (that is `@EnableAsync`) will use it transparently but the Spring MVC support will not be configured, as it requires an `AsyncTaskExecutor` implementation (named `applicationTaskExecutor`). Depending on your target arrangement, you could change your `Executor` into a `ThreadPoolTaskExecutor` or define both a `ThreadPoolTaskExecutor` and an `AsyncConfigurer` wrapping your custom `Executor`. The auto-configured `TaskExecutorBuilder` allows you to easily create instances that reproduce what the auto-configuration does by default. |

The thread pool uses 8 core threads that can grow and shrink according to the load. Those default settings can be fine-tuned using the `spring.task.execution` namespace, as shown in the following example:

Properties

```
spring.task.execution.pool.max-size=16
spring.task.execution.pool.queue-capacity=100
spring.task.execution.pool.keep-alive=10s
```

Yaml

```
spring:
  task:
    execution:
      pool:
        max-size: 16
        queue-capacity: 100
        keep-alive: "10s"
```

This changes the thread pool to use a bounded queue so that when the queue is full (100 tasks), the thread pool increases to a maximum of 16 threads. Shrinking of the pool is more aggressive, as threads are reclaimed when they are idle for 10 seconds (rather than 60 seconds by default).

A `ThreadPoolTaskScheduler` can also be auto-configured if it needs to be associated with scheduled task execution (using `@EnableScheduling` for instance). The thread pool uses one thread by default and its settings can be fine-tuned using the `spring.task.scheduling` namespace, as shown in the following example:

Properties

```
spring.task.scheduling.thread-name-prefix=scheduling-
spring.task.scheduling.pool.size=2
```

Yaml

```
spring:
  task:
    scheduling:
      thread-name-prefix: "scheduling-"
      pool:
        size: 2
```

Both a `TaskExecutorBuilder` bean and a `TaskSchedulerBuilder` bean are made available in the context if a custom executor or scheduler needs to be created.

8. Testing
-----------

Spring Boot provides a number of utilities and annotations to help when testing your application.
Test support is provided by two modules: `spring-boot-test` contains core items, and `spring-boot-test-autoconfigure` supports auto-configuration for tests. Most developers use the `spring-boot-starter-test` “Starter”, which imports both Spring Boot test modules as well as JUnit Jupiter, AssertJ, Hamcrest, and a number of other useful libraries. | | | | --- | --- | | | If you have tests that use JUnit 4, JUnit 5’s vintage engine can be used to run them. To use the vintage engine, add a dependency on `junit-vintage-engine`, as shown in the following example: ``` <dependency> <groupId>org.junit.vintage</groupId> <artifactId>junit-vintage-engine</artifactId> <scope>test</scope> <exclusions> <exclusion> <groupId>org.hamcrest</groupId> <artifactId>hamcrest-core</artifactId> </exclusion> </exclusions> </dependency> ``` | `hamcrest-core` is excluded in favor of `org.hamcrest:hamcrest` that is part of `spring-boot-starter-test`. ### 8.1. Test Scope Dependencies The `spring-boot-starter-test` “Starter” (in the `test` `scope`) contains the following provided libraries: * [JUnit 5](https://junit.org/junit5/): The de-facto standard for unit testing Java applications. * [Spring Test](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#integration-testing) & Spring Boot Test: Utilities and integration test support for Spring Boot applications. * [AssertJ](https://assertj.github.io/doc/): A fluent assertion library. * [Hamcrest](https://github.com/hamcrest/JavaHamcrest): A library of matcher objects (also known as constraints or predicates). * [Mockito](https://site.mockito.org/): A Java mocking framework. * [JSONassert](https://github.com/skyscreamer/JSONassert): An assertion library for JSON. * [JsonPath](https://github.com/jayway/JsonPath): XPath for JSON. We generally find these common libraries to be useful when writing tests. If these libraries do not suit your needs, you can add additional test dependencies of your own. ### 8.2. Testing Spring Applications One of the major advantages of dependency injection is that it should make your code easier to unit test. You can instantiate objects by using the `new` operator without even involving Spring. You can also use *mock objects* instead of real dependencies. Often, you need to move beyond unit testing and start integration testing (with a Spring `ApplicationContext`). It is useful to be able to perform integration testing without requiring deployment of your application or needing to connect to other infrastructure. The Spring Framework includes a dedicated test module for such integration testing. You can declare a dependency directly to `org.springframework:spring-test` or use the `spring-boot-starter-test` “Starter” to pull it in transitively. If you have not used the `spring-test` module before, you should start by reading the [relevant section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#testing) of the Spring Framework reference documentation. ### 8.3. Testing Spring Boot Applications A Spring Boot application is a Spring `ApplicationContext`, so nothing very special has to be done to test it beyond what you would normally do with a vanilla Spring context. | | | | --- | --- | | | External properties, logging, and other features of Spring Boot are installed in the context by default only if you use `SpringApplication` to create it. 
| Spring Boot provides a `@SpringBootTest` annotation, which can be used as an alternative to the standard `spring-test` `@ContextConfiguration` annotation when you need Spring Boot features. The annotation works by [creating the `ApplicationContext` used in your tests through `SpringApplication`](#features.testing.spring-boot-applications.detecting-configuration). In addition to `@SpringBootTest` a number of other annotations are also provided for [testing more specific slices](#features.testing.spring-boot-applications.autoconfigured-tests) of an application. | | | | --- | --- | | | If you are using JUnit 4, do not forget to also add `@RunWith(SpringRunner.class)` to your test, otherwise the annotations will be ignored. If you are using JUnit 5, there is no need to add the equivalent `@ExtendWith(SpringExtension.class)` as `@SpringBootTest` and the other `@…​Test` annotations are already annotated with it. | By default, `@SpringBootTest` will not start a server. You can use the `webEnvironment` attribute of `@SpringBootTest` to further refine how your tests run: * `MOCK`(Default) : Loads a web `ApplicationContext` and provides a mock web environment. Embedded servers are not started when using this annotation. If a web environment is not available on your classpath, this mode transparently falls back to creating a regular non-web `ApplicationContext`. It can be used in conjunction with [`@AutoConfigureMockMvc` or `@AutoConfigureWebTestClient`](#features.testing.spring-boot-applications.with-mock-environment) for mock-based testing of your web application. * `RANDOM_PORT`: Loads a `WebServerApplicationContext` and provides a real web environment. Embedded servers are started and listen on a random port. * `DEFINED_PORT`: Loads a `WebServerApplicationContext` and provides a real web environment. Embedded servers are started and listen on a defined port (from your `application.properties`) or on the default port of `8080`. * `NONE`: Loads an `ApplicationContext` by using `SpringApplication` but does not provide *any* web environment (mock or otherwise). | | | | --- | --- | | | If your test is `@Transactional`, it rolls back the transaction at the end of each test method by default. However, as using this arrangement with either `RANDOM_PORT` or `DEFINED_PORT` implicitly provides a real servlet environment, the HTTP client and server run in separate threads and, thus, in separate transactions. Any transaction initiated on the server does not roll back in this case. | | | | | --- | --- | | | `@SpringBootTest` with `webEnvironment = WebEnvironment.RANDOM_PORT` will also start the management server on a separate random port if your application uses a different port for the management server. | #### 8.3.1. Detecting Web Application Type If Spring MVC is available, a regular MVC-based application context is configured. If you have only Spring WebFlux, we will detect that and configure a WebFlux-based application context instead. If both are present, Spring MVC takes precedence. If you want to test a reactive web application in this scenario, you must set the `spring.main.web-application-type` property: Java ``` import org.springframework.boot.test.context.SpringBootTest; @SpringBootTest(properties = "spring.main.web-application-type=reactive") class MyWebFluxTests { // ... } ``` Kotlin ``` import org.springframework.boot.test.context.SpringBootTest @SpringBootTest(properties = ["spring.main.web-application-type=reactive"]) class MyWebFluxTests { // ... } ``` #### 8.3.2. 
Detecting Test Configuration If you are familiar with the Spring Test Framework, you may be used to using `@ContextConfiguration(classes=…​)` in order to specify which Spring `@Configuration` to load. Alternatively, you might have often used nested `@Configuration` classes within your test. When testing Spring Boot applications, this is often not required. Spring Boot’s `@*Test` annotations search for your primary configuration automatically whenever you do not explicitly define one. The search algorithm works up from the package that contains the test until it finds a class annotated with `@SpringBootApplication` or `@SpringBootConfiguration`. As long as you [structured your code](using#using.structuring-your-code) in a sensible way, your main configuration is usually found. | | | | --- | --- | | | If you use a [test annotation to test a more specific slice of your application](#features.testing.spring-boot-applications.autoconfigured-tests), you should avoid adding configuration settings that are specific to a particular area on the [main method’s application class](#features.testing.spring-boot-applications.user-configuration-and-slicing). The underlying component scan configuration of `@SpringBootApplication` defines exclude filters that are used to make sure slicing works as expected. If you are using an explicit `@ComponentScan` directive on your `@SpringBootApplication`-annotated class, be aware that those filters will be disabled. If you are using slicing, you should define them again. | If you want to customize the primary configuration, you can use a nested `@TestConfiguration` class. Unlike a nested `@Configuration` class, which would be used instead of your application’s primary configuration, a nested `@TestConfiguration` class is used in addition to your application’s primary configuration. | | | | --- | --- | | | Spring’s test framework caches application contexts between tests. Therefore, as long as your tests share the same configuration (no matter how it is discovered), the potentially time-consuming process of loading the context happens only once. | #### 8.3.3. Excluding Test Configuration If your application uses component scanning (for example, if you use `@SpringBootApplication` or `@ComponentScan`), you may find top-level configuration classes that you created only for specific tests accidentally get picked up everywhere. As we [have seen earlier](#features.testing.spring-boot-applications.detecting-configuration), `@TestConfiguration` can be used on an inner class of a test to customize the primary configuration. When placed on a top-level class, `@TestConfiguration` indicates that classes in `src/test/java` should not be picked up by scanning. You can then import that class explicitly where it is required, as shown in the following example: Java ``` import org.junit.jupiter.api.Test; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.context.annotation.Import; @SpringBootTest @Import(MyTestsConfiguration.class) class MyTests { @Test void exampleTest() { // ... } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.boot.test.context.SpringBootTest import org.springframework.context.annotation.Import @SpringBootTest @Import(MyTestsConfiguration::class) class MyTests { @Test fun exampleTest() { // ... } } ``` | | | | --- | --- | | | If you directly use `@ComponentScan` (that is, not through `@SpringBootApplication`) you need to register the `TypeExcludeFilter` with it. 
See [the Javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/context/TypeExcludeFilter.html) for details. | #### 8.3.4. Using Application Arguments If your application expects [arguments](#features.spring-application.application-arguments), you can have `@SpringBootTest` inject them using the `args` attribute. Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.ApplicationArguments; import org.springframework.boot.test.context.SpringBootTest; import static org.assertj.core.api.Assertions.assertThat; @SpringBootTest(args = "--app.test=one") class MyApplicationArgumentTests { @Test void applicationArgumentsPopulated(@Autowired ApplicationArguments args) { assertThat(args.getOptionNames()).containsOnly("app.test"); assertThat(args.getOptionValues("app.test")).containsOnly("one"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.ApplicationArguments import org.springframework.boot.test.context.SpringBootTest @SpringBootTest(args = ["--app.test=one"]) class MyApplicationArgumentTests { @Test fun applicationArgumentsPopulated(@Autowired args: ApplicationArguments) { assertThat(args.optionNames).containsOnly("app.test") assertThat(args.getOptionValues("app.test")).containsOnly("one") } } ``` #### 8.3.5. Testing with a mock environment By default, `@SpringBootTest` does not start the server but instead sets up a mock environment for testing web endpoints. With Spring MVC, we can query our web endpoints using [`MockMvc`](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#spring-mvc-test-framework) or `WebTestClient`, as shown in the following example: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.test.web.reactive.server.WebTestClient; import org.springframework.test.web.servlet.MockMvc; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; @SpringBootTest @AutoConfigureMockMvc class MyMockMvcTests { @Test void testWithMockMvc(@Autowired MockMvc mvc) throws Exception { mvc.perform(get("/")).andExpect(status().isOk()).andExpect(content().string("Hello World")); } // If Spring WebFlux is on the classpath, you can drive MVC tests with a WebTestClient @Test void testWithWebTestClient(@Autowired WebTestClient webClient) { webClient .get().uri("/") .exchange() .expectStatus().isOk() .expectBody(String.class).isEqualTo("Hello World"); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc import org.springframework.boot.test.context.SpringBootTest import org.springframework.test.web.reactive.server.WebTestClient import org.springframework.test.web.reactive.server.expectBody import org.springframework.test.web.servlet.MockMvc import org.springframework.test.web.servlet.request.MockMvcRequestBuilders import 
org.springframework.test.web.servlet.result.MockMvcResultMatchers @SpringBootTest @AutoConfigureMockMvc class MyMockMvcTests { @Test fun testWithMockMvc(@Autowired mvc: MockMvc) { mvc.perform(MockMvcRequestBuilders.get("/")).andExpect(MockMvcResultMatchers.status().isOk) .andExpect(MockMvcResultMatchers.content().string("Hello World")) } // If Spring WebFlux is on the classpath, you can drive MVC tests with a WebTestClient @Test fun testWithWebTestClient(@Autowired webClient: WebTestClient) { webClient .get().uri("/") .exchange() .expectStatus().isOk .expectBody<String>().isEqualTo("Hello World") } } ``` | | | | --- | --- | | | If you want to focus only on the web layer and not start a complete `ApplicationContext`, consider [using `@WebMvcTest` instead](#features.testing.spring-boot-applications.spring-mvc-tests). | With Spring WebFlux endpoints, you can use [`WebTestClient`](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#webtestclient-tests) as shown in the following example: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.reactive.AutoConfigureWebTestClient; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.test.web.reactive.server.WebTestClient; @SpringBootTest @AutoConfigureWebTestClient class MyMockWebTestClientTests { @Test void exampleTest(@Autowired WebTestClient webClient) { webClient .get().uri("/") .exchange() .expectStatus().isOk() .expectBody(String.class).isEqualTo("Hello World"); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.web.reactive.AutoConfigureWebTestClient import org.springframework.boot.test.context.SpringBootTest import org.springframework.test.web.reactive.server.WebTestClient import org.springframework.test.web.reactive.server.expectBody @SpringBootTest @AutoConfigureWebTestClient class MyMockWebTestClientTests { @Test fun exampleTest(@Autowired webClient: WebTestClient) { webClient .get().uri("/") .exchange() .expectStatus().isOk .expectBody<String>().isEqualTo("Hello World") } } ``` | | | | --- | --- | | | Testing within a mocked environment is usually faster than running with a full servlet container. However, since mocking occurs at the Spring MVC layer, code that relies on lower-level servlet container behavior cannot be directly tested with MockMvc. For example, Spring Boot’s error handling is based on the “error page” support provided by the servlet container. This means that, whilst you can test your MVC layer throws and handles exceptions as expected, you cannot directly test that a specific [custom error page](web#web.servlet.spring-mvc.error-handling.error-pages) is rendered. If you need to test these lower-level concerns, you can start a fully running server as described in the next section. | #### 8.3.6. Testing with a running server If you need to start a full running server, we recommend that you use random ports. If you use `@SpringBootTest(webEnvironment=WebEnvironment.RANDOM_PORT)`, an available port is picked at random each time your test runs. The `@LocalServerPort` annotation can be used to [inject the actual port used](howto#howto.webserver.discover-port) into your test. 
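For example, the following sketch injects the chosen port into a hypothetical `MyPortInjectionTests` class and uses it to build a request URL (it assumes the `org.springframework.boot.test.web.server.LocalServerPort` import location):

Java

```
import org.junit.jupiter.api.Test;

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.boot.test.web.server.LocalServerPort;

import static org.assertj.core.api.Assertions.assertThat;

// Hypothetical test showing port injection when a real server starts on a random port.
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
class MyPortInjectionTests {

    // The port is only known once the embedded server has started, so it is injected.
    @LocalServerPort
    private int port;

    @Test
    void portIsAvailableToTheTest() {
        assertThat(this.port).isGreaterThan(0);
        // The injected value can be used to build URLs for any HTTP client.
        String rootUrl = "http://localhost:" + this.port + "/";
        assertThat(rootUrl).startsWith("http://localhost:");
    }

}
```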
For convenience, tests that need to make REST calls to the started server can additionally `@Autowire` a [`WebTestClient`](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#webtestclient-tests), which resolves relative links to the running server and comes with a dedicated API for verifying responses, as shown in the following example:

Java

```
import org.junit.jupiter.api.Test;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.test.web.reactive.server.WebTestClient;

@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
class MyRandomPortWebTestClientTests {

    @Test
    void exampleTest(@Autowired WebTestClient webClient) {
        webClient
            .get().uri("/")
            .exchange()
            .expectStatus().isOk()
            .expectBody(String.class).isEqualTo("Hello World");
    }

}
```

Kotlin

```
import org.junit.jupiter.api.Test
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment
import org.springframework.test.web.reactive.server.WebTestClient
import org.springframework.test.web.reactive.server.expectBody

@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
class MyRandomPortWebTestClientTests {

    @Test
    fun exampleTest(@Autowired webClient: WebTestClient) {
        webClient
            .get().uri("/")
            .exchange()
            .expectStatus().isOk
            .expectBody<String>().isEqualTo("Hello World")
    }

}
```

| | | | --- | --- | | | `WebTestClient` can be used against both live servers and [mock environments](#features.testing.spring-boot-applications.with-mock-environment). |

This setup requires `spring-webflux` on the classpath. If you cannot or will not add WebFlux, Spring Boot also provides a `TestRestTemplate` facility:

Java

```
import org.junit.jupiter.api.Test;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.boot.test.web.client.TestRestTemplate;

import static org.assertj.core.api.Assertions.assertThat;

@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
class MyRandomPortTestRestTemplateTests {

    @Test
    void exampleTest(@Autowired TestRestTemplate restTemplate) {
        String body = restTemplate.getForObject("/", String.class);
        assertThat(body).isEqualTo("Hello World");
    }

}
```

Kotlin

```
import org.assertj.core.api.Assertions.assertThat
import org.junit.jupiter.api.Test
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment
import org.springframework.boot.test.web.client.TestRestTemplate

@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
class MyRandomPortTestRestTemplateTests {

    @Test
    fun exampleTest(@Autowired restTemplate: TestRestTemplate) {
        val body = restTemplate.getForObject("/", String::class.java)
        assertThat(body).isEqualTo("Hello World")
    }

}
```

#### 8.3.7. Customizing WebTestClient

To customize the `WebTestClient` bean, configure a `WebTestClientBuilderCustomizer` bean. Any such beans are called with the `WebTestClient.Builder` that is used to create the `WebTestClient`.

#### 8.3.8. Using JMX
As the test context framework caches contexts, JMX is disabled by default to prevent identical components from registering on the same domain. If such a test needs access to an `MBeanServer`, consider marking it dirty as well:

Java

```
import javax.management.MBeanServer;
import javax.management.MalformedObjectNameException;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.junit.jupiter.SpringExtension;

import static org.assertj.core.api.Assertions.assertThat;

@ExtendWith(SpringExtension.class)
@SpringBootTest(properties = "spring.jmx.enabled=true")
@DirtiesContext
class MyJmxTests {

    @Autowired
    private MBeanServer mBeanServer;

    @Test
    void exampleTest() throws MalformedObjectNameException {
        assertThat(this.mBeanServer.getDomains()).contains("java.lang");
        // ...
    }

}
```

Kotlin

```
import org.assertj.core.api.Assertions.assertThat
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.extension.ExtendWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.test.annotation.DirtiesContext
import org.springframework.test.context.junit.jupiter.SpringExtension
import javax.management.MBeanServer

@ExtendWith(SpringExtension::class)
@SpringBootTest(properties = ["spring.jmx.enabled=true"])
@DirtiesContext
class MyJmxTests(@Autowired val mBeanServer: MBeanServer) {

    @Test
    fun exampleTest() {
        assertThat(mBeanServer.domains).contains("java.lang")
        // ...
    }

}
```

#### 8.3.9. Using Metrics

Regardless of your classpath, meter registries, except the in-memory backed one, are not auto-configured when using `@SpringBootTest`. If you need to export metrics to a different backend as part of an integration test, annotate it with `@AutoConfigureMetrics`.

#### 8.3.10. Mocking and Spying Beans

When running tests, it is sometimes necessary to mock certain components within your application context. For example, you may have a facade over some remote service that is unavailable during development. Mocking can also be useful when you want to simulate failures that might be hard to trigger in a real environment. Spring Boot includes a `@MockBean` annotation that can be used to define a Mockito mock for a bean inside your `ApplicationContext`. You can use the annotation to add new beans or replace a single existing bean definition. The annotation can be used directly on test classes, on fields within your test, or on `@Configuration` classes and fields. When used on a field, the instance of the created mock is also injected. Mock beans are automatically reset after each test method.

| | | | --- | --- | | | If your test uses one of Spring Boot’s test annotations (such as `@SpringBootTest`), this feature is automatically enabled.
To use this feature with a different arrangement, listeners must be explicitly added, as shown in the following example: Java ``` import org.springframework.boot.test.mock.mockito.MockitoTestExecutionListener; import org.springframework.boot.test.mock.mockito.ResetMocksTestExecutionListener; import org.springframework.test.context.ContextConfiguration; import org.springframework.test.context.TestExecutionListeners; @ContextConfiguration(classes = MyConfig.class) @TestExecutionListeners({ MockitoTestExecutionListener.class, ResetMocksTestExecutionListener.class }) class MyTests { // ... } ``` Kotlin ``` import org.springframework.boot.test.mock.mockito.MockitoTestExecutionListener import org.springframework.boot.test.mock.mockito.ResetMocksTestExecutionListener import org.springframework.test.context.ContextConfiguration import org.springframework.test.context.TestExecutionListeners @ContextConfiguration(classes = [MyConfig::class]) @TestExecutionListeners( MockitoTestExecutionListener::class, ResetMocksTestExecutionListener::class ) class MyTests { // ... } ``` | The following example replaces an existing `RemoteService` bean with a mock implementation: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.boot.test.mock.mockito.MockBean; import static org.assertj.core.api.Assertions.assertThat; import static org.mockito.BDDMockito.given; @SpringBootTest class MyTests { @Autowired private Reverser reverser; @MockBean private RemoteService remoteService; @Test void exampleTest() { given(this.remoteService.getValue()).willReturn("spring"); String reverse = this.reverser.getReverseValue(); // Calls injected RemoteService assertThat(reverse).isEqualTo("gnirps"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.mockito.BDDMockito.given import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.context.SpringBootTest import org.springframework.boot.test.mock.mockito.MockBean @SpringBootTest class MyTests(@Autowired val reverser: Reverser, @MockBean val remoteService: RemoteService) { @Test fun exampleTest() { given(remoteService.value).willReturn("spring") val reverse = reverser.reverseValue // Calls injected RemoteService assertThat(reverse).isEqualTo("gnirps") } } ``` | | | | --- | --- | | | `@MockBean` cannot be used to mock the behavior of a bean that is exercised during application context refresh. By the time the test is executed, the application context refresh has completed and it is too late to configure the mocked behavior. We recommend using a `@Bean` method to create and configure the mock in this situation. | Additionally, you can use `@SpyBean` to wrap any existing bean with a Mockito `spy`. See the [Javadoc](https://docs.spring.io/spring-boot/docs/2.7.0/api/org/springframework/boot/test/mock/mockito/SpyBean.html) for full details. | | | | --- | --- | | | CGLib proxies, such as those created for scoped beans, declare the proxied methods as `final`. This stops Mockito from functioning correctly as it cannot mock or spy on `final` methods in its default configuration. If you want to mock or spy on such a bean, configure Mockito to use its inline mock maker by adding `org.mockito:mockito-inline` to your application’s test dependencies. This allows Mockito to mock and spy on `final` methods. 
| | | | | --- | --- | | | While Spring’s test framework caches application contexts between tests and reuses a context for tests sharing the same configuration, the use of `@MockBean` or `@SpyBean` influences the cache key, which will most likely increase the number of contexts. | | | | | --- | --- | | | If you are using `@SpyBean` to spy on a bean with `@Cacheable` methods that refer to parameters by name, your application must be compiled with `-parameters`. This ensures that the parameter names are available to the caching infrastructure once the bean has been spied upon. | | | | | --- | --- | | | When you are using `@SpyBean` to spy on a bean that is proxied by Spring, you may need to remove Spring’s proxy in some situations, for example when setting expectations using `given` or `when`. Use `AopTestUtils.getTargetObject(yourProxiedSpy)` to do so. | #### 8.3.11. Auto-configured Tests Spring Boot’s auto-configuration system works well for applications but can sometimes be a little too much for tests. It often helps to load only the parts of the configuration that are required to test a “slice” of your application. For example, you might want to test that Spring MVC controllers are mapping URLs correctly, and you do not want to involve database calls in those tests, or you might want to test JPA entities, and you are not interested in the web layer when those tests run. The `spring-boot-test-autoconfigure` module includes a number of annotations that can be used to automatically configure such “slices”. Each of them works in a similar way, providing a `@…​Test` annotation that loads the `ApplicationContext` and one or more `@AutoConfigure…​` annotations that can be used to customize auto-configuration settings. | | | | --- | --- | | | Each slice restricts component scan to appropriate components and loads a very restricted set of auto-configuration classes. If you need to exclude one of them, most `@…​Test` annotations provide an `excludeAutoConfiguration` attribute. Alternatively, you can use `@ImportAutoConfiguration#exclude`. | | | | | --- | --- | | | Including multiple “slices” by using several `@…​Test` annotations in one test is not supported. If you need multiple “slices”, pick one of the `@…​Test` annotations and include the `@AutoConfigure…​` annotations of the other “slices” by hand. | | | | | --- | --- | | | It is also possible to use the `@AutoConfigure…​` annotations with the standard `@SpringBootTest` annotation. You can use this combination if you are not interested in “slicing” your application but you want some of the auto-configured test beans. | #### 8.3.12. Auto-configured JSON Tests To test that object JSON serialization and deserialization is working as expected, you can use the `@JsonTest` annotation. `@JsonTest` auto-configures the available supported JSON mapper, which can be one of the following libraries: * Jackson `ObjectMapper`, any `@JsonComponent` beans and any Jackson `Module`s * `Gson` * `Jsonb` | | | | --- | --- | | | A list of the auto-configurations that are enabled by `@JsonTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | If you need to configure elements of the auto-configuration, you can use the `@AutoConfigureJsonTesters` annotation. Spring Boot includes AssertJ-based helpers that work with the JSONAssert and JsonPath libraries to check that JSON appears as expected. 
The `JacksonTester`, `GsonTester`, `JsonbTester`, and `BasicJsonTester` classes can be used for Jackson, Gson, Jsonb, and Strings respectively. Any helper fields on the test class can be `@Autowired` when using `@JsonTest`. The following example shows a test class for Jackson: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.json.JsonTest; import org.springframework.boot.test.json.JacksonTester; import static org.assertj.core.api.Assertions.assertThat; @JsonTest class MyJsonTests { @Autowired private JacksonTester<VehicleDetails> json; @Test void serialize() throws Exception { VehicleDetails details = new VehicleDetails("Honda", "Civic"); // Assert against a `.json` file in the same package as the test assertThat(this.json.write(details)).isEqualToJson("expected.json"); // Or use JSON path based assertions assertThat(this.json.write(details)).hasJsonPathStringValue("@.make"); assertThat(this.json.write(details)).extractingJsonPathStringValue("@.make").isEqualTo("Honda"); } @Test void deserialize() throws Exception { String content = "{\"make\":\"Ford\",\"model\":\"Focus\"}"; assertThat(this.json.parse(content)).isEqualTo(new VehicleDetails("Ford", "Focus")); assertThat(this.json.parseObject(content).getMake()).isEqualTo("Ford"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.json.JsonTest import org.springframework.boot.test.json.JacksonTester @JsonTest class MyJsonTests(@Autowired val json: JacksonTester<VehicleDetails>) { @Test fun serialize() { val details = VehicleDetails("Honda", "Civic") // Assert against a `.json` file in the same package as the test assertThat(json.write(details)).isEqualToJson("expected.json") // Or use JSON path based assertions assertThat(json.write(details)).hasJsonPathStringValue("@.make") assertThat(json.write(details)).extractingJsonPathStringValue("@.make").isEqualTo("Honda") } @Test fun deserialize() { val content = "{\"make\":\"Ford\",\"model\":\"Focus\"}" assertThat(json.parse(content)).isEqualTo(VehicleDetails("Ford", "Focus")) assertThat(json.parseObject(content).make).isEqualTo("Ford") } } ``` | | | | --- | --- | | | JSON helper classes can also be used directly in standard unit tests. To do so, call the `initFields` method of the helper in your `@Before` method if you do not use `@JsonTest`. | If you use Spring Boot’s AssertJ-based helpers to assert on a number value at a given JSON path, you might not be able to use `isEqualTo` depending on the type. Instead, you can use AssertJ’s `satisfies` to assert that the value matches the given condition. For instance, the following example asserts that the actual number is a float value close to `0.15` within an offset of `0.01`. Java ``` @Test void someTest() throws Exception { SomeObject value = new SomeObject(0.152f); assertThat(this.json.write(value)).extractingJsonPathNumberValue("@.test.numberValue") .satisfies((number) -> assertThat(number.floatValue()).isCloseTo(0.15f, within(0.01f))); } ``` Kotlin ``` @Test fun someTest() { val value = SomeObject(0.152f) assertThat(json.write(value)).extractingJsonPathNumberValue("@.test.numberValue") .satisfies(ThrowingConsumer { number -> assertThat(number.toFloat()).isCloseTo(0.15f, within(0.01f)) }) } ``` #### 8.3.13. 
Auto-configured Spring MVC Tests To test whether Spring MVC controllers are working as expected, use the `@WebMvcTest` annotation. `@WebMvcTest` auto-configures the Spring MVC infrastructure and limits scanned beans to `@Controller`, `@ControllerAdvice`, `@JsonComponent`, `Converter`, `GenericConverter`, `Filter`, `HandlerInterceptor`, `WebMvcConfigurer`, `WebMvcRegistrations`, and `HandlerMethodArgumentResolver`. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@WebMvcTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@WebMvcTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | | | | | --- | --- | | | If you need to register extra components, such as the Jackson `Module`, you can import additional configuration classes by using `@Import` on your test. | Often, `@WebMvcTest` is limited to a single controller and is used in combination with `@MockBean` to provide mock implementations for required collaborators. `@WebMvcTest` also auto-configures `MockMvc`. Mock MVC offers a powerful way to quickly test MVC controllers without needing to start a full HTTP server. | | | | --- | --- | | | You can also auto-configure `MockMvc` in a non-`@WebMvcTest` (such as `@SpringBootTest`) by annotating it with `@AutoConfigureMockMvc`. The following example uses `MockMvc`: | Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest; import org.springframework.boot.test.mock.mockito.MockBean; import org.springframework.http.MediaType; import org.springframework.test.web.servlet.MockMvc; import static org.mockito.BDDMockito.given; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; @WebMvcTest(UserVehicleController.class) class MyControllerTests { @Autowired private MockMvc mvc; @MockBean private UserVehicleService userVehicleService; @Test void testExample() throws Exception { given(this.userVehicleService.getVehicleDetails("sboot")) .willReturn(new VehicleDetails("Honda", "Civic")); this.mvc.perform(get("/sboot/vehicle").accept(MediaType.TEXT\_PLAIN)) .andExpect(status().isOk()) .andExpect(content().string("Honda Civic")); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.mockito.BDDMockito.given import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest import org.springframework.boot.test.mock.mockito.MockBean import org.springframework.http.MediaType import org.springframework.test.web.servlet.MockMvc import org.springframework.test.web.servlet.request.MockMvcRequestBuilders import org.springframework.test.web.servlet.result.MockMvcResultMatchers @WebMvcTest(UserVehicleController::class) class MyControllerTests(@Autowired val mvc: MockMvc) { @MockBean lateinit var userVehicleService: UserVehicleService @Test fun testExample() { given(userVehicleService.getVehicleDetails("sboot")) .willReturn(VehicleDetails("Honda", "Civic")) mvc.perform(MockMvcRequestBuilders.get("/sboot/vehicle").accept(MediaType.TEXT\_PLAIN)) .andExpect(MockMvcResultMatchers.status().isOk) 
.andExpect(MockMvcResultMatchers.content().string("Honda Civic")) } } ``` | | | | --- | --- | | | If you need to configure elements of the auto-configuration (for example, when servlet filters should be applied) you can use attributes in the `@AutoConfigureMockMvc` annotation. | If you use HtmlUnit and Selenium, auto-configuration also provides an HtmlUnit `WebClient` bean and/or a Selenium `WebDriver` bean. The following example uses HtmlUnit: Java ``` import com.gargoylesoftware.htmlunit.WebClient; import com.gargoylesoftware.htmlunit.html.HtmlPage; import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest; import org.springframework.boot.test.mock.mockito.MockBean; import static org.assertj.core.api.Assertions.assertThat; import static org.mockito.BDDMockito.given; @WebMvcTest(UserVehicleController.class) class MyHtmlUnitTests { @Autowired private WebClient webClient; @MockBean private UserVehicleService userVehicleService; @Test void testExample() throws Exception { given(this.userVehicleService.getVehicleDetails("sboot")).willReturn(new VehicleDetails("Honda", "Civic")); HtmlPage page = this.webClient.getPage("/sboot/vehicle.html"); assertThat(page.getBody().getTextContent()).isEqualTo("Honda Civic"); } } ``` Kotlin ``` import com.gargoylesoftware.htmlunit.WebClient import com.gargoylesoftware.htmlunit.html.HtmlPage import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.mockito.BDDMockito.given import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest import org.springframework.boot.test.mock.mockito.MockBean @WebMvcTest(UserVehicleController::class) class MyHtmlUnitTests(@Autowired val webClient: WebClient) { @MockBean lateinit var userVehicleService: UserVehicleService @Test fun testExample() { given(userVehicleService.getVehicleDetails("sboot")).willReturn(VehicleDetails("Honda", "Civic")) val page = webClient.getPage<HtmlPage>("/sboot/vehicle.html") assertThat(page.body.textContent).isEqualTo("Honda Civic") } } ``` | | | | --- | --- | | | By default, Spring Boot puts `WebDriver` beans in a special “scope” to ensure that the driver exits after each test and that a new instance is injected. If you do not want this behavior, you can add `@Scope("singleton")` to your `WebDriver` `@Bean` definition. | | | | | --- | --- | | | The `webDriver` scope created by Spring Boot will replace any user defined scope of the same name. If you define your own `webDriver` scope you may find it stops working when you use `@WebMvcTest`. | If you have Spring Security on the classpath, `@WebMvcTest` will also scan `WebSecurityConfigurer` beans. Instead of disabling security completely for such tests, you can use Spring Security’s test support. More details on how to use Spring Security’s `MockMvc` support can be found in this *[howto.html](howto#howto.testing.with-spring-security)* how-to section. | | | | --- | --- | | | Sometimes writing Spring MVC tests is not enough; Spring Boot can help you run [full end-to-end tests with an actual server](#features.testing.spring-boot-applications.with-running-server). | #### 8.3.14. Auto-configured Spring WebFlux Tests To test that [Spring WebFlux](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html) controllers are working as expected, you can use the `@WebFluxTest` annotation. 
`@WebFluxTest` auto-configures the Spring WebFlux infrastructure and limits scanned beans to `@Controller`, `@ControllerAdvice`, `@JsonComponent`, `Converter`, `GenericConverter`, `WebFilter`, and `WebFluxConfigurer`. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@WebFluxTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. | | | | --- | --- | | | A list of the auto-configurations that are enabled by `@WebFluxTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | | | | | --- | --- | | | If you need to register extra components, such as Jackson `Module`, you can import additional configuration classes using `@Import` on your test. | Often, `@WebFluxTest` is limited to a single controller and used in combination with the `@MockBean` annotation to provide mock implementations for required collaborators. `@WebFluxTest` also auto-configures [`WebTestClient`](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#webtestclient), which offers a powerful way to quickly test WebFlux controllers without needing to start a full HTTP server. | | | | --- | --- | | | You can also auto-configure `WebTestClient` in a non-`@WebFluxTest` (such as `@SpringBootTest`) by annotating it with `@AutoConfigureWebTestClient`. The following example shows a class that uses both `@WebFluxTest` and a `WebTestClient`: | Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest; import org.springframework.boot.test.mock.mockito.MockBean; import org.springframework.http.MediaType; import org.springframework.test.web.reactive.server.WebTestClient; import static org.mockito.BDDMockito.given; @WebFluxTest(UserVehicleController.class) class MyControllerTests { @Autowired private WebTestClient webClient; @MockBean private UserVehicleService userVehicleService; @Test void testExample() throws Exception { given(this.userVehicleService.getVehicleDetails("sboot")) .willReturn(new VehicleDetails("Honda", "Civic")); this.webClient.get().uri("/sboot/vehicle").accept(MediaType.TEXT\_PLAIN).exchange() .expectStatus().isOk() .expectBody(String.class).isEqualTo("Honda Civic"); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.mockito.BDDMockito.given import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest import org.springframework.boot.test.mock.mockito.MockBean import org.springframework.http.MediaType import org.springframework.test.web.reactive.server.WebTestClient import org.springframework.test.web.reactive.server.expectBody @WebFluxTest(UserVehicleController::class) class MyControllerTests(@Autowired val webClient: WebTestClient) { @MockBean lateinit var userVehicleService: UserVehicleService @Test fun testExample() { given(userVehicleService.getVehicleDetails("sboot")) .willReturn(VehicleDetails("Honda", "Civic")) webClient.get().uri("/sboot/vehicle").accept(MediaType.TEXT\_PLAIN).exchange() .expectStatus().isOk .expectBody<String>().isEqualTo("Honda Civic") } } ``` | | | | --- | --- | | | This setup is only supported by WebFlux applications as using `WebTestClient` in a mocked web application only works with WebFlux at the moment. | | | | | --- | --- | | | `@WebFluxTest` cannot detect routes registered through the functional web framework. 
For testing `RouterFunction` beans in the context, consider importing your `RouterFunction` yourself by using `@Import` or by using `@SpringBootTest`. | | | | | --- | --- | | | `@WebFluxTest` cannot detect custom security configuration registered as a `@Bean` of type `SecurityWebFilterChain`. To include that in your test, you will need to import the configuration that registers the bean by using `@Import` or by using `@SpringBootTest`. | | | | | --- | --- | | | Sometimes writing Spring WebFlux tests is not enough; Spring Boot can help you run [full end-to-end tests with an actual server](#features.testing.spring-boot-applications.with-running-server). | #### 8.3.15. Auto-configured Spring GraphQL Tests Spring GraphQL offers a dedicated testing support module; you’ll need to add it to your project: Maven ``` <dependencies> <dependency> <groupId>org.springframework.graphql</groupId> <artifactId>spring-graphql-test</artifactId> <scope>test</scope> </dependency> <!-- Unless already present in the compile scope --> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-webflux</artifactId> <scope>test</scope> </dependency> </dependencies> ``` Gradle ``` dependencies { testImplementation("org.springframework.graphql:spring-graphql-test") // Unless already present in the implementation configuration testImplementation("org.springframework.boot:spring-boot-starter-webflux") } ``` This testing module ships the [GraphQlTester](https://docs.spring.io/spring-graphql/docs/1.0.0/reference/html//#testing-graphqltester). The tester is heavily used in tests, so be sure to become familiar with using it. There are `GraphQlTester` variants and Spring Boot will auto-configure them depending on the type of tests: * the `ExecutionGraphQlServiceTester` performs tests on the server side, without a client or a transport * the `HttpGraphQlTester` performs tests with a client that connects to a server, with or without a live server Spring Boot helps you to test your [Spring GraphQL Controllers](https://docs.spring.io/spring-graphql/docs/1.0.0/reference/html/#controllers) with the `@GraphQlTest` annotation. `@GraphQlTest` auto-configures the Spring GraphQL infrastructure, without any transport or server being involved. This limits scanned beans to `@Controller`, `RuntimeWiringConfigurer`, `JsonComponent`, `Converter`, `GenericConverter`, `DataFetcherExceptionResolver`, `Instrumentation` and `GraphQlSourceBuilderCustomizer`. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@GraphQlTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. | | | | --- | --- | | | A list of the auto-configurations that are enabled by `@GraphQlTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | | | | | --- | --- | | | If you need to register extra components, such as Jackson `Module`, you can import additional configuration classes using `@Import` on your test. | Often, `@GraphQlTest` is limited to a set of controllers and used in combination with the `@MockBean` annotation to provide mock implementations for required collaborators.
Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.docs.web.graphql.runtimewiring.GreetingController; import org.springframework.boot.test.autoconfigure.graphql.GraphQlTest; import org.springframework.graphql.test.tester.GraphQlTester; @GraphQlTest(GreetingController.class) class GreetingControllerTests { @Autowired private GraphQlTester graphQlTester; @Test void shouldGreetWithSpecificName() { this.graphQlTester.document("{ greeting(name: \"Alice\") } ").execute().path("greeting").entity(String.class) .isEqualTo("Hello, Alice!"); } @Test void shouldGreetWithDefaultName() { this.graphQlTester.document("{ greeting } ").execute().path("greeting").entity(String.class) .isEqualTo("Hello, Spring!"); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.docs.web.graphql.runtimewiring.GreetingController import org.springframework.boot.test.autoconfigure.graphql.GraphQlTest import org.springframework.graphql.test.tester.GraphQlTester @GraphQlTest(GreetingController::class) internal class GreetingControllerTests { @Autowired lateinit var graphQlTester: GraphQlTester @Test fun shouldGreetWithSpecificName() { graphQlTester.document("{ greeting(name: \"Alice\") } ").execute().path("greeting").entity(String::class.java) .isEqualTo("Hello, Alice!") } @Test fun shouldGreetWithDefaultName() { graphQlTester.document("{ greeting } ").execute().path("greeting").entity(String::class.java) .isEqualTo("Hello, Spring!") } } ``` `@SpringBootTest` tests are full integration tests and involve the entire application. When using a random or defined port, a live server is configured and an `HttpGraphQlTester` bean is contributed automatically so you can use it to test your server. 
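As a sketch of that live-server case (reusing the greeting schema from the examples above; the test class name is illustrative), the contributed `HttpGraphQlTester` can be injected directly: Java

```
import org.junit.jupiter.api.Test;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.graphql.test.tester.HttpGraphQlTester;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class GraphQlLiveServerTests {

    @Test
    void shouldGreetWithDefaultName(@Autowired HttpGraphQlTester graphQlTester) {
        // The tester is already bound to the running server, so no further setup is required.
        graphQlTester.document("{ greeting }").execute().path("greeting").entity(String.class)
                .isEqualTo("Hello, Spring!");
    }

}
```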
When a MOCK environment is configured, you can also request an `HttpGraphQlTester` bean by annotating your test class with `@AutoConfigureHttpGraphQlTester`: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.graphql.tester.AutoConfigureHttpGraphQlTester; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.graphql.test.tester.HttpGraphQlTester; @AutoConfigureHttpGraphQlTester @SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.MOCK) class GraphQlIntegrationTests { @Test void shouldGreetWithSpecificName(@Autowired HttpGraphQlTester graphQlTester) { HttpGraphQlTester authenticatedTester = graphQlTester.mutate() .webTestClient( (client) -> client.defaultHeaders((headers) -> headers.setBasicAuth("admin", "ilovespring"))) .build(); authenticatedTester.document("{ greeting(name: \"Alice\") } ").execute().path("greeting").entity(String.class) .isEqualTo("Hello, Alice!"); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.graphql.tester.AutoConfigureHttpGraphQlTester import org.springframework.boot.test.context.SpringBootTest import org.springframework.graphql.test.tester.HttpGraphQlTester import org.springframework.http.HttpHeaders import org.springframework.test.web.reactive.server.WebTestClient @AutoConfigureHttpGraphQlTester @SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.MOCK) class GraphQlIntegrationTests { @Test fun shouldGreetWithSpecificName(@Autowired graphQlTester: HttpGraphQlTester) { val authenticatedTester = graphQlTester.mutate() .webTestClient { client: WebTestClient.Builder -> client.defaultHeaders { headers: HttpHeaders -> headers.setBasicAuth("admin", "ilovespring") } }.build() authenticatedTester.document("{ greeting(name: \"Alice\") } ").execute() .path("greeting").entity(String::class.java).isEqualTo("Hello, Alice!") } } ``` #### 8.3.16. Auto-configured Data Cassandra Tests You can use `@DataCassandraTest` to test Cassandra applications. By default, it configures a `CassandraTemplate`, scans for `@Table` classes, and configures Spring Data Cassandra repositories. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataCassandraTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. (For more about using Cassandra with Spring Boot, see "[data.html](data#data.nosql.cassandra)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataCassandraTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | The following example shows a typical setup for using Cassandra tests in Spring Boot: Java ``` import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.data.cassandra.DataCassandraTest; @DataCassandraTest class MyDataCassandraTests { @Autowired private SomeRepository repository; } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.data.cassandra.DataCassandraTest @DataCassandraTest class MyDataCassandraTests(@Autowired val repository: SomeRepository) ``` #### 8.3.17. Auto-configured Data Couchbase Tests You can use `@DataCouchbaseTest` to test Couchbase applications. 
By default, it configures a `CouchbaseTemplate` or `ReactiveCouchbaseTemplate`, scans for `@Document` classes, and configures Spring Data Couchbase repositories. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataCouchbaseTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. (For more about using Couchbase with Spring Boot, see "[data.html](data#data.nosql.couchbase)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataCouchbaseTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | The following example shows a typical setup for using Couchbase tests in Spring Boot: Java ``` import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.data.couchbase.DataCouchbaseTest; @DataCouchbaseTest class MyDataCouchbaseTests { @Autowired private SomeRepository repository; // ... } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.data.couchbase.DataCouchbaseTest @DataCouchbaseTest class MyDataCouchbaseTests(@Autowired val repository: SomeRepository) { // ... } ``` #### 8.3.18. Auto-configured Data Elasticsearch Tests You can use `@DataElasticsearchTest` to test Elasticsearch applications. By default, it configures an `ElasticsearchRestTemplate`, scans for `@Document` classes, and configures Spring Data Elasticsearch repositories. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataElasticsearchTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. (For more about using Elasticsearch with Spring Boot, see "[data.html](data#data.nosql.elasticsearch)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataElasticsearchTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | The following example shows a typical setup for using Elasticsearch tests in Spring Boot: Java ``` import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.data.elasticsearch.DataElasticsearchTest; @DataElasticsearchTest class MyDataElasticsearchTests { @Autowired private SomeRepository repository; // ... } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.data.elasticsearch.DataElasticsearchTest @DataElasticsearchTest class MyDataElasticsearchTests(@Autowired val repository: SomeRepository) { // ... } ``` #### 8.3.19. Auto-configured Data JPA Tests You can use the `@DataJpaTest` annotation to test JPA applications. By default, it scans for `@Entity` classes and configures Spring Data JPA repositories. If an embedded database is available on the classpath, it configures one as well. SQL queries are logged by default by setting the `spring.jpa.show-sql` property to `true`. This can be disabled using the `showSql()` attribute of the annotation. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataJpaTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. 
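As a minimal sketch of the `showSql()` attribute mentioned above (the test class name is illustrative): Java

```
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;

// Prevents @DataJpaTest from setting spring.jpa.show-sql=true,
// so SQL statements are not logged during the test.
@DataJpaTest(showSql = false)
class MyQuietRepositoryTests {

    // ...

}
```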
| | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataJpaTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | By default, data JPA tests are transactional and roll back at the end of each test. See the [relevant section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#testcontext-tx-enabling-transactions) in the Spring Framework Reference Documentation for more details. If that is not what you want, you can disable transaction management for a test or for the whole class as follows: Java ``` import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest; import org.springframework.transaction.annotation.Propagation; import org.springframework.transaction.annotation.Transactional; @DataJpaTest @Transactional(propagation = Propagation.NOT\_SUPPORTED) class MyNonTransactionalTests { // ... } ``` Kotlin ``` import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest import org.springframework.transaction.annotation.Propagation import org.springframework.transaction.annotation.Transactional @DataJpaTest @Transactional(propagation = Propagation.NOT\_SUPPORTED) class MyNonTransactionalTests { // ... } ``` Data JPA tests may also inject a [`TestEntityManager`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-test-autoconfigure/src/main/java/org/springframework/boot/test/autoconfigure/orm/jpa/TestEntityManager.java) bean, which provides an alternative to the standard JPA `EntityManager` that is specifically designed for tests. | | | | --- | --- | | | `TestEntityManager` can also be auto-configured to any of your Spring-based test class by adding `@AutoConfigureTestEntityManager`. When doing so, make sure that your test is running in a transaction, for instance by adding `@Transactional` on your test class or method. | A `JdbcTemplate` is also available if you need that. The following example shows the `@DataJpaTest` annotation in use: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest; import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager; import static org.assertj.core.api.Assertions.assertThat; @DataJpaTest class MyRepositoryTests { @Autowired private TestEntityManager entityManager; @Autowired private UserRepository repository; @Test void testExample() throws Exception { this.entityManager.persist(new User("sboot", "1234")); User user = this.repository.findByUsername("sboot"); assertThat(user.getUsername()).isEqualTo("sboot"); assertThat(user.getEmployeeNumber()).isEqualTo("1234"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager @DataJpaTest class MyRepositoryTests(@Autowired val entityManager: TestEntityManager, @Autowired val repository: UserRepository) { @Test fun testExample() { entityManager.persist(User("sboot", "1234")) val user = repository.findByUsername("sboot") assertThat(user?.username).isEqualTo("sboot") assertThat(user?.employeeNumber).isEqualTo("1234") } } ``` In-memory embedded databases generally work well for tests, since they are fast and do not require any installation. 
If, however, you prefer to run tests against a real database you can use the `@AutoConfigureTestDatabase` annotation, as shown in the following example: Java ``` import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase; import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase.Replace; import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest; @DataJpaTest @AutoConfigureTestDatabase(replace = Replace.NONE) class MyRepositoryTests { // ... } ``` Kotlin ``` import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest @DataJpaTest @AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE) class MyRepositoryTests { // ... } ``` #### 8.3.20. Auto-configured JDBC Tests `@JdbcTest` is similar to `@DataJpaTest` but is for tests that only require a `DataSource` and do not use Spring Data JDBC. By default, it configures an in-memory embedded database and a `JdbcTemplate`. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@JdbcTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. | | | | --- | --- | | | A list of the auto-configurations that are enabled by `@JdbcTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | By default, JDBC tests are transactional and roll back at the end of each test. See the [relevant section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#testcontext-tx-enabling-transactions) in the Spring Framework Reference Documentation for more details. If that is not what you want, you can disable transaction management for a test or for the whole class, as follows: Java ``` import org.springframework.boot.test.autoconfigure.jdbc.JdbcTest; import org.springframework.transaction.annotation.Propagation; import org.springframework.transaction.annotation.Transactional; @JdbcTest @Transactional(propagation = Propagation.NOT\_SUPPORTED) class MyTransactionalTests { } ``` Kotlin ``` import org.springframework.boot.test.autoconfigure.jdbc.JdbcTest import org.springframework.transaction.annotation.Propagation import org.springframework.transaction.annotation.Transactional @JdbcTest @Transactional(propagation = Propagation.NOT\_SUPPORTED) class MyTransactionalTests ``` If you prefer your test to run against a real database, you can use the `@AutoConfigureTestDatabase` annotation in the same way as for `DataJpaTest`. (See "[Auto-configured Data JPA Tests](#features.testing.spring-boot-applications.autoconfigured-spring-data-jpa)".) #### 8.3.21. Auto-configured Data JDBC Tests `@DataJdbcTest` is similar to `@JdbcTest` but is for tests that use Spring Data JDBC repositories. By default, it configures an in-memory embedded database, a `JdbcTemplate`, and Spring Data JDBC repositories. Only `AbstractJdbcConfiguration` sub-classes are scanned when the `@DataJdbcTest` annotation is used, regular `@Component` and `@ConfigurationProperties` beans are not scanned. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. | | | | --- | --- | | | A list of the auto-configurations that are enabled by `@DataJdbcTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | By default, Data JDBC tests are transactional and roll back at the end of each test. 
See the [relevant section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#testcontext-tx-enabling-transactions) in the Spring Framework Reference Documentation for more details. If that is not what you want, you can disable transaction management for a test or for the whole test class as [shown in the JDBC example](#features.testing.spring-boot-applications.autoconfigured-jdbc). If you prefer your test to run against a real database, you can use the `@AutoConfigureTestDatabase` annotation in the same way as for `DataJpaTest`. (See "[Auto-configured Data JPA Tests](#features.testing.spring-boot-applications.autoconfigured-spring-data-jpa)".) #### 8.3.22. Auto-configured jOOQ Tests You can use `@JooqTest` in a similar fashion as `@JdbcTest` but for jOOQ-related tests. As jOOQ relies heavily on a Java-based schema that corresponds with the database schema, the existing `DataSource` is used. If you want to replace it with an in-memory database, you can use `@AutoConfigureTestDatabase` to override those settings. (For more about using jOOQ with Spring Boot, see "[data.html](data#data.sql.jooq)", earlier in this chapter.) Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@JooqTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. | | | | --- | --- | | | A list of the auto-configurations that are enabled by `@JooqTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | `@JooqTest` configures a `DSLContext`. The following example shows the `@JooqTest` annotation in use: Java ``` import org.jooq.DSLContext; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.jooq.JooqTest; @JooqTest class MyJooqTests { @Autowired private DSLContext dslContext; // ... } ``` Kotlin ``` import org.jooq.DSLContext import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.jooq.JooqTest @JooqTest class MyJooqTests(@Autowired val dslContext: DSLContext) { // ... } ``` JOOQ tests are transactional and roll back at the end of each test by default. If that is not what you want, you can disable transaction management for a test or for the whole test class as [shown in the JDBC example](#features.testing.spring-boot-applications.autoconfigured-jdbc). #### 8.3.23. Auto-configured Data MongoDB Tests You can use `@DataMongoTest` to test MongoDB applications. By default, it configures an in-memory embedded MongoDB (if available), configures a `MongoTemplate`, scans for `@Document` classes, and configures Spring Data MongoDB repositories. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataMongoTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. (For more about using MongoDB with Spring Boot, see "[data.html](data#data.nosql.mongodb)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataMongoTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). 
| The following class shows the `@DataMongoTest` annotation in use: Java ``` import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest; import org.springframework.data.mongodb.core.MongoTemplate; @DataMongoTest class MyDataMongoDbTests { @Autowired private MongoTemplate mongoTemplate; // ... } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest import org.springframework.data.mongodb.core.MongoTemplate @DataMongoTest class MyDataMongoDbTests(@Autowired val mongoTemplate: MongoTemplate) { // ... } ``` In-memory embedded MongoDB generally works well for tests, since it is fast and does not require any developer installation. If, however, you prefer to run tests against a real MongoDB server, you should exclude the embedded MongoDB auto-configuration, as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.mongo.embedded.EmbeddedMongoAutoConfiguration; import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest; @DataMongoTest(excludeAutoConfiguration = EmbeddedMongoAutoConfiguration.class) class MyDataMongoDbTests { // ... } ``` Kotlin ``` import org.springframework.boot.autoconfigure.mongo.embedded.EmbeddedMongoAutoConfiguration import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest @DataMongoTest(excludeAutoConfiguration = [EmbeddedMongoAutoConfiguration::class]) class MyDataMongoDbTests { // ... } ``` #### 8.3.24. Auto-configured Data Neo4j Tests You can use `@DataNeo4jTest` to test Neo4j applications. By default, it scans for `@Node` classes, and configures Spring Data Neo4j repositories. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataNeo4jTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. (For more about using Neo4J with Spring Boot, see "[data.html](data#data.nosql.neo4j)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataNeo4jTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | The following example shows a typical setup for using Neo4J tests in Spring Boot: Java ``` import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.data.neo4j.DataNeo4jTest; @DataNeo4jTest class MyDataNeo4jTests { @Autowired private SomeRepository repository; // ... } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.data.neo4j.DataNeo4jTest @DataNeo4jTest class MyDataNeo4jTests(@Autowired val repository: SomeRepository) { // ... } ``` By default, Data Neo4j tests are transactional and roll back at the end of each test. See the [relevant section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/testing.html#testcontext-tx-enabling-transactions) in the Spring Framework Reference Documentation for more details. 
If that is not what you want, you can disable transaction management for a test or for the whole class, as follows: Java ``` import org.springframework.boot.test.autoconfigure.data.neo4j.DataNeo4jTest; import org.springframework.transaction.annotation.Propagation; import org.springframework.transaction.annotation.Transactional; @DataNeo4jTest @Transactional(propagation = Propagation.NOT\_SUPPORTED) class MyDataNeo4jTests { } ``` Kotlin ``` import org.springframework.boot.test.autoconfigure.data.neo4j.DataNeo4jTest import org.springframework.transaction.annotation.Propagation import org.springframework.transaction.annotation.Transactional @DataNeo4jTest @Transactional(propagation = Propagation.NOT\_SUPPORTED) class MyDataNeo4jTests ``` | | | | --- | --- | | | Transactional tests are not supported with reactive access. If you are using this style, you must configure `@DataNeo4jTest` tests as described above. | #### 8.3.25. Auto-configured Data Redis Tests You can use `@DataRedisTest` to test Redis applications. By default, it scans for `@RedisHash` classes and configures Spring Data Redis repositories. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataRedisTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. (For more about using Redis with Spring Boot, see "[data.html](data#data.nosql.redis)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataRedisTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | The following example shows the `@DataRedisTest` annotation in use: Java ``` import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.data.redis.DataRedisTest; @DataRedisTest class MyDataRedisTests { @Autowired private SomeRepository repository; // ... } ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.data.redis.DataRedisTest @DataRedisTest class MyDataRedisTests(@Autowired val repository: SomeRepository) { // ... } ``` #### 8.3.26. Auto-configured Data LDAP Tests You can use `@DataLdapTest` to test LDAP applications. By default, it configures an in-memory embedded LDAP (if available), configures an `LdapTemplate`, scans for `@Entry` classes, and configures Spring Data LDAP repositories. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@DataLdapTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. (For more about using LDAP with Spring Boot, see "[data.html](data#data.nosql.ldap)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@DataLdapTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | The following example shows the `@DataLdapTest` annotation in use: Java ``` import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.data.ldap.DataLdapTest; import org.springframework.ldap.core.LdapTemplate; @DataLdapTest class MyDataLdapTests { @Autowired private LdapTemplate ldapTemplate; // ... 
} ``` Kotlin ``` import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.data.ldap.DataLdapTest import org.springframework.ldap.core.LdapTemplate @DataLdapTest class MyDataLdapTests(@Autowired val ldapTemplate: LdapTemplate) { // ... } ``` In-memory embedded LDAP generally works well for tests, since it is fast and does not require any developer installation. If, however, you prefer to run tests against a real LDAP server, you should exclude the embedded LDAP auto-configuration, as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.ldap.embedded.EmbeddedLdapAutoConfiguration; import org.springframework.boot.test.autoconfigure.data.ldap.DataLdapTest; @DataLdapTest(excludeAutoConfiguration = EmbeddedLdapAutoConfiguration.class) class MyDataLdapTests { // ... } ``` Kotlin ``` import org.springframework.boot.autoconfigure.ldap.embedded.EmbeddedLdapAutoConfiguration import org.springframework.boot.test.autoconfigure.data.ldap.DataLdapTest @DataLdapTest(excludeAutoConfiguration = [EmbeddedLdapAutoConfiguration::class]) class MyDataLdapTests { // ... } ``` #### 8.3.27. Auto-configured REST Clients You can use the `@RestClientTest` annotation to test REST clients. By default, it auto-configures Jackson, GSON, and Jsonb support, configures a `RestTemplateBuilder`, and adds support for `MockRestServiceServer`. Regular `@Component` and `@ConfigurationProperties` beans are not scanned when the `@RestClientTest` annotation is used. `@EnableConfigurationProperties` can be used to include `@ConfigurationProperties` beans. | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@RestClientTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). 
| The specific beans that you want to test should be specified by using the `value` or `components` attribute of `@RestClientTest`, as shown in the following example: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.client.RestClientTest; import org.springframework.http.MediaType; import org.springframework.test.web.client.MockRestServiceServer; import static org.assertj.core.api.Assertions.assertThat; import static org.springframework.test.web.client.match.MockRestRequestMatchers.requestTo; import static org.springframework.test.web.client.response.MockRestResponseCreators.withSuccess; @RestClientTest(RemoteVehicleDetailsService.class) class MyRestClientTests { @Autowired private RemoteVehicleDetailsService service; @Autowired private MockRestServiceServer server; @Test void getVehicleDetailsWhenResultIsSuccessShouldReturnDetails() throws Exception { this.server.expect(requestTo("/greet/details")).andRespond(withSuccess("hello", MediaType.TEXT\_PLAIN)); String greeting = this.service.callRestService(); assertThat(greeting).isEqualTo("hello"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.web.client.RestClientTest import org.springframework.http.MediaType import org.springframework.test.web.client.MockRestServiceServer import org.springframework.test.web.client.match.MockRestRequestMatchers import org.springframework.test.web.client.response.MockRestResponseCreators @RestClientTest(RemoteVehicleDetailsService::class) class MyRestClientTests( @Autowired val service: RemoteVehicleDetailsService, @Autowired val server: MockRestServiceServer) { @Test fun getVehicleDetailsWhenResultIsSuccessShouldReturnDetails(): Unit { server.expect(MockRestRequestMatchers.requestTo("/greet/details")) .andRespond(MockRestResponseCreators.withSuccess("hello", MediaType.TEXT\_PLAIN)) val greeting = service.callRestService() assertThat(greeting).isEqualTo("hello") } } ``` #### 8.3.28. Auto-configured Spring REST Docs Tests You can use the `@AutoConfigureRestDocs` annotation to use [Spring REST Docs](https://spring.io/projects/spring-restdocs) in your tests with Mock MVC, REST Assured, or WebTestClient. It removes the need for the JUnit extension in Spring REST Docs. `@AutoConfigureRestDocs` can be used to override the default output directory (`target/generated-snippets` if you are using Maven or `build/generated-snippets` if you are using Gradle). It can also be used to configure the host, scheme, and port that appears in any documented URIs. ##### Auto-configured Spring REST Docs Tests with Mock MVC `@AutoConfigureRestDocs` customizes the `MockMvc` bean to use Spring REST Docs when testing servlet-based web applications. 
You can inject it by using `@Autowired` and use it in your tests as you normally would when using Mock MVC and Spring REST Docs, as shown in the following example: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs; import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest; import org.springframework.http.MediaType; import org.springframework.test.web.servlet.MockMvc; import static org.springframework.restdocs.mockmvc.MockMvcRestDocumentation.document; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; @WebMvcTest(UserController.class) @AutoConfigureRestDocs class MyUserDocumentationTests { @Autowired private MockMvc mvc; @Test void listUsers() throws Exception { this.mvc.perform(get("/users").accept(MediaType.TEXT\_PLAIN)) .andExpect(status().isOk()) .andDo(document("list-users")); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest import org.springframework.http.MediaType import org.springframework.restdocs.mockmvc.MockMvcRestDocumentation import org.springframework.test.web.servlet.MockMvc import org.springframework.test.web.servlet.request.MockMvcRequestBuilders import org.springframework.test.web.servlet.result.MockMvcResultMatchers @WebMvcTest(UserController::class) @AutoConfigureRestDocs class MyUserDocumentationTests(@Autowired val mvc: MockMvc) { @Test fun listUsers() { mvc.perform(MockMvcRequestBuilders.get("/users").accept(MediaType.TEXT\_PLAIN)) .andExpect(MockMvcResultMatchers.status().isOk) .andDo(MockMvcRestDocumentation.document("list-users")) } } ``` If you require more control over Spring REST Docs configuration than offered by the attributes of `@AutoConfigureRestDocs`, you can use a `RestDocsMockMvcConfigurationCustomizer` bean, as shown in the following example: Java ``` import org.springframework.boot.test.autoconfigure.restdocs.RestDocsMockMvcConfigurationCustomizer; import org.springframework.boot.test.context.TestConfiguration; import org.springframework.restdocs.mockmvc.MockMvcRestDocumentationConfigurer; import org.springframework.restdocs.templates.TemplateFormats; @TestConfiguration(proxyBeanMethods = false) public class MyRestDocsConfiguration implements RestDocsMockMvcConfigurationCustomizer { @Override public void customize(MockMvcRestDocumentationConfigurer configurer) { configurer.snippets().withTemplateFormat(TemplateFormats.markdown()); } } ``` Kotlin ``` import org.springframework.boot.test.autoconfigure.restdocs.RestDocsMockMvcConfigurationCustomizer import org.springframework.boot.test.context.TestConfiguration import org.springframework.restdocs.mockmvc.MockMvcRestDocumentationConfigurer import org.springframework.restdocs.templates.TemplateFormats @TestConfiguration(proxyBeanMethods = false) class MyRestDocsConfiguration : RestDocsMockMvcConfigurationCustomizer { override fun customize(configurer: MockMvcRestDocumentationConfigurer) { configurer.snippets().withTemplateFormat(TemplateFormats.markdown()) } } ``` If you want to make use of Spring REST Docs support for a parameterized output directory, you can create a `RestDocumentationResultHandler` bean. 
The auto-configuration calls `alwaysDo` with this result handler, thereby causing each `MockMvc` call to automatically generate the default snippets. The following example shows a `RestDocumentationResultHandler` being defined: Java ``` import org.springframework.boot.test.context.TestConfiguration; import org.springframework.context.annotation.Bean; import org.springframework.restdocs.mockmvc.MockMvcRestDocumentation; import org.springframework.restdocs.mockmvc.RestDocumentationResultHandler; @TestConfiguration(proxyBeanMethods = false) public class MyResultHandlerConfiguration { @Bean public RestDocumentationResultHandler restDocumentation() { return MockMvcRestDocumentation.document("{method-name}"); } } ``` Kotlin ``` import org.springframework.boot.test.context.TestConfiguration import org.springframework.context.annotation.Bean import org.springframework.restdocs.mockmvc.MockMvcRestDocumentation import org.springframework.restdocs.mockmvc.RestDocumentationResultHandler @TestConfiguration(proxyBeanMethods = false) class MyResultHandlerConfiguration { @Bean fun restDocumentation(): RestDocumentationResultHandler { return MockMvcRestDocumentation.document("{method-name}") } } ``` ##### Auto-configured Spring REST Docs Tests with WebTestClient `@AutoConfigureRestDocs` can also be used with `WebTestClient` when testing reactive web applications. You can inject it by using `@Autowired` and use it in your tests as you normally would when using `@WebFluxTest` and Spring REST Docs, as shown in the following example: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs; import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest; import org.springframework.test.web.reactive.server.WebTestClient; import static org.springframework.restdocs.webtestclient.WebTestClientRestDocumentation.document; @WebFluxTest @AutoConfigureRestDocs class MyUsersDocumentationTests { @Autowired private WebTestClient webTestClient; @Test void listUsers() { this.webTestClient .get().uri("/") .exchange() .expectStatus() .isOk() .expectBody() .consumeWith(document("list-users")); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest import org.springframework.restdocs.webtestclient.WebTestClientRestDocumentation import org.springframework.test.web.reactive.server.WebTestClient @WebFluxTest @AutoConfigureRestDocs class MyUsersDocumentationTests(@Autowired val webTestClient: WebTestClient) { @Test fun listUsers() { webTestClient .get().uri("/") .exchange() .expectStatus() .isOk .expectBody() .consumeWith(WebTestClientRestDocumentation.document("list-users")) } } ``` If you require more control over Spring REST Docs configuration than offered by the attributes of `@AutoConfigureRestDocs`, you can use a `RestDocsWebTestClientConfigurationCustomizer` bean, as shown in the following example: Java ``` import org.springframework.boot.test.autoconfigure.restdocs.RestDocsWebTestClientConfigurationCustomizer; import org.springframework.boot.test.context.TestConfiguration; import org.springframework.restdocs.webtestclient.WebTestClientRestDocumentationConfigurer; @TestConfiguration(proxyBeanMethods = false) public class MyRestDocsConfiguration implements 
RestDocsWebTestClientConfigurationCustomizer { @Override public void customize(WebTestClientRestDocumentationConfigurer configurer) { configurer.snippets().withEncoding("UTF-8"); } } ``` Kotlin ``` import org.springframework.boot.test.autoconfigure.restdocs.RestDocsWebTestClientConfigurationCustomizer import org.springframework.boot.test.context.TestConfiguration import org.springframework.restdocs.webtestclient.WebTestClientRestDocumentationConfigurer @TestConfiguration(proxyBeanMethods = false) class MyRestDocsConfiguration : RestDocsWebTestClientConfigurationCustomizer { override fun customize(configurer: WebTestClientRestDocumentationConfigurer) { configurer.snippets().withEncoding("UTF-8") } } ``` If you want to make use of Spring REST Docs support for a parameterized output directory, you can use a `WebTestClientBuilderCustomizer` to configure a consumer for every entity exchange result. The following example shows such a `WebTestClientBuilderCustomizer` being defined: Java ``` import org.springframework.boot.test.context.TestConfiguration; import org.springframework.boot.test.web.reactive.server.WebTestClientBuilderCustomizer; import org.springframework.context.annotation.Bean; import static org.springframework.restdocs.webtestclient.WebTestClientRestDocumentation.document; @TestConfiguration(proxyBeanMethods = false) public class MyWebTestClientBuilderCustomizerConfiguration { @Bean public WebTestClientBuilderCustomizer restDocumentation() { return (builder) -> builder.entityExchangeResultConsumer(document("{method-name}")); } } ``` Kotlin ``` import org.springframework.boot.test.context.TestConfiguration import org.springframework.boot.test.web.reactive.server.WebTestClientBuilderCustomizer import org.springframework.context.annotation.Bean import org.springframework.restdocs.webtestclient.WebTestClientRestDocumentation import org.springframework.test.web.reactive.server.WebTestClient @TestConfiguration(proxyBeanMethods = false) class MyWebTestClientBuilderCustomizerConfiguration { @Bean fun restDocumentation(): WebTestClientBuilderCustomizer { return WebTestClientBuilderCustomizer { builder: WebTestClient.Builder -> builder.entityExchangeResultConsumer( WebTestClientRestDocumentation.document("{method-name}") ) } } } ``` ##### Auto-configured Spring REST Docs Tests with REST Assured `@AutoConfigureRestDocs` makes a `RequestSpecification` bean, preconfigured to use Spring REST Docs, available to your tests. 
You can inject it by using `@Autowired` and use it in your tests as you normally would when using REST Assured and Spring REST Docs, as shown in the following example: Java ``` import io.restassured.specification.RequestSpecification; import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.boot.test.context.SpringBootTest.WebEnvironment; import org.springframework.boot.test.web.server.LocalServerPort; import static io.restassured.RestAssured.given; import static org.hamcrest.Matchers.is; import static org.springframework.restdocs.restassured3.RestAssuredRestDocumentation.document; @SpringBootTest(webEnvironment = WebEnvironment.RANDOM\_PORT) @AutoConfigureRestDocs class MyUserDocumentationTests { @Test void listUsers(@Autowired RequestSpecification documentationSpec, @LocalServerPort int port) { given(documentationSpec) .filter(document("list-users")) .when() .port(port) .get("/") .then().assertThat() .statusCode(is(200)); } } ``` Kotlin ``` import io.restassured.RestAssured import io.restassured.specification.RequestSpecification import org.hamcrest.Matchers import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs import org.springframework.boot.test.context.SpringBootTest import org.springframework.boot.test.context.SpringBootTest.WebEnvironment import org.springframework.boot.test.web.server.LocalServerPort import org.springframework.restdocs.restassured3.RestAssuredRestDocumentation @SpringBootTest(webEnvironment = WebEnvironment.RANDOM\_PORT) @AutoConfigureRestDocs class MyUserDocumentationTests { @Test fun listUsers(@Autowired documentationSpec: RequestSpecification?, @LocalServerPort port: Int) { RestAssured.given(documentationSpec) .filter(RestAssuredRestDocumentation.document("list-users")) .`when`() .port(port)["/"] .then().assertThat() .statusCode(Matchers.`is`(200)) } } ``` If you require more control over Spring REST Docs configuration than offered by the attributes of `@AutoConfigureRestDocs`, a `RestDocsRestAssuredConfigurationCustomizer` bean can be used, as shown in the following example: Java ``` import org.springframework.boot.test.autoconfigure.restdocs.RestDocsRestAssuredConfigurationCustomizer; import org.springframework.boot.test.context.TestConfiguration; import org.springframework.restdocs.restassured3.RestAssuredRestDocumentationConfigurer; import org.springframework.restdocs.templates.TemplateFormats; @TestConfiguration(proxyBeanMethods = false) public class MyRestDocsConfiguration implements RestDocsRestAssuredConfigurationCustomizer { @Override public void customize(RestAssuredRestDocumentationConfigurer configurer) { configurer.snippets().withTemplateFormat(TemplateFormats.markdown()); } } ``` Kotlin ``` import org.springframework.boot.test.autoconfigure.restdocs.RestDocsRestAssuredConfigurationCustomizer import org.springframework.boot.test.context.TestConfiguration import org.springframework.restdocs.restassured3.RestAssuredRestDocumentationConfigurer import org.springframework.restdocs.templates.TemplateFormats @TestConfiguration(proxyBeanMethods = false) class MyRestDocsConfiguration : RestDocsRestAssuredConfigurationCustomizer { override fun customize(configurer: RestAssuredRestDocumentationConfigurer) { 
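// Generate snippets with the Markdown template format rather than the default Asciidoctor format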
configurer.snippets().withTemplateFormat(TemplateFormats.markdown()) } } ``` #### 8.3.29. Auto-configured Spring Web Services Tests ##### Auto-configured Spring Web Services Client Tests You can use `@WebServiceClientTest` to test applications that call web services using the Spring Web Services project. By default, it configures a mock `WebServiceServer` bean and automatically customizes your `WebServiceTemplateBuilder`. (For more about using Web Services with Spring Boot, see "[io.html](io#io.webservices)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@WebServiceClientTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). | The following example shows the `@WebServiceClientTest` annotation in use: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.webservices.client.WebServiceClientTest; import org.springframework.ws.test.client.MockWebServiceServer; import org.springframework.xml.transform.StringSource; import static org.assertj.core.api.Assertions.assertThat; import static org.springframework.ws.test.client.RequestMatchers.payload; import static org.springframework.ws.test.client.ResponseCreators.withPayload; @WebServiceClientTest(SomeWebService.class) class MyWebServiceClientTests { @Autowired private MockWebServiceServer server; @Autowired private SomeWebService someWebService; @Test void mockServerCall() { this.server .expect(payload(new StringSource("<request/>"))) .andRespond(withPayload(new StringSource("<response><status>200</status></response>"))); assertThat(this.someWebService.test()) .extracting(Response::getStatus) .isEqualTo(200); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.webservices.client.WebServiceClientTest import org.springframework.ws.test.client.MockWebServiceServer import org.springframework.ws.test.client.RequestMatchers import org.springframework.ws.test.client.ResponseCreators import org.springframework.xml.transform.StringSource @WebServiceClientTest(SomeWebService::class) class MyWebServiceClientTests(@Autowired val server: MockWebServiceServer, @Autowired val someWebService: SomeWebService) { @Test fun mockServerCall() { server .expect(RequestMatchers.payload(StringSource("<request/>"))) .andRespond(ResponseCreators.withPayload(StringSource("<response><status>200</status></response>"))) assertThat(this.someWebService.test()).extracting(Response::status).isEqualTo(200) } } ``` ##### Auto-configured Spring Web Services Server Tests You can use `@WebServiceServerTest` to test applications that implement web services using the Spring Web Services project. By default, it configures a `MockWebServiceClient` bean that can be used to call your web service endpoints. (For more about using Web Services with Spring Boot, see "[io.html](io#io.webservices)", earlier in this chapter.) | | | | --- | --- | | | A list of the auto-configuration settings that are enabled by `@WebServiceServerTest` can be [found in the appendix](test-auto-configuration#appendix.test-auto-configuration). 
| The following example shows the `@WebServiceServerTest` annotation in use: Java ``` import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.webservices.server.WebServiceServerTest; import org.springframework.ws.test.server.MockWebServiceClient; import org.springframework.ws.test.server.RequestCreators; import org.springframework.ws.test.server.ResponseMatchers; import org.springframework.xml.transform.StringSource; @WebServiceServerTest(ExampleEndpoint.class) class MyWebServiceServerTests { @Autowired private MockWebServiceClient client; @Test void mockServerCall() { this.client .sendRequest(RequestCreators.withPayload(new StringSource("<ExampleRequest/>"))) .andExpect(ResponseMatchers.payload(new StringSource("<ExampleResponse>42</ExampleResponse>"))); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.autoconfigure.webservices.server.WebServiceServerTest import org.springframework.ws.test.server.MockWebServiceClient import org.springframework.ws.test.server.RequestCreators import org.springframework.ws.test.server.ResponseMatchers import org.springframework.xml.transform.StringSource @WebServiceServerTest(ExampleEndpoint::class) class MyWebServiceServerTests(@Autowired val client: MockWebServiceClient) { @Test fun mockServerCall() { client .sendRequest(RequestCreators.withPayload(StringSource("<ExampleRequest/>"))) .andExpect(ResponseMatchers.payload(StringSource("<ExampleResponse>42</ExampleResponse>"))) } } ``` #### 8.3.30. Additional Auto-configuration and Slicing Each slice provides one or more `@AutoConfigure…​` annotations that namely defines the auto-configurations that should be included as part of a slice. Additional auto-configurations can be added on a test-by-test basis by creating a custom `@AutoConfigure…​` annotation or by adding `@ImportAutoConfiguration` to the test as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.ImportAutoConfiguration; import org.springframework.boot.autoconfigure.integration.IntegrationAutoConfiguration; import org.springframework.boot.test.autoconfigure.jdbc.JdbcTest; @JdbcTest @ImportAutoConfiguration(IntegrationAutoConfiguration.class) class MyJdbcTests { } ``` Kotlin ``` import org.springframework.boot.autoconfigure.ImportAutoConfiguration import org.springframework.boot.autoconfigure.integration.IntegrationAutoConfiguration import org.springframework.boot.test.autoconfigure.jdbc.JdbcTest @JdbcTest @ImportAutoConfiguration(IntegrationAutoConfiguration::class) class MyJdbcTests ``` | | | | --- | --- | | | Make sure to not use the regular `@Import` annotation to import auto-configurations as they are handled in a specific way by Spring Boot. | Alternatively, additional auto-configurations can be added for any use of a slice annotation by registering them in a file stored in `META-INF/spring` as shown in the following example: META-INF/spring/org.springframework.boot.test.autoconfigure.jdbc.JdbcTest.imports ``` com.example.IntegrationAutoConfiguration ``` In this example, the `com.example.IntegrationAutoConfiguration` is enabled on every test annotated with `@JdbcTest`. | | | | --- | --- | | | You can use comments via `#` in this file. | | | | | --- | --- | | | A slice or `@AutoConfigure…​` annotation can be customized this way as long as it is meta-annotated with `@ImportAutoConfiguration`. | #### 8.3.31. 
User Configuration and Slicing If you [structure your code](using#using.structuring-your-code) in a sensible way, your `@SpringBootApplication` class is [used by default](#features.testing.spring-boot-applications.detecting-configuration) as the configuration of your tests. It then becomes important not to litter the application’s main class with configuration settings that are specific to a particular area of its functionality. Assume that you are using Spring Batch and you rely on the auto-configuration for it. You could define your `@SpringBootApplication` as follows: Java ``` import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing; import org.springframework.boot.autoconfigure.SpringBootApplication; @SpringBootApplication @EnableBatchProcessing public class MyApplication { // ... } ``` Kotlin ``` import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing import org.springframework.boot.autoconfigure.SpringBootApplication @SpringBootApplication @EnableBatchProcessing class MyApplication { // ... } ``` Because this class is the source configuration for the test, any slice test actually tries to start Spring Batch, which is definitely not what you want to do. A recommended approach is to move that area-specific configuration to a separate `@Configuration` class at the same level as your application, as shown in the following example: Java ``` import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) @EnableBatchProcessing public class MyBatchConfiguration { // ... } ``` Kotlin ``` import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) @EnableBatchProcessing class MyBatchConfiguration { // ... } ``` | | | | --- | --- | | | Depending on the complexity of your application, you may either have a single `@Configuration` class for your customizations or one class per domain area. The latter approach lets you enable it in one of your tests, if necessary, with the `@Import` annotation. See [this how-to section](howto#howto.testing.slice-tests) for more details on when you might want to enable specific `@Configuration` classes for slice tests. | Test slices exclude `@Configuration` classes from scanning. For example, for a `@WebMvcTest`, the following configuration will not include the given `WebMvcConfigurer` bean in the application context loaded by the test slice: Java ``` import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.web.servlet.config.annotation.WebMvcConfigurer; @Configuration(proxyBeanMethods = false) public class MyWebConfiguration { @Bean public WebMvcConfigurer testConfigurer() { return new WebMvcConfigurer() { // ... }; } } ``` Kotlin ``` import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.web.servlet.config.annotation.WebMvcConfigurer @Configuration(proxyBeanMethods = false) class MyWebConfiguration { @Bean fun testConfigurer(): WebMvcConfigurer { return object : WebMvcConfigurer { // ... } } } ``` The configuration below will, however, cause the custom `WebMvcConfigurer` to be loaded by the test slice. 
Java ``` import org.springframework.stereotype.Component; import org.springframework.web.servlet.config.annotation.WebMvcConfigurer; @Component public class MyWebMvcConfigurer implements WebMvcConfigurer { // ... } ``` Kotlin ``` import org.springframework.stereotype.Component import org.springframework.web.servlet.config.annotation.WebMvcConfigurer @Component class MyWebMvcConfigurer : WebMvcConfigurer { // ... } ``` Another source of confusion is classpath scanning. Assume that, while you structured your code in a sensible way, you need to scan an additional package. Your application may resemble the following code: Java ``` import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.context.annotation.ComponentScan; @SpringBootApplication @ComponentScan({ "com.example.app", "com.example.another" }) public class MyApplication { // ... } ``` Kotlin ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.context.annotation.ComponentScan @SpringBootApplication @ComponentScan("com.example.app", "com.example.another") class MyApplication { // ... } ``` Doing so effectively overrides the default component scan directive with the side effect of scanning those two packages regardless of the slice that you chose. For instance, a `@DataJpaTest` seems to suddenly scan components and user configurations of your application. Again, moving the custom directive to a separate class is a good way to fix this issue. | | | | --- | --- | | | If this is not an option for you, you can create a `@SpringBootConfiguration` somewhere in the hierarchy of your test so that it is used instead. Alternatively, you can specify a source for your test, which disables the behavior of finding a default one. | #### 8.3.32. Using Spock to Test Spring Boot Applications Spock 2.x can be used to test a Spring Boot application. To do so, add a dependency on Spock’s `spock-spring` module to your application’s build. `spock-spring` integrates Spring’s test framework into Spock. See [the documentation for Spock’s Spring module](https://spockframework.org/spock/docs/2.0/modules.html#_spring_module) for further details. ### 8.4. Test Utilities A few test utility classes that are generally useful when testing your application are packaged as part of `spring-boot`. #### 8.4.1. ConfigDataApplicationContextInitializer `ConfigDataApplicationContextInitializer` is an `ApplicationContextInitializer` that you can apply to your tests to load Spring Boot `application.properties` files. You can use it when you do not need the full set of features provided by `@SpringBootTest`, as shown in the following example: Java ``` import org.springframework.boot.test.context.ConfigDataApplicationContextInitializer; import org.springframework.test.context.ContextConfiguration; @ContextConfiguration(classes = Config.class, initializers = ConfigDataApplicationContextInitializer.class) class MyConfigFileTests { // ... } ``` Kotlin ``` import org.springframework.boot.test.context.ConfigDataApplicationContextInitializer import org.springframework.test.context.ContextConfiguration @ContextConfiguration(classes = [Config::class], initializers = [ConfigDataApplicationContextInitializer::class]) class MyConfigFileTests { // ... } ``` | | | | --- | --- | | | Using `ConfigDataApplicationContextInitializer` alone does not provide support for `@Value("${…​}")` injection. Its only job is to ensure that `application.properties` files are loaded into Spring’s `Environment`. 
For `@Value` support, you need to either additionally configure a `PropertySourcesPlaceholderConfigurer` or use `@SpringBootTest`, which auto-configures one for you. | #### 8.4.2. TestPropertyValues `TestPropertyValues` lets you quickly add properties to a `ConfigurableEnvironment` or `ConfigurableApplicationContext`. You can call it with `key=value` strings, as follows: Java ``` import org.junit.jupiter.api.Test; import org.springframework.boot.test.util.TestPropertyValues; import org.springframework.mock.env.MockEnvironment; import static org.assertj.core.api.Assertions.assertThat; class MyEnvironmentTests { @Test void testPropertySources() { MockEnvironment environment = new MockEnvironment(); TestPropertyValues.of("org=Spring", "name=Boot").applyTo(environment); assertThat(environment.getProperty("name")).isEqualTo("Boot"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.boot.test.util.TestPropertyValues import org.springframework.mock.env.MockEnvironment class MyEnvironmentTests { @Test fun testPropertySources() { val environment = MockEnvironment() TestPropertyValues.of("org=Spring", "name=Boot").applyTo(environment) assertThat(environment.getProperty("name")).isEqualTo("Boot") } } ``` #### 8.4.3. OutputCapture `OutputCapture` is a JUnit `Extension` that you can use to capture `System.out` and `System.err` output. To use add `@ExtendWith(OutputCaptureExtension.class)` and inject `CapturedOutput` as an argument to your test class constructor or test method as follows: Java ``` import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.springframework.boot.test.system.CapturedOutput; import org.springframework.boot.test.system.OutputCaptureExtension; import static org.assertj.core.api.Assertions.assertThat; @ExtendWith(OutputCaptureExtension.class) class MyOutputCaptureTests { @Test void testName(CapturedOutput output) { System.out.println("Hello World!"); assertThat(output).contains("World"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.junit.jupiter.api.extension.ExtendWith import org.springframework.boot.test.system.CapturedOutput import org.springframework.boot.test.system.OutputCaptureExtension @ExtendWith(OutputCaptureExtension::class) class MyOutputCaptureTests { @Test fun testName(output: CapturedOutput?) { println("Hello World!") assertThat(output).contains("World") } } ``` #### 8.4.4. TestRestTemplate `TestRestTemplate` is a convenience alternative to Spring’s `RestTemplate` that is useful in integration tests. You can get a vanilla template or one that sends Basic HTTP authentication (with a username and password). In either case, the template is fault tolerant. This means that it behaves in a test-friendly way by not throwing exceptions on 4xx and 5xx errors. Instead, such errors can be detected through the returned `ResponseEntity` and its status code. | | | | --- | --- | | | Spring Framework 5.0 provides a new `WebTestClient` that works for [WebFlux integration tests](#features.testing.spring-boot-applications.spring-webflux-tests) and both [WebFlux and MVC end-to-end testing](#features.testing.spring-boot-applications.with-running-server). It provides a fluent API for assertions, unlike `TestRestTemplate`. | It is recommended, but not mandatory, to use the Apache HTTP Client (version 4.3.2 or better). 
If you have that on your classpath, the `TestRestTemplate` responds by configuring the client appropriately. If you do use Apache’s HTTP client, some additional test-friendly features are enabled: * Redirects are not followed (so you can assert the response location). * Cookies are ignored (so the template is stateless). `TestRestTemplate` can be instantiated directly in your integration tests, as shown in the following example: Java ``` import org.junit.jupiter.api.Test; import org.springframework.boot.test.web.client.TestRestTemplate; import org.springframework.http.ResponseEntity; import static org.assertj.core.api.Assertions.assertThat; class MyTests { private final TestRestTemplate template = new TestRestTemplate(); @Test void testRequest() throws Exception { ResponseEntity<String> headers = this.template.getForEntity("https://myhost.example.com/example", String.class); assertThat(headers.getHeaders().getLocation()).hasHost("other.example.com"); } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.boot.test.web.client.TestRestTemplate class MyTests { private val template = TestRestTemplate() @Test fun testRequest() { val headers = template.getForEntity("https://myhost.example.com/example", String::class.java) assertThat(headers.headers.location).hasHost("other.example.com") } } ``` Alternatively, if you use the `@SpringBootTest` annotation with `WebEnvironment.RANDOM_PORT` or `WebEnvironment.DEFINED_PORT`, you can inject a fully configured `TestRestTemplate` and start using it. If necessary, additional customizations can be applied through the `RestTemplateBuilder` bean. Any URLs that do not specify a host and port automatically connect to the embedded server, as shown in the following example: Java ``` import java.time.Duration; import org.junit.jupiter.api.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.boot.test.context.SpringBootTest.WebEnvironment; import org.springframework.boot.test.context.TestConfiguration; import org.springframework.boot.test.web.client.TestRestTemplate; import org.springframework.boot.web.client.RestTemplateBuilder; import org.springframework.context.annotation.Bean; import org.springframework.http.HttpHeaders; import static org.assertj.core.api.Assertions.assertThat; @SpringBootTest(webEnvironment = WebEnvironment.RANDOM\_PORT) class MySpringBootTests { @Autowired private TestRestTemplate template; @Test void testRequest() { HttpHeaders headers = this.template.getForEntity("/example", String.class).getHeaders(); assertThat(headers.getLocation()).hasHost("other.example.com"); } @TestConfiguration(proxyBeanMethods = false) static class RestTemplateBuilderConfiguration { @Bean RestTemplateBuilder restTemplateBuilder() { return new RestTemplateBuilder().setConnectTimeout(Duration.ofSeconds(1)) .setReadTimeout(Duration.ofSeconds(1)); } } } ``` Kotlin ``` import org.assertj.core.api.Assertions.assertThat import org.junit.jupiter.api.Test import org.springframework.beans.factory.annotation.Autowired import org.springframework.boot.test.context.SpringBootTest import org.springframework.boot.test.context.SpringBootTest.WebEnvironment import org.springframework.boot.test.context.TestConfiguration import org.springframework.boot.test.web.client.TestRestTemplate import org.springframework.boot.web.client.RestTemplateBuilder import org.springframework.context.annotation.Bean import 
java.time.Duration @SpringBootTest(webEnvironment = WebEnvironment.RANDOM\_PORT) class MySpringBootTests(@Autowired val template: TestRestTemplate) { @Test fun testRequest() { val headers = template.getForEntity("/example", String::class.java).headers assertThat(headers.location).hasHost("other.example.com") } @TestConfiguration(proxyBeanMethods = false) internal class RestTemplateBuilderConfiguration { @Bean fun restTemplateBuilder(): RestTemplateBuilder { return RestTemplateBuilder().setConnectTimeout(Duration.ofSeconds(1)) .setReadTimeout(Duration.ofSeconds(1)) } } } ``` 9. Creating Your Own Auto-configuration ---------------------------------------- If you work in a company that develops shared libraries, or if you work on an open-source or commercial library, you might want to develop your own auto-configuration. Auto-configuration classes can be bundled in external jars and still be picked-up by Spring Boot. Auto-configuration can be associated to a “starter” that provides the auto-configuration code as well as the typical libraries that you would use with it. We first cover what you need to know to build your own auto-configuration and then we move on to the [typical steps required to create a custom starter](#features.developing-auto-configuration.custom-starter). | | | | --- | --- | | | A [demo project](https://github.com/snicoll-demos/spring-boot-master-auto-configuration) is available to showcase how you can create a starter step-by-step. | ### 9.1. Understanding Auto-configured Beans Under the hood, auto-configuration is implemented with the `@AutoConfiguration` annotation. This annotation itself is meta-annotated with `@Configuration`, making auto-configurations standard `@Configuration` classes. Additional `@Conditional` annotations are used to constrain when the auto-configuration should apply. Usually, auto-configuration classes use `@ConditionalOnClass` and `@ConditionalOnMissingBean` annotations. This ensures that auto-configuration applies only when relevant classes are found and when you have not declared your own `@Configuration`. You can browse the source code of [`spring-boot-autoconfigure`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure) to see the `@Configuration` classes that Spring provides (see the [`META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/resources/META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports) file). ### 9.2. Locating Auto-configuration Candidates Spring Boot checks for the presence of a `META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports` file within your published jar. The file should list your configuration classes, as shown in the following example: ``` com.mycorp.libx.autoconfigure.LibXAutoConfiguration com.mycorp.libx.autoconfigure.LibXWebAutoConfiguration ``` | | | | --- | --- | | | You can use comments via `#` in this file. | | | | | --- | --- | | | Auto-configurations must be loaded that way *only*. Make sure that they are defined in a specific package space and that they are never the target of component scanning. Furthermore, auto-configuration classes should not enable component scanning to find additional components. Specific `@Import`s should be used instead. 
| You can use the [`@AutoConfigureAfter`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/AutoConfigureAfter.java) or [`@AutoConfigureBefore`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/AutoConfigureBefore.java) annotations if your configuration needs to be applied in a specific order. For example, if you provide web-specific configuration, your class may need to be applied after `WebMvcAutoConfiguration`. | | | | --- | --- | | | If you are using the [`@AutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/AutoConfiguration.java) annotation, you can use the `before`, `beforeName`, `after` and `afterName` attribute aliases instead of the dedicated annotations. | If you want to order certain auto-configurations that should not have any direct knowledge of each other, you can also use `@AutoConfigureOrder`. That annotation has the same semantic as the regular `@Order` annotation but provides a dedicated order for auto-configuration classes. As with standard `@Configuration` classes, the order in which auto-configuration classes are applied only affects the order in which their beans are defined. The order in which those beans are subsequently created is unaffected and is determined by each bean’s dependencies and any `@DependsOn` relationships. ### 9.3. Condition Annotations You almost always want to include one or more `@Conditional` annotations on your auto-configuration class. The `@ConditionalOnMissingBean` annotation is one common example that is used to allow developers to override auto-configuration if they are not happy with your defaults. Spring Boot includes a number of `@Conditional` annotations that you can reuse in your own code by annotating `@Configuration` classes or individual `@Bean` methods. These annotations include: * [Class Conditions](#features.developing-auto-configuration.condition-annotations.class-conditions) * [Bean Conditions](#features.developing-auto-configuration.condition-annotations.bean-conditions) * [Property Conditions](#features.developing-auto-configuration.condition-annotations.property-conditions) * [Resource Conditions](#features.developing-auto-configuration.condition-annotations.resource-conditions) * [Web Application Conditions](#features.developing-auto-configuration.condition-annotations.web-application-conditions) * [SpEL Expression Conditions](#features.developing-auto-configuration.condition-annotations.spel-conditions) #### 9.3.1. Class Conditions The `@ConditionalOnClass` and `@ConditionalOnMissingClass` annotations let `@Configuration` classes be included based on the presence or absence of specific classes. Due to the fact that annotation metadata is parsed by using [ASM](https://asm.ow2.io/), you can use the `value` attribute to refer to the real class, even though that class might not actually appear on the running application classpath. You can also use the `name` attribute if you prefer to specify the class name by using a `String` value. 
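For example, the following sketch (the `com.example.client.SomeClient` class name is purely illustrative, not an API from Spring Boot) guards an entire configuration class on a class that may be absent at runtime:

Java

```
import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
import org.springframework.context.annotation.Configuration;

@Configuration(proxyBeanMethods = false)
@ConditionalOnClass(name = "com.example.client.SomeClient")
public class MySomeClientConfiguration {

    // Bean definitions in this class are only processed when
    // com.example.client.SomeClient is present on the application classpath.

}
```

Because the `name` attribute is used here, the configuration class never references `SomeClient` directly, so it can be compiled and loaded even when the library is missing.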
This mechanism does not apply the same way to `@Bean` methods where typically the return type is the target of the condition: before the condition on the method applies, the JVM will have loaded the class and potentially processed method references which will fail if the class is not present. To handle this scenario, a separate `@Configuration` class can be used to isolate the condition, as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.AutoConfiguration; import org.springframework.boot.autoconfigure.condition.ConditionalOnClass; import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @AutoConfiguration // Some conditions ... public class MyAutoConfiguration { // Auto-configured beans ... @Configuration(proxyBeanMethods = false) @ConditionalOnClass(SomeService.class) public static class SomeServiceConfiguration { @Bean @ConditionalOnMissingBean public SomeService someService() { return new SomeService(); } } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.condition.ConditionalOnClass import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) // Some conditions ... class MyAutoConfiguration { // Auto-configured beans ... @Configuration(proxyBeanMethods = false) @ConditionalOnClass(SomeService::class) class SomeServiceConfiguration { @Bean @ConditionalOnMissingBean fun someService(): SomeService { return SomeService() } } } ``` | | | | --- | --- | | | If you use `@ConditionalOnClass` or `@ConditionalOnMissingClass` as a part of a meta-annotation to compose your own composed annotations, you must use `name` as referring to the class in such a case is not handled. | #### 9.3.2. Bean Conditions The `@ConditionalOnBean` and `@ConditionalOnMissingBean` annotations let a bean be included based on the presence or absence of specific beans. You can use the `value` attribute to specify beans by type or `name` to specify beans by name. The `search` attribute lets you limit the `ApplicationContext` hierarchy that should be considered when searching for beans. When placed on a `@Bean` method, the target type defaults to the return type of the method, as shown in the following example: Java ``` import org.springframework.boot.autoconfigure.AutoConfiguration; import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean; import org.springframework.context.annotation.Bean; @AutoConfiguration public class MyAutoConfiguration { @Bean @ConditionalOnMissingBean public SomeService someService() { return new SomeService(); } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyAutoConfiguration { @Bean @ConditionalOnMissingBean fun someService(): SomeService { return SomeService() } } ``` In the preceding example, the `someService` bean is going to be created if no bean of type `SomeService` is already contained in the `ApplicationContext`. | | | | --- | --- | | | You need to be very careful about the order in which bean definitions are added, as these conditions are evaluated based on what has been processed so far. 
For this reason, we recommend using only `@ConditionalOnBean` and `@ConditionalOnMissingBean` annotations on auto-configuration classes (since these are guaranteed to load after any user-defined bean definitions have been added). | | | | --- | --- | | | `@ConditionalOnBean` and `@ConditionalOnMissingBean` do not prevent `@Configuration` classes from being created. The only difference between using these conditions at the class level and marking each contained `@Bean` method with the annotation is that the former prevents registration of the `@Configuration` class as a bean if the condition does not match. | | | | | --- | --- | | | When declaring a `@Bean` method, provide as much type information as possible in the method’s return type. For example, if your bean’s concrete class implements an interface, the bean method’s return type should be the concrete class and not the interface. Providing as much type information as possible in `@Bean` methods is particularly important when using bean conditions, as their evaluation can only rely upon the type information that is available in the method signature. | #### 9.3.3. Property Conditions The `@ConditionalOnProperty` annotation lets configuration be included based on a Spring Environment property. Use the `prefix` and `name` attributes to specify the property that should be checked. By default, any property that exists and is not equal to `false` is matched. You can also create more advanced checks by using the `havingValue` and `matchIfMissing` attributes. #### 9.3.4. Resource Conditions The `@ConditionalOnResource` annotation lets configuration be included only when a specific resource is present. Resources can be specified by using the usual Spring conventions, as shown in the following example: `file:/home/user/test.dat`. #### 9.3.5. Web Application Conditions The `@ConditionalOnWebApplication` and `@ConditionalOnNotWebApplication` annotations let configuration be included depending on whether the application is a “web application”. A servlet-based web application is any application that uses a Spring `WebApplicationContext`, defines a `session` scope, or has a `ConfigurableWebEnvironment`. A reactive web application is any application that uses a `ReactiveWebApplicationContext` or has a `ConfigurableReactiveWebEnvironment`. The `@ConditionalOnWarDeployment` annotation lets configuration be included depending on whether the application is a traditional WAR application that is deployed to a container. This condition will not match for applications that are run with an embedded server. #### 9.3.6. SpEL Expression Conditions The `@ConditionalOnExpression` annotation lets configuration be included based on the result of a [SpEL expression](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/core.html#expressions). | | | | --- | --- | | | Referencing a bean in the expression will cause that bean to be initialized very early in context refresh processing. As a result, the bean won’t be eligible for post-processing (such as configuration properties binding) and its state may be incomplete. | ### 9.4. Testing your Auto-configuration An auto-configuration can be affected by many factors: user configuration (`@Bean` definition and `Environment` customization), condition evaluation (presence of a particular library), and others. Concretely, each test should create a well-defined `ApplicationContext` that represents a combination of those customizations. `ApplicationContextRunner` provides a great way to achieve that.
`ApplicationContextRunner` is usually defined as a field of the test class to gather the base, common configuration. The following example makes sure that `MyServiceAutoConfiguration` is always invoked: Java ``` private final ApplicationContextRunner contextRunner = new ApplicationContextRunner() .withConfiguration(AutoConfigurations.of(MyServiceAutoConfiguration.class)); ``` Kotlin ``` val contextRunner = ApplicationContextRunner() .withConfiguration(AutoConfigurations.of(MyServiceAutoConfiguration::class.java)) ``` | | | | --- | --- | | | If multiple auto-configurations have to be defined, there is no need to order their declarations as they are invoked in the exact same order as when running the application. | Each test can use the runner to represent a particular use case. For instance, the sample below invokes a user configuration (`UserConfiguration`) and checks that the auto-configuration backs off properly. Invoking `run` provides a callback context that can be used with `AssertJ`. Java ``` @Test void defaultServiceBacksOff() { this.contextRunner.withUserConfiguration(UserConfiguration.class).run((context) -> { assertThat(context).hasSingleBean(MyService.class); assertThat(context).getBean("myCustomService").isSameAs(context.getBean(MyService.class)); }); } @Configuration(proxyBeanMethods = false) static class UserConfiguration { @Bean MyService myCustomService() { return new MyService("mine"); } } ``` Kotlin ``` @Test fun defaultServiceBacksOff() { contextRunner.withUserConfiguration(UserConfiguration::class.java) .run { context: AssertableApplicationContext -> assertThat(context).hasSingleBean(MyService::class.java) assertThat(context).getBean("myCustomService") .isSameAs(context.getBean(MyService::class.java)) } } @Configuration(proxyBeanMethods = false) internal class UserConfiguration { @Bean fun myCustomService(): MyService { return MyService("mine") } } ``` It is also possible to easily customize the `Environment`, as shown in the following example: Java ``` @Test void serviceNameCanBeConfigured() { this.contextRunner.withPropertyValues("user.name=test123").run((context) -> { assertThat(context).hasSingleBean(MyService.class); assertThat(context.getBean(MyService.class).getName()).isEqualTo("test123"); }); } ``` Kotlin ``` @Test fun serviceNameCanBeConfigured() { contextRunner.withPropertyValues("user.name=test123").run { context: AssertableApplicationContext -> assertThat(context).hasSingleBean(MyService::class.java) assertThat(context.getBean(MyService::class.java).name).isEqualTo("test123") } } ``` The runner can also be used to display the `ConditionEvaluationReport`. The report can be printed at `INFO` or `DEBUG` level. The following example shows how to use the `ConditionEvaluationReportLoggingListener` to print the report in auto-configuration tests. Java ``` import org.junit.jupiter.api.Test; import org.springframework.boot.autoconfigure.logging.ConditionEvaluationReportLoggingListener; import org.springframework.boot.logging.LogLevel; import org.springframework.boot.test.context.runner.ApplicationContextRunner; class MyConditionEvaluationReportingTests { @Test void autoConfigTest() { new ApplicationContextRunner() .withInitializer(new ConditionEvaluationReportLoggingListener(LogLevel.INFO)) .run((context) -> { // Test something... 
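// Once the context has been refreshed, the listener writes the condition evaluation
// report (which auto-configurations matched or did not match, and why) to the log at INFO level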
}); } } ``` Kotlin ``` import org.junit.jupiter.api.Test import org.springframework.boot.autoconfigure.logging.ConditionEvaluationReportLoggingListener import org.springframework.boot.logging.LogLevel import org.springframework.boot.test.context.assertj.AssertableApplicationContext import org.springframework.boot.test.context.runner.ApplicationContextRunner class MyConditionEvaluationReportingTests { @Test fun autoConfigTest() { ApplicationContextRunner() .withInitializer(ConditionEvaluationReportLoggingListener(LogLevel.INFO)) .run { context: AssertableApplicationContext? -> } } } ``` #### 9.4.1. Simulating a Web Context If you need to test an auto-configuration that only operates in a servlet or reactive web application context, use the `WebApplicationContextRunner` or `ReactiveWebApplicationContextRunner` respectively. #### 9.4.2. Overriding the Classpath It is also possible to test what happens when a particular class and/or package is not present at runtime. Spring Boot ships with a `FilteredClassLoader` that can easily be used by the runner. In the following example, we assert that if `MyService` is not present, the auto-configuration is properly disabled: Java ``` @Test void serviceIsIgnoredIfLibraryIsNotPresent() { this.contextRunner.withClassLoader(new FilteredClassLoader(MyService.class)) .run((context) -> assertThat(context).doesNotHaveBean("myService")); } ``` Kotlin ``` @Test fun serviceIsIgnoredIfLibraryIsNotPresent() { contextRunner.withClassLoader(FilteredClassLoader(MyService::class.java)) .run { context: AssertableApplicationContext? -> assertThat(context).doesNotHaveBean("myService") } } ``` ### 9.5. Creating Your Own Starter A typical Spring Boot starter contains code to auto-configure and customize the infrastructure of a given technology, let’s call that "acme". To make it easily extensible, a number of configuration keys in a dedicated namespace can be exposed to the environment. Finally, a single "starter" dependency is provided to help users get started as easily as possible. Concretely, a custom starter can contain the following: * The `autoconfigure` module that contains the auto-configuration code for "acme". * The `starter` module that provides a dependency to the `autoconfigure` module as well as "acme" and any additional dependencies that are typically useful. In a nutshell, adding the starter should provide everything needed to start using that library. This separation in two modules is in no way necessary. If "acme" has several flavors, options or optional features, then it is better to separate the auto-configuration as you can clearly express the fact some features are optional. Besides, you have the ability to craft a starter that provides an opinion about those optional dependencies. At the same time, others can rely only on the `autoconfigure` module and craft their own starter with different opinions. If the auto-configuration is relatively straightforward and does not have optional feature, merging the two modules in the starter is definitely an option. #### 9.5.1. Naming You should make sure to provide a proper namespace for your starter. Do not start your module names with `spring-boot`, even if you use a different Maven `groupId`. We may offer official support for the thing you auto-configure in the future. As a rule of thumb, you should name a combined module after the starter. For example, assume that you are creating a starter for "acme" and that you name the auto-configure module `acme-spring-boot` and the starter `acme-spring-boot-starter`. 
If you only have one module that combines the two, name it `acme-spring-boot-starter`. #### 9.5.2. Configuration keys If your starter provides configuration keys, use a unique namespace for them. In particular, do not include your keys in the namespaces that Spring Boot uses (such as `server`, `management`, `spring`, and so on). If you use the same namespace, we may modify these namespaces in the future in ways that break your modules. As a rule of thumb, prefix all your keys with a namespace that you own (for example `acme`). Make sure that configuration keys are documented by adding field javadoc for each property, as shown in the following example: Java ``` import java.time.Duration; import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties("acme") public class AcmeProperties { /\*\* \* Whether to check the location of acme resources. \*/ private boolean checkLocation = true; /\*\* \* Timeout for establishing a connection to the acme server. \*/ private Duration loginTimeout = Duration.ofSeconds(3); // getters/setters ... public boolean isCheckLocation() { return this.checkLocation; } public void setCheckLocation(boolean checkLocation) { this.checkLocation = checkLocation; } public Duration getLoginTimeout() { return this.loginTimeout; } public void setLoginTimeout(Duration loginTimeout) { this.loginTimeout = loginTimeout; } } ``` Kotlin ``` import org.springframework.boot.context.properties.ConfigurationProperties import java.time.Duration @ConfigurationProperties("acme") class AcmeProperties( /\*\* \* Whether to check the location of acme resources. \*/ var isCheckLocation: Boolean = true, /\*\* \* Timeout for establishing a connection to the acme server. \*/ var loginTimeout:Duration = Duration.ofSeconds(3)) ``` | | | | --- | --- | | | You should only use plain text with `@ConfigurationProperties` field Javadoc, since they are not processed before being added to the JSON. | Here are some rules we follow internally to make sure descriptions are consistent: * Do not start the description by "The" or "A". * For `boolean` types, start the description with "Whether" or "Enable". * For collection-based types, start the description with "Comma-separated list" * Use `java.time.Duration` rather than `long` and describe the default unit if it differs from milliseconds, such as "If a duration suffix is not specified, seconds will be used". * Do not provide the default value in the description unless it has to be determined at runtime. Make sure to [trigger meta-data generation](configuration-metadata#appendix.configuration-metadata.annotation-processor) so that IDE assistance is available for your keys as well. You may want to review the generated metadata (`META-INF/spring-configuration-metadata.json`) to make sure your keys are properly documented. Using your own starter in a compatible IDE is also a good idea to validate that quality of the metadata. #### 9.5.3. The “autoconfigure” Module The `autoconfigure` module contains everything that is necessary to get started with the library. It may also contain configuration key definitions (such as `@ConfigurationProperties`) and any callback interface that can be used to further customize how the components are initialized. | | | | --- | --- | | | You should mark the dependencies to the library as optional so that you can include the `autoconfigure` module in your projects more easily. If you do it that way, the library is not provided and, by default, Spring Boot backs off. 
| Spring Boot uses an annotation processor to collect the conditions on auto-configurations in a metadata file (`META-INF/spring-autoconfigure-metadata.properties`). If that file is present, it is used to eagerly filter auto-configurations that do not match, which will improve startup time. It is recommended to add the following dependency in a module that contains auto-configurations: ``` <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-autoconfigure-processor</artifactId> <optional>true</optional> </dependency> ``` If you have defined auto-configurations directly in your application, make sure to configure the `spring-boot-maven-plugin` to prevent the `repackage` goal from adding the dependency into the fat jar: ``` <project> <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <configuration> <excludes> <exclude> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-autoconfigure-processor</artifactId> </exclude> </excludes> </configuration> </plugin> </plugins> </build> </project> ``` With Gradle 4.5 and earlier, the dependency should be declared in the `compileOnly` configuration, as shown in the following example: ``` dependencies { compileOnly "org.springframework.boot:spring-boot-autoconfigure-processor" } ``` With Gradle 4.6 and later, the dependency should be declared in the `annotationProcessor` configuration, as shown in the following example: ``` dependencies { annotationProcessor "org.springframework.boot:spring-boot-autoconfigure-processor" } ``` #### 9.5.4. Starter Module The starter is really an empty jar. Its only purpose is to provide the necessary dependencies to work with the library. You can think of it as an opinionated view of what is required to get started. Do not make assumptions about the project in which your starter is added. If the library you are auto-configuring typically requires other starters, mention them as well. Providing a proper set of *default* dependencies may be hard if the number of optional dependencies is high, as you should avoid including dependencies that are unnecessary for a typical usage of the library. In other words, you should not include optional dependencies. | | | | --- | --- | | | Either way, your starter must reference the core Spring Boot starter (`spring-boot-starter`) directly or indirectly (there is no need to add it if your starter relies on another starter). If a project is created with only your custom starter, Spring Boot’s core features will be honoured by the presence of the core starter. | 10. Kotlin support ------------------- [Kotlin](https://kotlinlang.org) is a statically-typed language targeting the JVM (and other platforms) which allows writing concise and elegant code while providing [interoperability](https://kotlinlang.org/docs/reference/java-interop.html) with existing libraries written in Java. Spring Boot provides Kotlin support by leveraging the support in other Spring projects such as Spring Framework, Spring Data, and Reactor. See the [Spring Framework Kotlin support documentation](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/languages.html#kotlin) for more information. The easiest way to start with Spring Boot and Kotlin is to follow [this comprehensive tutorial](https://spring.io/guides/tutorials/spring-boot-kotlin/). You can create new Kotlin projects by using [start.spring.io](https://start.spring.io/#!language=kotlin). 
Feel free to join the #spring channel of [Kotlin Slack](https://slack.kotlinlang.org/) or ask a question with the `spring` and `kotlin` tags on [Stack Overflow](https://stackoverflow.com/questions/tagged/spring+kotlin) if you need support. ### 10.1. Requirements Spring Boot requires at least Kotlin 1.3.x and manages a suitable Kotlin version through dependency management. To use Kotlin, `org.jetbrains.kotlin:kotlin-stdlib` and `org.jetbrains.kotlin:kotlin-reflect` must be present on the classpath. The `kotlin-stdlib` variants `kotlin-stdlib-jdk7` and `kotlin-stdlib-jdk8` can also be used. Since [Kotlin classes are final by default](https://discuss.kotlinlang.org/t/classes-final-by-default/166), you are likely to want to configure the [kotlin-spring](https://kotlinlang.org/docs/reference/compiler-plugins.html#spring-support) plugin in order to automatically open Spring-annotated classes so that they can be proxied. [Jackson’s Kotlin module](https://github.com/FasterXML/jackson-module-kotlin) is required for serializing and deserializing JSON data in Kotlin. It is automatically registered when found on the classpath. A warning message is logged if Jackson and Kotlin are present but the Jackson Kotlin module is not. | | | | --- | --- | | | These dependencies and plugins are provided by default if one bootstraps a Kotlin project on [start.spring.io](https://start.spring.io/#!language=kotlin). | ### 10.2. Null-safety One of Kotlin’s key features is [null-safety](https://kotlinlang.org/docs/reference/null-safety.html). It deals with `null` values at compile time rather than deferring the problem to runtime and encountering a `NullPointerException`. This helps to eliminate a common source of bugs without paying the cost of wrappers like `Optional`. Kotlin also allows using functional constructs with nullable values, as described in this [comprehensive guide to null-safety in Kotlin](https://www.baeldung.com/kotlin-null-safety). Although Java does not allow one to express null-safety in its type system, Spring Framework, Spring Data, and Reactor now provide null-safety of their APIs through tooling-friendly annotations. By default, types from Java APIs used in Kotlin are recognized as [platform types](https://kotlinlang.org/docs/reference/java-interop.html#null-safety-and-platform-types) for which null-checks are relaxed. [Kotlin’s support for JSR 305 annotations](https://kotlinlang.org/docs/reference/java-interop.html#jsr-305-support) combined with nullability annotations provides null-safety for the related Spring API in Kotlin. The JSR 305 checks can be configured by adding the `-Xjsr305` compiler flag with the following options: `-Xjsr305={strict|warn|ignore}`. The default behavior is the same as `-Xjsr305=warn`. The `strict` value is required to have null-safety taken into account in Kotlin types inferred from the Spring API, but it should be used with the knowledge that Spring API nullability declarations could evolve even between minor releases and that more checks may be added in the future. | | | | --- | --- | | | Nullability of generic type arguments, varargs, and array elements is not yet supported. See [SPR-15942](https://jira.spring.io/browse/SPR-15942) for up-to-date information. Also be aware that Spring Boot’s own API is [not yet annotated](https://github.com/spring-projects/spring-boot/issues/10712). | ### 10.3. Kotlin API #### 10.3.1.
runApplication Spring Boot provides an idiomatic way to run an application with `runApplication<MyApplication>(*args)` as shown in the following example: ``` import org.springframework.boot.autoconfigure.SpringBootApplication import org.springframework.boot.runApplication @SpringBootApplication class MyApplication fun main(args: Array<String>) { runApplication<MyApplication>(\*args) } ``` This is a drop-in replacement for `SpringApplication.run(MyApplication::class.java, *args)`. It also allows customization of the application as shown in the following example: ``` runApplication<MyApplication>(*args) { setBannerMode(OFF) } ``` #### 10.3.2. Extensions Kotlin [extensions](https://kotlinlang.org/docs/reference/extensions.html) provide the ability to extend existing classes with additional functionality. The Spring Boot Kotlin API makes use of these extensions to add new Kotlin specific conveniences to existing APIs. `TestRestTemplate` extensions, similar to those provided by Spring Framework for `RestOperations` in Spring Framework, are provided. Among other things, the extensions make it possible to take advantage of Kotlin reified type parameters. ### 10.4. Dependency management In order to avoid mixing different versions of Kotlin dependencies on the classpath, Spring Boot imports the Kotlin BOM. With Maven, the Kotlin version can be customized by setting the `kotlin.version` property and plugin management is provided for `kotlin-maven-plugin`. With Gradle, the Spring Boot plugin automatically aligns the `kotlin.version` with the version of the Kotlin plugin. Spring Boot also manages the version of Coroutines dependencies by importing the Kotlin Coroutines BOM. The version can be customized by setting the `kotlin-coroutines.version` property. | | | | --- | --- | | | `org.jetbrains.kotlinx:kotlinx-coroutines-reactor` dependency is provided by default if one bootstraps a Kotlin project with at least one reactive dependency on [start.spring.io](https://start.spring.io/#!language=kotlin). | ### 10.5. @ConfigurationProperties `@ConfigurationProperties` when used in combination with [`@ConstructorBinding`](#features.external-config.typesafe-configuration-properties.constructor-binding) supports classes with immutable `val` properties as shown in the following example: ``` @ConstructorBinding @ConfigurationProperties("example.kotlin") data class KotlinExampleProperties( val name: String, val description: String, val myService: MyService) { data class MyService( val apiToken: String, val uri: URI ) } ``` | | | | --- | --- | | | To generate [your own metadata](configuration-metadata#appendix.configuration-metadata.annotation-processor) using the annotation processor, [`kapt` should be configured](https://kotlinlang.org/docs/reference/kapt.html) with the `spring-boot-configuration-processor` dependency. Note that some features (such as detecting the default value or deprecated items) are not working due to limitations in the model kapt provides. | ### 10.6. Testing While it is possible to use JUnit 4 to test Kotlin code, JUnit 5 is provided by default and is recommended. JUnit 5 enables a test class to be instantiated once and reused for all of the class’s tests. This makes it possible to use `@BeforeAll` and `@AfterAll` annotations on non-static methods, which is a good fit for Kotlin. To mock Kotlin classes, [MockK](https://mockk.io/) is recommended. 
If you need the `Mockk` equivalent of the Mockito specific [`@MockBean` and `@SpyBean` annotations](#features.testing.spring-boot-applications.mocking-beans), you can use [SpringMockK](https://github.com/Ninja-Squad/springmockk) which provides similar `@MockkBean` and `@SpykBean` annotations. ### 10.7. Resources #### 10.7.1. Further reading * [Kotlin language reference](https://kotlinlang.org/docs/reference/) * [Kotlin Slack](https://kotlinlang.slack.com/) (with a dedicated #spring channel) * [Stackoverflow with `spring` and `kotlin` tags](https://stackoverflow.com/questions/tagged/spring+kotlin) * [Try Kotlin in your browser](https://try.kotlinlang.org/) * [Kotlin blog](https://blog.jetbrains.com/kotlin/) * [Awesome Kotlin](https://kotlin.link/) * [Tutorial: building web applications with Spring Boot and Kotlin](https://spring.io/guides/tutorials/spring-boot-kotlin/) * [Developing Spring Boot applications with Kotlin](https://spring.io/blog/2016/02/15/developing-spring-boot-applications-with-kotlin) * [A Geospatial Messenger with Kotlin, Spring Boot and PostgreSQL](https://spring.io/blog/2016/03/20/a-geospatial-messenger-with-kotlin-spring-boot-and-postgresql) * [Introducing Kotlin support in Spring Framework 5.0](https://spring.io/blog/2017/01/04/introducing-kotlin-support-in-spring-framework-5-0) * [Spring Framework 5 Kotlin APIs, the functional way](https://spring.io/blog/2017/08/01/spring-framework-5-kotlin-apis-the-functional-way) #### 10.7.2. Examples * [spring-boot-kotlin-demo](https://github.com/sdeleuze/spring-boot-kotlin-demo): regular Spring Boot + Spring Data JPA project * [mixit](https://github.com/mixitconf/mixit): Spring Boot 2 + WebFlux + Reactive Spring Data MongoDB * [spring-kotlin-fullstack](https://github.com/sdeleuze/spring-kotlin-fullstack): WebFlux Kotlin fullstack example with Kotlin2js for frontend instead of JavaScript or TypeScript * [spring-petclinic-kotlin](https://github.com/spring-petclinic/spring-petclinic-kotlin): Kotlin version of the Spring PetClinic Sample Application * [spring-kotlin-deepdive](https://github.com/sdeleuze/spring-kotlin-deepdive): a step by step migration for Boot 1.0 + Java to Boot 2.0 + Kotlin * [spring-boot-coroutines-demo](https://github.com/sdeleuze/spring-boot-coroutines-demo): Coroutines sample project 11. What to Read Next ---------------------- If you want to learn more about any of the classes discussed in this section, see the [Spring Boot API documentation](https://docs.spring.io/spring-boot/docs/2.7.0/api/) or you can browse the [source code directly](https://github.com/spring-projects/spring-boot/tree/v2.7.0). If you have specific questions, see the [how-to](howto#howto) section. If you are comfortable with Spring Boot’s core features, you can continue on and read about [production-ready features](actuator#actuator).
spring_boot The Executable Jar Format The Executable Jar Format ========================= The `spring-boot-loader` modules lets Spring Boot support executable jar and war files. If you use the Maven plugin or the Gradle plugin, executable jars are automatically generated, and you generally do not need to know the details of how they work. If you need to create executable jars from a different build system or if you are just curious about the underlying technology, this appendix provides some background. 1. Nested JARs --------------- Java does not provide any standard way to load nested jar files (that is, jar files that are themselves contained within a jar). This can be problematic if you need to distribute a self-contained application that can be run from the command line without unpacking. To solve this problem, many developers use “shaded” jars. A shaded jar packages all classes, from all jars, into a single “uber jar”. The problem with shaded jars is that it becomes hard to see which libraries are actually in your application. It can also be problematic if the same filename is used (but with different content) in multiple jars. Spring Boot takes a different approach and lets you actually nest jars directly. ### 1.1. The Executable Jar File Structure Spring Boot Loader-compatible jar files should be structured in the following way: ``` example.jar | +-META-INF | +-MANIFEST.MF +-org | +-springframework | +-boot | +-loader | +-<spring boot loader classes> +-BOOT-INF +-classes | +-mycompany | +-project | +-YourClasses.class +-lib +-dependency1.jar +-dependency2.jar ``` Application classes should be placed in a nested `BOOT-INF/classes` directory. Dependencies should be placed in a nested `BOOT-INF/lib` directory. ### 1.2. The Executable War File Structure Spring Boot Loader-compatible war files should be structured in the following way: ``` example.war | +-META-INF | +-MANIFEST.MF +-org | +-springframework | +-boot | +-loader | +-<spring boot loader classes> +-WEB-INF +-classes | +-com | +-mycompany | +-project | +-YourClasses.class +-lib | +-dependency1.jar | +-dependency2.jar +-lib-provided +-servlet-api.jar +-dependency3.jar ``` Dependencies should be placed in a nested `WEB-INF/lib` directory. Any dependencies that are required when running embedded but are not required when deploying to a traditional web container should be placed in `WEB-INF/lib-provided`. ### 1.3. Index Files Spring Boot Loader-compatible jar and war archives can include additional index files under the `BOOT-INF/` directory. A `classpath.idx` file can be provided for both jars and wars, and it provides the ordering that jars should be added to the classpath. The `layers.idx` file can be used only for jars, and it allows a jar to be split into logical layers for Docker/OCI image creation. Index files follow a YAML compatible syntax so that they can be easily parsed by third-party tools. These files, however, are *not* parsed internally as YAML and they must be written in exactly the formats described below in order to be used. ### 1.4. Classpath Index The classpath index file can be provided in `BOOT-INF/classpath.idx`. It provides a list of jar names (including the directory) in the order that they should be added to the classpath. Each line must start with dash space (`"-·"`) and names must be in double quotes. For example, given the following jar: ``` example.jar | +-META-INF | +-... +-BOOT-INF +-classes | +... 
+-lib +-dependency1.jar +-dependency2.jar ``` The index file would look like this: ``` - "BOOT-INF/lib/dependency2.jar" - "BOOT-INF/lib/dependency1.jar" ``` ### 1.5. Layer Index The layers index file can be provided in `BOOT-INF/layers.idx`. It provides a list of layers and the parts of the jar that should be contained within them. Layers are written in the order that they should be added to the Docker/OCI image. Layers names are written as quoted strings prefixed with dash space (`"-·"`) and with a colon (`":"`) suffix. Layer content is either a file or directory name written as a quoted string prefixed by space space dash space (`"··-·"`). A directory name ends with `/`, a file name does not. When a directory name is used it means that all files inside that directory are in the same layer. A typical example of a layers index would be: ``` - "dependencies": - "BOOT-INF/lib/dependency1.jar" - "BOOT-INF/lib/dependency2.jar" - "application": - "BOOT-INF/classes/" - "META-INF/" ``` 2. Spring Boot’s “JarFile” Class --------------------------------- The core class used to support loading nested jars is `org.springframework.boot.loader.jar.JarFile`. It lets you load jar content from a standard jar file or from nested child jar data. When first loaded, the location of each `JarEntry` is mapped to a physical file offset of the outer jar, as shown in the following example: ``` myapp.jar +-------------------+-------------------------+ | /BOOT-INF/classes | /BOOT-INF/lib/mylib.jar | |+-----------------+||+-----------+----------+| || A.class ||| B.class | C.class || |+-----------------+||+-----------+----------+| +-------------------+-------------------------+ ^ ^ ^ 0063 3452 3980 ``` The preceding example shows how `A.class` can be found in `/BOOT-INF/classes` in `myapp.jar` at position `0063`. `B.class` from the nested jar can actually be found in `myapp.jar` at position `3452`, and `C.class` is at position `3980`. Armed with this information, we can load specific nested entries by seeking to the appropriate part of the outer jar. We do not need to unpack the archive, and we do not need to read all entry data into memory. ### 2.1. Compatibility with the Standard Java “JarFile” Spring Boot Loader strives to remain compatible with existing code and libraries. `org.springframework.boot.loader.jar.JarFile` extends from `java.util.jar.JarFile` and should work as a drop-in replacement. The `getURL()` method returns a `URL` that opens a connection compatible with `java.net.JarURLConnection` and can be used with Java’s `URLClassLoader`. 3. Launching Executable Jars ----------------------------- The `org.springframework.boot.loader.Launcher` class is a special bootstrap class that is used as an executable jar’s main entry point. It is the actual `Main-Class` in your jar file, and it is used to setup an appropriate `URLClassLoader` and ultimately call your `main()` method. There are three launcher subclasses (`JarLauncher`, `WarLauncher`, and `PropertiesLauncher`). Their purpose is to load resources (`.class` files and so on) from nested jar files or war files in directories (as opposed to those explicitly on the classpath). In the case of `JarLauncher` and `WarLauncher`, the nested paths are fixed. `JarLauncher` looks in `BOOT-INF/lib/`, and `WarLauncher` looks in `WEB-INF/lib/` and `WEB-INF/lib-provided/`. You can add extra jars in those locations if you want more. The `PropertiesLauncher` looks in `BOOT-INF/lib/` in your application archive by default. 
You can add additional locations by setting an environment variable called `LOADER_PATH` or `loader.path` in `loader.properties` (which is a comma-separated list of directories, archives, or directories within archives). ### 3.1. Launcher Manifest You need to specify an appropriate `Launcher` as the `Main-Class` attribute of `META-INF/MANIFEST.MF`. The actual class that you want to launch (that is, the class that contains a `main` method) should be specified in the `Start-Class` attribute. The following example shows a typical `MANIFEST.MF` for an executable jar file: ``` Main-Class: org.springframework.boot.loader.JarLauncher Start-Class: com.mycompany.project.MyApplication ``` For a war file, it would be as follows: ``` Main-Class: org.springframework.boot.loader.WarLauncher Start-Class: com.mycompany.project.MyApplication ``` | | | | --- | --- | | | You need not specify `Class-Path` entries in your manifest file. The classpath is deduced from the nested jars. | 4. PropertiesLauncher Features ------------------------------- `PropertiesLauncher` has a few special features that can be enabled with external properties (System properties, environment variables, manifest entries, or `loader.properties`). The following table describes these properties: | Key | Purpose | | --- | --- | | `loader.path` | Comma-separated Classpath, such as `lib,${HOME}/app/lib`. Earlier entries take precedence, like a regular `-classpath` on the `javac` command line. | | `loader.home` | Used to resolve relative paths in `loader.path`. For example, given `loader.path=lib`, then `${loader.home}/lib` is a classpath location (along with all jar files in that directory). This property is also used to locate a `loader.properties` file, as in the following example `[/opt/app](file:///opt/app)` It defaults to `${user.dir}`. | | `loader.args` | Default arguments for the main method (space separated). | | `loader.main` | Name of main class to launch (for example, `com.app.Application`). | | `loader.config.name` | Name of properties file (for example, `launcher`). It defaults to `loader`. | | `loader.config.location` | Path to properties file (for example, `classpath:loader.properties`). It defaults to `loader.properties`. | | `loader.system` | Boolean flag to indicate that all properties should be added to System properties. It defaults to `false`. | When specified as environment variables or manifest entries, the following names should be used: | Key | Manifest entry | Environment variable | | --- | --- | --- | | `loader.path` | `Loader-Path` | `LOADER_PATH` | | `loader.home` | `Loader-Home` | `LOADER_HOME` | | `loader.args` | `Loader-Args` | `LOADER_ARGS` | | `loader.main` | `Start-Class` | `LOADER_MAIN` | | `loader.config.location` | `Loader-Config-Location` | `LOADER_CONFIG_LOCATION` | | `loader.system` | `Loader-System` | `LOADER_SYSTEM` | | | | | --- | --- | | | Build plugins automatically move the `Main-Class` attribute to `Start-Class` when the fat jar is built. If you use that, specify the name of the class to launch by using the `Main-Class` attribute and leaving out `Start-Class`. | The following rules apply to working with `PropertiesLauncher`: * `loader.properties` is searched for in `loader.home`, then in the root of the classpath, and then in `classpath:/BOOT-INF/classes`. The first location where a file with that name exists is used. * `loader.home` is the directory location of an additional properties file (overriding the default) only when `loader.config.location` is not specified. 
* `loader.path` can contain directories (which are scanned recursively for jar and zip files), archive paths, a directory within an archive that is scanned for jar files (for example, `dependencies.jar!/lib`), or wildcard patterns (for the default JVM behavior). Archive paths can be relative to `loader.home` or anywhere in the file system with a `jar:file:` prefix. * `loader.path` (if empty) defaults to `BOOT-INF/lib` (meaning a local directory or a nested one if running from an archive). Because of this, `PropertiesLauncher` behaves the same as `JarLauncher` when no additional configuration is provided. * `loader.path` can not be used to configure the location of `loader.properties` (the classpath used to search for the latter is the JVM classpath when `PropertiesLauncher` is launched). * Placeholder replacement is done from System and environment variables plus the properties file itself on all values before use. * The search order for properties (where it makes sense to look in more than one place) is environment variables, system properties, `loader.properties`, the exploded archive manifest, and the archive manifest. 5. Executable Jar Restrictions ------------------------------- You need to consider the following restrictions when working with a Spring Boot Loader packaged application: * Zip entry compression: The `ZipEntry` for a nested jar must be saved by using the `ZipEntry.STORED` method. This is required so that we can seek directly to individual content within the nested jar. The content of the nested jar file itself can still be compressed, as can any other entries in the outer jar. * System classLoader: Launched applications should use `Thread.getContextClassLoader()` when loading classes (most libraries and frameworks do so by default). Trying to load nested jar classes with `ClassLoader.getSystemClassLoader()` fails. `java.util.Logging` always uses the system classloader. For this reason, you should consider a different logging implementation. 6. Alternative Single Jar Solutions ------------------------------------ If the preceding restrictions mean that you cannot use Spring Boot Loader, consider the following alternatives: * [Maven Shade Plugin](https://maven.apache.org/plugins/maven-shade-plugin/) * [JarClassLoader](http://www.jdotsoft.com/JarClassLoader.php) * [OneJar](https://sourceforge.net/projects/one-jar/) * [Gradle Shadow Plugin](https://imperceptiblethoughts.com/shadow/) spring_boot Build Tool Plugins Build Tool Plugins ================== Spring Boot provides build tool plugins for Maven and Gradle. The plugins offer a variety of features, including the packaging of executable jars. This section provides more details on both plugins as well as some help should you need to extend an unsupported build system. If you are just getting started, you might want to read “[using.html](using#using.build-systems)” from the “[using.html](using#using)” section first. 1. Spring Boot Maven Plugin ---------------------------- The Spring Boot Maven Plugin provides Spring Boot support in Maven, letting you package executable jar or war archives and run an application “in-place”. To use it, you must use Maven 3.2 (or later). See the plugin’s documentation to learn more: * Reference ([HTML](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/htmlsingle/) and [PDF](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/reference/pdf/spring-boot-maven-plugin-reference.pdf)) * [API](https://docs.spring.io/spring-boot/docs/2.7.0/maven-plugin/api/) 2. 
Spring Boot Gradle Plugin ----------------------------- The Spring Boot Gradle Plugin provides Spring Boot support in Gradle, letting you package executable jar or war archives, run Spring Boot applications, and use the dependency management provided by `spring-boot-dependencies`. It requires Gradle 6.8, 6.9, or 7.x. See the plugin’s documentation to learn more: * Reference ([HTML](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/htmlsingle/) and [PDF](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/reference/pdf/spring-boot-gradle-plugin-reference.pdf)) * [API](https://docs.spring.io/spring-boot/docs/2.7.0/gradle-plugin/api/) 3. Spring Boot AntLib Module ----------------------------- The Spring Boot AntLib module provides basic Spring Boot support for Apache Ant. You can use the module to create executable jars. To use the module, you need to declare an additional `spring-boot` namespace in your `build.xml`, as shown in the following example: ``` <project xmlns:ivy="antlib:org.apache.ivy.ant" xmlns:spring-boot="antlib:org.springframework.boot.ant" name="myapp" default="build"> ... </project> ``` You need to remember to start Ant using the `-lib` option, as shown in the following example: ``` $ ant -lib <directory containing spring-boot-antlib-2.7.0.jar> ``` | | | | --- | --- | | | The “Using Spring Boot” section includes a more complete example of [using Apache Ant with `spring-boot-antlib`](using#using.build-systems.ant). | ### 3.1. Spring Boot Ant Tasks Once the `spring-boot-antlib` namespace has been declared, the following additional tasks are available: * [Using the “exejar” Task](#build-tool-plugins.antlib.tasks.exejar) * [Using the “findmainclass” Task](#build-tool-plugins.antlib.findmainclass) #### 3.1.1. Using the “exejar” Task You can use the `exejar` task to create a Spring Boot executable jar. The following attributes are supported by the task: | Attribute | Description | Required | | --- | --- | --- | | `destfile` | The destination jar file to create | Yes | | `classes` | The root directory of Java class files | Yes | | `start-class` | The main application class to run | No *(the default is the first class found that declares a `main` method)* | The following nested elements can be used with the task: | Element | Description | | --- | --- | | `resources` | One or more [Resource Collections](https://ant.apache.org/manual/Types/resources.html#collection) describing a set of [Resources](https://ant.apache.org/manual/Types/resources.html) that should be added to the content of the created jar file. | | `lib` | One or more [Resource Collections](https://ant.apache.org/manual/Types/resources.html#collection) that should be added to the set of jar libraries that make up the runtime dependency classpath of the application. | #### 3.1.2. Examples This section shows two examples of Ant tasks. Specify start-class ``` <spring-boot:exejar destfile="target/my-application.jar" classes="target/classes" start-class="com.example.MyApplication"> <resources> <fileset dir="src/main/resources" /> </resources> <lib> <fileset dir="lib" /> </lib> </spring-boot:exejar> ``` Detect start-class ``` <exejar destfile="target/my-application.jar" classes="target/classes"> <lib> <fileset dir="lib" /> </lib> </exejar> ``` ### 3.2. Using the “findmainclass” Task The `findmainclass` task is used internally by `exejar` to locate a class declaring a `main`. If necessary, you can also use this task directly in your build. 
The following attributes are supported: | Attribute | Description | Required | | --- | --- | --- | | `classesroot` | The root directory of Java class files | Yes *(unless `mainclass` is specified)* | | `mainclass` | Can be used to short-circuit the `main` class search | No | | `property` | The Ant property that should be set with the result | No *(result will be logged if unspecified)* | #### 3.2.1. Examples This section contains three examples of using `findmainclass`. Find and log ``` <findmainclass classesroot="target/classes" /> ``` Find and set ``` <findmainclass classesroot="target/classes" property="main-class" /> ``` Override and set ``` <findmainclass mainclass="com.example.MainClass" property="main-class" /> ``` 4. Supporting Other Build Systems ---------------------------------- If you want to use a build tool other than Maven, Gradle, or Ant, you likely need to develop your own plugin. Executable jars need to follow a specific format and certain entries need to be written in an uncompressed form (see the “[executable jar format](executable-jar#appendix.executable-jar)” section in the appendix for details). The Spring Boot Maven and Gradle plugins both make use of `spring-boot-loader-tools` to actually generate jars. If you need to, you may use this library directly. ### 4.1. Repackaging Archives To repackage an existing archive so that it becomes a self-contained executable archive, use `org.springframework.boot.loader.tools.Repackager`. The `Repackager` class takes a single constructor argument that refers to an existing jar or war archive. Use one of the two available `repackage()` methods to either replace the original file or write to a new destination. Various settings can also be configured on the repackager before it is run. ### 4.2. Nested Libraries When repackaging an archive, you can include references to dependency files by using the `org.springframework.boot.loader.tools.Libraries` interface. We do not provide any concrete implementations of `Libraries` here as they are usually build-system-specific. If your archive already includes libraries, you can use `Libraries.NONE`. ### 4.3. Finding a Main Class If you do not use `Repackager.setMainClass()` to specify a main class, the repackager uses [ASM](https://asm.ow2.io/) to read class files and tries to find a suitable class with a `public static void main(String[] args)` method. An exception is thrown if more than one candidate is found. ### 4.4. Example Repackage Implementation The following example shows a typical repackage implementation: Java ``` import java.io.File; import java.io.IOException; import java.util.List; import org.springframework.boot.loader.tools.Library; import org.springframework.boot.loader.tools.LibraryCallback; import org.springframework.boot.loader.tools.LibraryScope; import org.springframework.boot.loader.tools.Repackager; public class MyBuildTool { public void build() throws IOException { File sourceJarFile = ... Repackager repackager = new Repackager(sourceJarFile); repackager.setBackupSource(false); repackager.repackage(this::getLibraries); } private void getLibraries(LibraryCallback callback) throws IOException { // Build system specific implementation, callback for each dependency for (File nestedJar : getCompileScopeJars()) { callback.library(new Library(nestedJar, LibraryScope.COMPILE)); } // ... } private List<File> getCompileScopeJars() { return ... 
} } ``` Kotlin ``` import org.springframework.boot.loader.tools.Library import org.springframework.boot.loader.tools.LibraryCallback import org.springframework.boot.loader.tools.LibraryScope import org.springframework.boot.loader.tools.Repackager import java.io.File import java.io.IOException import kotlin.jvm.Throws class MyBuildTool { @Throws(IOException::class) fun build() { val sourceJarFile: File? = ... val repackager = Repackager(sourceJarFile) repackager.setBackupSource(false) repackager.repackage { callback: LibraryCallback -> getLibraries(callback) } } @Throws(IOException::class) private fun getLibraries(callback: LibraryCallback) { // Build system specific implementation, callback for each dependency for (nestedJar in getCompileScopeJars()!!) { callback.library(Library(nestedJar, LibraryScope.COMPILE)) } // ... } private fun getCompileScopeJars(): List<File?>? { return ... } } ``` 5. What to Read Next --------------------- If you are interested in how the build tool plugins work, you can look at the [`spring-boot-tools`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-tools) module on GitHub. More technical details of the executable jar format are covered in [the appendix](executable-jar#appendix.executable-jar). If you have specific build-related questions, see the “[how-to](howto#howto)” guides.
spring_boot Test Auto-configuration Annotations Test Auto-configuration Annotations =================================== This appendix describes the `@…​Test` auto-configuration annotations that Spring Boot provides to test slices of your application. 1. Test Slices --------------- The following table lists the various `@…​Test` annotations that can be used to test slices of your application and the auto-configuration that they import by default: | Test slice | Imported auto-configuration | | --- | --- | | `@DataCassandraTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.cassandra.CassandraAutoConfiguration` `org.springframework.boot.autoconfigure.data.cassandra.CassandraDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.cassandra.CassandraReactiveDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.cassandra.CassandraReactiveRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.data.cassandra.CassandraRepositoriesAutoConfiguration` | | `@DataCouchbaseTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.couchbase.CouchbaseAutoConfiguration` `org.springframework.boot.autoconfigure.data.couchbase.CouchbaseDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.couchbase.CouchbaseReactiveDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.couchbase.CouchbaseReactiveRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.data.couchbase.CouchbaseRepositoriesAutoConfiguration` | | `@DataElasticsearchTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.data.elasticsearch.ElasticsearchDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.elasticsearch.ElasticsearchRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.data.elasticsearch.ReactiveElasticsearchRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.data.elasticsearch.ReactiveElasticsearchRestClientAutoConfiguration` `org.springframework.boot.autoconfigure.elasticsearch.ElasticsearchRestClientAutoConfiguration` | | `@DataJdbcTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.data.jdbc.JdbcRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.flyway.FlywayAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration` `org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration` `org.springframework.boot.autoconfigure.sql.init.SqlInitializationAutoConfiguration` `org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration` `org.springframework.boot.test.autoconfigure.jdbc.TestDatabaseAutoConfiguration` | | `@DataJpaTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.data.jpa.JpaRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.flyway.FlywayAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration` 
`org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration` `org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration` `org.springframework.boot.autoconfigure.sql.init.SqlInitializationAutoConfiguration` `org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration` `org.springframework.boot.test.autoconfigure.jdbc.TestDatabaseAutoConfiguration` `org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManagerAutoConfiguration` | | `@DataLdapTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.data.ldap.LdapRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.ldap.LdapAutoConfiguration` `org.springframework.boot.autoconfigure.ldap.embedded.EmbeddedLdapAutoConfiguration` | | `@DataMongoTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.data.mongo.MongoDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.mongo.MongoReactiveDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.mongo.MongoReactiveRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.data.mongo.MongoRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration` `org.springframework.boot.autoconfigure.mongo.MongoReactiveAutoConfiguration` `org.springframework.boot.autoconfigure.mongo.embedded.EmbeddedMongoAutoConfiguration` `org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration` | | `@DataNeo4jTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.data.neo4j.Neo4jDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.neo4j.Neo4jReactiveDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.neo4j.Neo4jReactiveRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.data.neo4j.Neo4jRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.neo4j.Neo4jAutoConfiguration` `org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration` | | `@DataR2dbcTest` | `org.springframework.boot.autoconfigure.data.r2dbc.R2dbcDataAutoConfiguration` `org.springframework.boot.autoconfigure.data.r2dbc.R2dbcRepositoriesAutoConfiguration` `org.springframework.boot.autoconfigure.flyway.FlywayAutoConfiguration` `org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration` `org.springframework.boot.autoconfigure.r2dbc.R2dbcAutoConfiguration` `org.springframework.boot.autoconfigure.r2dbc.R2dbcTransactionManagerAutoConfiguration` `org.springframework.boot.autoconfigure.sql.init.SqlInitializationAutoConfiguration` `org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration` | | `@DataRedisTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.data.redis.RedisAutoConfiguration` `org.springframework.boot.autoconfigure.data.redis.RedisReactiveAutoConfiguration` `org.springframework.boot.autoconfigure.data.redis.RedisRepositoriesAutoConfiguration` | | `@GraphQlTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.graphql.GraphQlAutoConfiguration` `org.springframework.boot.autoconfigure.gson.GsonAutoConfiguration` `org.springframework.boot.autoconfigure.http.codec.CodecsAutoConfiguration` 
`org.springframework.boot.autoconfigure.jackson.JacksonAutoConfiguration` `org.springframework.boot.autoconfigure.jsonb.JsonbAutoConfiguration` `org.springframework.boot.autoconfigure.validation.ValidationAutoConfiguration` `org.springframework.boot.test.autoconfigure.graphql.tester.GraphQlTesterAutoConfiguration` | | `@JdbcTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.flyway.FlywayAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration` `org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration` `org.springframework.boot.autoconfigure.sql.init.SqlInitializationAutoConfiguration` `org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration` `org.springframework.boot.test.autoconfigure.jdbc.TestDatabaseAutoConfiguration` | | `@JooqTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.flyway.FlywayAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration` `org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration` `org.springframework.boot.autoconfigure.jooq.JooqAutoConfiguration` `org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration` `org.springframework.boot.autoconfigure.sql.init.SqlInitializationAutoConfiguration` `org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration` | | `@JsonTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.gson.GsonAutoConfiguration` `org.springframework.boot.autoconfigure.jackson.JacksonAutoConfiguration` `org.springframework.boot.autoconfigure.jsonb.JsonbAutoConfiguration` `org.springframework.boot.test.autoconfigure.json.JsonTestersAutoConfiguration` | | `@RestClientTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.gson.GsonAutoConfiguration` `org.springframework.boot.autoconfigure.http.HttpMessageConvertersAutoConfiguration` `org.springframework.boot.autoconfigure.http.codec.CodecsAutoConfiguration` `org.springframework.boot.autoconfigure.jackson.JacksonAutoConfiguration` `org.springframework.boot.autoconfigure.jsonb.JsonbAutoConfiguration` `org.springframework.boot.autoconfigure.web.client.RestTemplateAutoConfiguration` `org.springframework.boot.autoconfigure.web.reactive.function.client.WebClientAutoConfiguration` `org.springframework.boot.test.autoconfigure.web.client.MockRestServiceServerAutoConfiguration` `org.springframework.boot.test.autoconfigure.web.client.WebClientRestTemplateAutoConfiguration` | | `@WebFluxTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.context.MessageSourceAutoConfiguration` `org.springframework.boot.autoconfigure.freemarker.FreeMarkerAutoConfiguration` `org.springframework.boot.autoconfigure.gson.GsonAutoConfiguration` `org.springframework.boot.autoconfigure.http.codec.CodecsAutoConfiguration` `org.springframework.boot.autoconfigure.jackson.JacksonAutoConfiguration` `org.springframework.boot.autoconfigure.jsonb.JsonbAutoConfiguration` `org.springframework.boot.autoconfigure.mustache.MustacheAutoConfiguration` 
`org.springframework.boot.autoconfigure.security.oauth2.client.reactive.ReactiveOAuth2ClientAutoConfiguration` `org.springframework.boot.autoconfigure.security.oauth2.resource.reactive.ReactiveOAuth2ResourceServerAutoConfiguration` `org.springframework.boot.autoconfigure.security.reactive.ReactiveSecurityAutoConfiguration` `org.springframework.boot.autoconfigure.security.reactive.ReactiveUserDetailsServiceAutoConfiguration` `org.springframework.boot.autoconfigure.thymeleaf.ThymeleafAutoConfiguration` `org.springframework.boot.autoconfigure.validation.ValidationAutoConfiguration` `org.springframework.boot.autoconfigure.web.reactive.WebFluxAutoConfiguration` `org.springframework.boot.autoconfigure.web.reactive.error.ErrorWebFluxAutoConfiguration` `org.springframework.boot.test.autoconfigure.web.reactive.WebTestClientAutoConfiguration` | | `@WebMvcTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.context.MessageSourceAutoConfiguration` `org.springframework.boot.autoconfigure.data.web.SpringDataWebAutoConfiguration` `org.springframework.boot.autoconfigure.freemarker.FreeMarkerAutoConfiguration` `org.springframework.boot.autoconfigure.groovy.template.GroovyTemplateAutoConfiguration` `org.springframework.boot.autoconfigure.gson.GsonAutoConfiguration` `org.springframework.boot.autoconfigure.hateoas.HypermediaAutoConfiguration` `org.springframework.boot.autoconfigure.http.HttpMessageConvertersAutoConfiguration` `org.springframework.boot.autoconfigure.jackson.JacksonAutoConfiguration` `org.springframework.boot.autoconfigure.jsonb.JsonbAutoConfiguration` `org.springframework.boot.autoconfigure.mustache.MustacheAutoConfiguration` `org.springframework.boot.autoconfigure.security.oauth2.client.servlet.OAuth2ClientAutoConfiguration` `org.springframework.boot.autoconfigure.security.oauth2.resource.servlet.OAuth2ResourceServerAutoConfiguration` `org.springframework.boot.autoconfigure.security.servlet.SecurityAutoConfiguration` `org.springframework.boot.autoconfigure.security.servlet.SecurityFilterAutoConfiguration` `org.springframework.boot.autoconfigure.security.servlet.UserDetailsServiceAutoConfiguration` `org.springframework.boot.autoconfigure.task.TaskExecutionAutoConfiguration` `org.springframework.boot.autoconfigure.thymeleaf.ThymeleafAutoConfiguration` `org.springframework.boot.autoconfigure.validation.ValidationAutoConfiguration` `org.springframework.boot.autoconfigure.web.servlet.HttpEncodingAutoConfiguration` `org.springframework.boot.autoconfigure.web.servlet.WebMvcAutoConfiguration` `org.springframework.boot.autoconfigure.web.servlet.error.ErrorMvcAutoConfiguration` `org.springframework.boot.test.autoconfigure.web.reactive.WebTestClientAutoConfiguration` `org.springframework.boot.test.autoconfigure.web.servlet.MockMvcAutoConfiguration` `org.springframework.boot.test.autoconfigure.web.servlet.MockMvcSecurityConfiguration` `org.springframework.boot.test.autoconfigure.web.servlet.MockMvcWebClientAutoConfiguration` `org.springframework.boot.test.autoconfigure.web.servlet.MockMvcWebDriverAutoConfiguration` | | `@WebServiceClientTest` | `org.springframework.boot.autoconfigure.cache.CacheAutoConfiguration` `org.springframework.boot.autoconfigure.webservices.client.WebServiceTemplateAutoConfiguration` `org.springframework.boot.test.autoconfigure.webservices.client.MockWebServiceServerAutoConfiguration` `org.springframework.boot.test.autoconfigure.webservices.client.WebServiceClientTemplateAutoConfiguration` | | `@WebServiceServerTest` 
| `org.springframework.boot.autoconfigure.webservices.WebServicesAutoConfiguration` `org.springframework.boot.test.autoconfigure.webservices.server.MockWebServiceClientAutoConfiguration` | spring_boot Configuration Metadata Configuration Metadata ====================== Spring Boot jars include metadata files that provide details of all supported configuration properties. The files are designed to let IDE developers offer contextual help and “code completion” as users are working with `application.properties` or `application.yml` files. The majority of the metadata file is generated automatically at compile time by processing all items annotated with `@ConfigurationProperties`. However, it is possible to [write part of the metadata manually](#appendix.configuration-metadata.annotation-processor.adding-additional-metadata) for corner cases or more advanced use cases. 1. Metadata Format ------------------- Configuration metadata files are located inside jars under `META-INF/spring-configuration-metadata.json`. They use a JSON format with items categorized under either “groups” or “properties” and additional values hints categorized under "hints", as shown in the following example: ``` {"groups": [ { "name": "server", "type": "org.springframework.boot.autoconfigure.web.ServerProperties", "sourceType": "org.springframework.boot.autoconfigure.web.ServerProperties" }, { "name": "spring.jpa.hibernate", "type": "org.springframework.boot.autoconfigure.orm.jpa.JpaProperties$Hibernate", "sourceType": "org.springframework.boot.autoconfigure.orm.jpa.JpaProperties", "sourceMethod": "getHibernate()" } ... ],"properties": [ { "name": "server.port", "type": "java.lang.Integer", "sourceType": "org.springframework.boot.autoconfigure.web.ServerProperties" }, { "name": "server.address", "type": "java.net.InetAddress", "sourceType": "org.springframework.boot.autoconfigure.web.ServerProperties" }, { "name": "spring.jpa.hibernate.ddl-auto", "type": "java.lang.String", "description": "DDL mode. This is actually a shortcut for the \"hibernate.hbm2ddl.auto\" property.", "sourceType": "org.springframework.boot.autoconfigure.orm.jpa.JpaProperties$Hibernate" } ... ],"hints": [ { "name": "spring.jpa.hibernate.ddl-auto", "values": [ { "value": "none", "description": "Disable DDL handling." }, { "value": "validate", "description": "Validate the schema, make no changes to the database." }, { "value": "update", "description": "Update the schema if necessary." }, { "value": "create", "description": "Create the schema and destroy previous data." }, { "value": "create-drop", "description": "Create and then destroy the schema at the end of the session." } ] } ]} ``` Each “property” is a configuration item that the user specifies with a given value. For example, `server.port` and `server.address` might be specified in your `application.properties`/`application.yaml`, as follows: Properties ``` server.port=9090 server.address=127.0.0.1 ``` Yaml ``` server: port: 9090 address: 127.0.0.1 ``` The “groups” are higher level items that do not themselves specify a value but instead provide a contextual grouping for properties. For example, the `server.port` and `server.address` properties are part of the `server` group. | | | | --- | --- | | | It is not required that every “property” has a “group”. Some properties might exist in their own right. | Finally, “hints” are additional information used to assist the user in configuring a given property. 
For example, when a developer is configuring the `spring.jpa.hibernate.ddl-auto` property, a tool can use the hints to offer some auto-completion help for the `none`, `validate`, `update`, `create`, and `create-drop` values. ### 1.1. Group Attributes The JSON object contained in the `groups` array can contain the attributes shown in the following table: | Name | Type | Purpose | | --- | --- | --- | | `name` | String | The full name of the group. This attribute is mandatory. | | `type` | String | The class name of the data type of the group. For example, if the group were based on a class annotated with `@ConfigurationProperties`, the attribute would contain the fully qualified name of that class. If it were based on a `@Bean` method, it would be the return type of that method. If the type is not known, the attribute may be omitted. | | `description` | String | A short description of the group that can be displayed to users. If no description is available, it may be omitted. It is recommended that descriptions be short paragraphs, with the first line providing a concise summary. The last line in the description should end with a period (`.`). | | `sourceType` | String | The class name of the source that contributed this group. For example, if the group were based on a `@Bean` method annotated with `@ConfigurationProperties`, this attribute would contain the fully qualified name of the `@Configuration` class that contains the method. If the source type is not known, the attribute may be omitted. | | `sourceMethod` | String | The full name of the method (include parenthesis and argument types) that contributed this group (for example, the name of a `@ConfigurationProperties` annotated `@Bean` method). If the source method is not known, it may be omitted. | ### 1.2. Property Attributes The JSON object contained in the `properties` array can contain the attributes described in the following table: | Name | Type | Purpose | | --- | --- | --- | | `name` | String | The full name of the property. Names are in lower-case period-separated form (for example, `server.address`). This attribute is mandatory. | | `type` | String | The full signature of the data type of the property (for example, `java.lang.String`) but also a full generic type (such as `java.util.Map<java.lang.String,com.example.MyEnum>`). You can use this attribute to guide the user as to the types of values that they can enter. For consistency, the type of a primitive is specified by using its wrapper counterpart (for example, `boolean` becomes `java.lang.Boolean`). Note that this class may be a complex type that gets converted from a `String` as values are bound. If the type is not known, it may be omitted. | | `description` | String | A short description of the property that can be displayed to users. If no description is available, it may be omitted. It is recommended that descriptions be short paragraphs, with the first line providing a concise summary. The last line in the description should end with a period (`.`). | | `sourceType` | String | The class name of the source that contributed this property. For example, if the property were from a class annotated with `@ConfigurationProperties`, this attribute would contain the fully qualified name of that class. If the source type is unknown, it may be omitted. | | `defaultValue` | Object | The default value, which is used if the property is not specified. If the type of the property is an array, it can be an array of value(s). If the default value is unknown, it may be omitted. 
| | `deprecation` | Deprecation | Specify whether the property is deprecated. If the field is not deprecated or if that information is not known, it may be omitted. The next table offers more detail about the `deprecation` attribute. | The JSON object contained in the `deprecation` attribute of each `properties` element can contain the following attributes: | Name | Type | Purpose | | --- | --- | --- | | `level` | String | The level of deprecation, which can be either `warning` (the default) or `error`. When a property has a `warning` deprecation level, it should still be bound in the environment. However, when it has an `error` deprecation level, the property is no longer managed and is not bound. | | `reason` | String | A short description of the reason why the property was deprecated. If no reason is available, it may be omitted. It is recommended that descriptions be short paragraphs, with the first line providing a concise summary. The last line in the description should end with a period (`.`). | | `replacement` | String | The full name of the property that *replaces* this deprecated property. If there is no replacement for this property, it may be omitted. | | | | | --- | --- | | | Prior to Spring Boot 1.3, a single `deprecated` boolean attribute can be used instead of the `deprecation` element. This is still supported in a deprecated fashion and should no longer be used. If no reason and replacement are available, an empty `deprecation` object should be set. | Deprecation can also be specified declaratively in code by adding the `@DeprecatedConfigurationProperty` annotation to the getter exposing the deprecated property. For instance, assume that the `my.app.target` property was confusing and was renamed to `my.app.name`. The following example shows how to handle that situation: ``` import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.boot.context.properties.DeprecatedConfigurationProperty; @ConfigurationProperties("my.app") public class MyProperties { private String name; public String getName() { return this.name; } public void setName(String name) { this.name = name; } @Deprecated @DeprecatedConfigurationProperty(replacement = "my.app.name") public String getTarget() { return this.name; } @Deprecated public void setTarget(String target) { this.name = target; } } ``` | | | | --- | --- | | | There is no way to set a `level`. `warning` is always assumed, since code is still handling the property. | The preceding code makes sure that the deprecated property still works (delegating to the `name` property behind the scenes). Once the `getTarget` and `setTarget` methods can be removed from your public API, the automatic deprecation hint in the metadata goes away as well. If you want to keep a hint, adding manual metadata with an `error` deprecation level ensures that users are still informed about that property. Doing so is particularly useful when a `replacement` is provided. ### 1.3. Hint Attributes The JSON object contained in the `hints` array can contain the attributes shown in the following table: | Name | Type | Purpose | | --- | --- | --- | | `name` | String | The full name of the property to which this hint refers. Names are in lower-case period-separated form (such as `spring.mvc.servlet.path`). If the property refers to a map (such as `system.contexts`), the hint either applies to the *keys* of the map (`system.contexts.keys`) or the *values* (`system.contexts.values`) of the map. This attribute is mandatory. 
| | `values` | ValueHint[] | A list of valid values as defined by the `ValueHint` object (described in the next table). Each entry defines the value and may have a description. | | `providers` | ValueProvider[] | A list of providers as defined by the `ValueProvider` object (described later in this document). Each entry defines the name of the provider and its parameters, if any. | The JSON object contained in the `values` attribute of each `hint` element can contain the attributes described in the following table: | Name | Type | Purpose | | --- | --- | --- | | `value` | Object | A valid value for the element to which the hint refers. If the type of the property is an array, it can also be an array of value(s). This attribute is mandatory. | | `description` | String | A short description of the value that can be displayed to users. If no description is available, it may be omitted. It is recommended that descriptions be short paragraphs, with the first line providing a concise summary. The last line in the description should end with a period (`.`). | The JSON object contained in the `providers` attribute of each `hint` element can contain the attributes described in the following table: | Name | Type | Purpose | | --- | --- | --- | | `name` | String | The name of the provider to use to offer additional content assistance for the element to which the hint refers. | | `parameters` | JSON object | Any additional parameter that the provider supports (check the documentation of the provider for more details). | ### 1.4. Repeated Metadata Items Objects with the same “property” and “group” name can appear multiple times within a metadata file. For example, you could bind two separate classes to the same prefix, with each having potentially overlapping property names. While the same names appearing in the metadata multiple times should not be common, consumers of metadata should take care to ensure that they support it. 2. Providing Manual Hints -------------------------- To improve the user experience and further assist the user in configuring a given property, you can provide additional metadata that: * Describes the list of potential values for a property. * Associates a provider, to attach a well defined semantic to a property, so that a tool can discover the list of potential values based on the project’s context. ### 2.1. Value Hint The `name` attribute of each hint refers to the `name` of a property. In the [initial example shown earlier](#appendix.configuration-metadata.format), we provide five values for the `spring.jpa.hibernate.ddl-auto` property: `none`, `validate`, `update`, `create`, and `create-drop`. Each value may have a description as well. If your property is of type `Map`, you can provide hints for both the keys and the values (but not for the map itself). The special `.keys` and `.values` suffixes must refer to the keys and the values, respectively. Assume a `my.contexts` maps magic `String` values to an integer, as shown in the following example: ``` import java.util.Map; import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties("my") public class MyProperties { private Map<String, Integer> contexts; // getters/setters ... public Map<String, Integer> getContexts() { return this.contexts; } public void setContexts(Map<String, Integer> contexts) { this.contexts = contexts; } } ``` The magic values are (in this example) are `sample1` and `sample2`. 
In order to offer additional content assistance for the keys, you could add the following JSON to [the manual metadata of the module](#appendix.configuration-metadata.annotation-processor.adding-additional-metadata): ``` {"hints": [ { "name": "my.contexts.keys", "values": [ { "value": "sample1" }, { "value": "sample2" } ] } ]} ``` | | | | --- | --- | | | We recommend that you use an `Enum` for those two values instead. If your IDE supports it, this is by far the most effective approach to auto-completion. | ### 2.2. Value Providers Providers are a powerful way to attach semantics to a property. In this section, we define the official providers that you can use for your own hints. However, your favorite IDE may implement some of these or none of them. Also, it could eventually provide its own. | | | | --- | --- | | | As this is a new feature, IDE vendors must catch up with how it works. Adoption times naturally vary. | The following table summarizes the list of supported providers: | Name | Description | | --- | --- | | `any` | Permits any additional value to be provided. | | `class-reference` | Auto-completes the classes available in the project. Usually constrained by a base class that is specified by the `target` parameter. | | `handle-as` | Handles the property as if it were defined by the type defined by the mandatory `target` parameter. | | `logger-name` | Auto-completes valid logger names and [logger groups](features#features.logging.log-groups). Typically, package and class names available in the current project can be auto-completed as well as defined groups. | | `spring-bean-reference` | Auto-completes the available bean names in the current project. Usually constrained by a base class that is specified by the `target` parameter. | | `spring-profile-name` | Auto-completes the available Spring profile names in the project. | | | | | --- | --- | | | Only one provider can be active for a given property, but you can specify several providers if they can all manage the property *in some way*. Make sure to place the most powerful provider first, as the IDE must use the first one in the JSON section that it can handle. If no provider for a given property is supported, no special content assistance is provided, either. | #### 2.2.1. Any The special **any** provider value permits any additional values to be provided. Regular value validation based on the property type should be applied if this is supported. This provider is typically used if you have a list of values and any extra values should still be considered as valid. The following example offers `on` and `off` as auto-completion values for `system.state`: ``` {"hints": [ { "name": "system.state", "values": [ { "value": "on" }, { "value": "off" } ], "providers": [ { "name": "any" } ] } ]} ``` Note that, in the preceding example, any other value is also allowed. #### 2.2.2. Class Reference The **class-reference** provider auto-completes classes available in the project. This provider supports the following parameters: | Parameter | Type | Default value | Description | | --- | --- | --- | --- | | `target` | `String` (`Class`) | *none* | The fully qualified name of the class that should be assignable to the chosen value. Typically used to filter out-non candidate classes. Note that this information can be provided by the type itself by exposing a class with the appropriate upper bound. | | `concrete` | `boolean` | true | Specify whether only concrete classes are to be considered as valid candidates. 
| The following metadata snippet corresponds to the standard `server.servlet.jsp.class-name` property that defines the `JspServlet` class name to use: ``` {"hints": [ { "name": "server.servlet.jsp.class-name", "providers": [ { "name": "class-reference", "parameters": { "target": "javax.servlet.http.HttpServlet" } } ] } ]} ``` #### 2.2.3. Handle As The **handle-as** provider lets you substitute the type of the property to a more high-level type. This typically happens when the property has a `java.lang.String` type, because you do not want your configuration classes to rely on classes that may not be on the classpath. This provider supports the following parameters: | Parameter | Type | Default value | Description | | --- | --- | --- | --- | | **`target`** | `String` (`Class`) | *none* | The fully qualified name of the type to consider for the property. This parameter is mandatory. | The following types can be used: * Any `java.lang.Enum`: Lists the possible values for the property. (We recommend defining the property with the `Enum` type, as no further hint should be required for the IDE to auto-complete the values) * `java.nio.charset.Charset`: Supports auto-completion of charset/encoding values (such as `UTF-8`) * `java.util.Locale`: auto-completion of locales (such as `en_US`) * `org.springframework.util.MimeType`: Supports auto-completion of content type values (such as `text/plain`) * `org.springframework.core.io.Resource`: Supports auto-completion of Spring’s Resource abstraction to refer to a file on the filesystem or on the classpath (such as `classpath:/sample.properties`) | | | | --- | --- | | | If multiple values can be provided, use a `Collection` or *Array* type to teach the IDE about it. | The following metadata snippet corresponds to the standard `spring.liquibase.change-log` property that defines the path to the changelog to use. It is actually used internally as a `org.springframework.core.io.Resource` but cannot be exposed as such, because we need to keep the original String value to pass it to the Liquibase API. ``` {"hints": [ { "name": "spring.liquibase.change-log", "providers": [ { "name": "handle-as", "parameters": { "target": "org.springframework.core.io.Resource" } } ] } ]} ``` #### 2.2.4. Logger Name The **logger-name** provider auto-completes valid logger names and [logger groups](features#features.logging.log-groups). Typically, package and class names available in the current project can be auto-completed. If groups are enabled (default) and if a custom logger group is identified in the configuration, auto-completion for it should be provided. Specific frameworks may have extra magic logger names that can be supported as well. This provider supports the following parameters: | Parameter | Type | Default value | Description | | --- | --- | --- | --- | | `group` | `boolean` | `true` | Specify whether known groups should be considered. | Since a logger name can be any arbitrary name, this provider should allow any value but could highlight valid package and class names that are not available in the project’s classpath. The following metadata snippet corresponds to the standard `logging.level` property. Keys are *logger names*, and values correspond to the standard log levels or any custom level. As Spring Boot defines a few logger groups out-of-the-box, dedicated value hints have been added for those. ``` {"hints": [ { "name": "logging.level.keys", "values": [ { "value": "root", "description": "Root logger used to assign the default logging level." 
}, { "value": "sql", "description": "SQL logging group including Hibernate SQL logger." }, { "value": "web", "description": "Web logging group including codecs." } ], "providers": [ { "name": "logger-name" } ] }, { "name": "logging.level.values", "values": [ { "value": "trace" }, { "value": "debug" }, { "value": "info" }, { "value": "warn" }, { "value": "error" }, { "value": "fatal" }, { "value": "off" } ], "providers": [ { "name": "any" } ] } ]} ``` #### 2.2.5. Spring Bean Reference The **spring-bean-reference** provider auto-completes the beans that are defined in the configuration of the current project. This provider supports the following parameters: | Parameter | Type | Default value | Description | | --- | --- | --- | --- | | `target` | `String` (`Class`) | *none* | The fully qualified name of the bean class that should be assignable to the candidate. Typically used to filter out non-candidate beans. | The following metadata snippet corresponds to the standard `spring.jmx.server` property that defines the name of the `MBeanServer` bean to use: ``` {"hints": [ { "name": "spring.jmx.server", "providers": [ { "name": "spring-bean-reference", "parameters": { "target": "javax.management.MBeanServer" } } ] } ]} ``` | | | | --- | --- | | | The binder is not aware of the metadata. If you provide that hint, you still need to transform the bean name into an actual Bean reference using by the `ApplicationContext`. | #### 2.2.6. Spring Profile Name The **spring-profile-name** provider auto-completes the Spring profiles that are defined in the configuration of the current project. The following metadata snippet corresponds to the standard `spring.profiles.active` property that defines the name of the Spring profile(s) to enable: ``` {"hints": [ { "name": "spring.profiles.active", "providers": [ { "name": "spring-profile-name" } ] } ]} ``` 3. Generating Your Own Metadata by Using the Annotation Processor ------------------------------------------------------------------ You can easily generate your own configuration metadata file from items annotated with `@ConfigurationProperties` by using the `spring-boot-configuration-processor` jar. The jar includes a Java annotation processor which is invoked as your project is compiled. ### 3.1. Configuring the Annotation Processor To use the processor, include a dependency on `spring-boot-configuration-processor`. With Maven the dependency should be declared as optional, as shown in the following example: ``` <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-configuration-processor</artifactId> <optional>true</optional> </dependency> ``` With Gradle, the dependency should be declared in the `annotationProcessor` configuration, as shown in the following example: ``` dependencies { annotationProcessor "org.springframework.boot:spring-boot-configuration-processor" } ``` If you are using an `additional-spring-configuration-metadata.json` file, the `compileJava` task should be configured to depend on the `processResources` task, as shown in the following example: ``` tasks.named('compileJava') { inputs.files(tasks.named('processResources')) } ``` This dependency ensures that the additional metadata is available when the annotation processor runs during compilation. | | | | --- | --- | | | If you are using AspectJ in your project, you need to make sure that the annotation processor runs only once. There are several ways to do this. 
With Maven, you can configure the `maven-apt-plugin` explicitly and add the dependency to the annotation processor only there. You could also let the AspectJ plugin run all the processing and disable annotation processing in the `maven-compiler-plugin` configuration, as follows: ``` <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-compiler-plugin</artifactId> <configuration> <proc>none</proc> </configuration> </plugin> ``` | ### 3.2. Automatic Metadata Generation The processor picks up both classes and methods that are annotated with `@ConfigurationProperties`. If the class is also annotated with `@ConstructorBinding`, a single constructor is expected and one property is created per constructor parameter. Otherwise, properties are discovered through the presence of standard getters and setters with special handling for collection and map types (that is detected even if only a getter is present). The annotation processor also supports the use of the `@Data`, `@Value`, `@Getter`, and `@Setter` lombok annotations. Consider the following example: ``` import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties(prefix = "my.server") public class MyServerProperties { /\*\* \* Name of the server. \*/ private String name; /\*\* \* IP address to listen to. \*/ private String ip = "127.0.0.1"; /\*\* \* Port to listen to. \*/ private int port = 9797; // getters/setters ... public String getName() { return this.name; } public void setName(String name) { this.name = name; } public String getIp() { return this.ip; } public void setIp(String ip) { this.ip = ip; } public int getPort() { return this.port; } public void setPort(int port) { this.port = port; } } ``` This exposes three properties where `my.server.name` has no default and `my.server.ip` and `my.server.port` default to `"127.0.0.1"` and `9797` respectively. The Javadoc on fields is used to populate the `description` attribute. For instance, the description of `my.server.ip` is "IP address to listen to.". | | | | --- | --- | | | You should only use plain text with `@ConfigurationProperties` field Javadoc, since it is not processed before being added to the JSON. | The annotation processor applies a number of heuristics to extract the default value from the source model. Default values have to be provided statically. In particular, do not refer to a constant defined in another class. Also, the annotation processor cannot auto-detect default values for `Enum`s and `Collection`s. For cases where the default value cannot be detected, [manual metadata](#appendix.configuration-metadata.annotation-processor.adding-additional-metadata) should be provided. Consider the following example: ``` import java.util.ArrayList; import java.util.Arrays; import java.util.List; import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties(prefix = "my.messaging") public class MyMessagingProperties { private List<String> addresses = new ArrayList<>(Arrays.asList("a", "b")); private ContainerType containerType = ContainerType.SIMPLE; // getters/setters ...
public List<String> getAddresses() { return this.addresses; } public void setAddresses(List<String> addresses) { this.addresses = addresses; } public ContainerType getContainerType() { return this.containerType; } public void setContainerType(ContainerType containerType) { this.containerType = containerType; } public enum ContainerType { SIMPLE, DIRECT } } ``` In order to document default values for properties in the class above, you could add the following content to [the manual metadata of the module](#appendix.configuration-metadata.annotation-processor.adding-additional-metadata): ``` {"properties": [ { "name": "my.messaging.addresses", "defaultValue": ["a", "b"] }, { "name": "my.messaging.container-type", "defaultValue": "simple" } ]} ``` | | | | --- | --- | | | Only the `name` of the property is required to document additional metadata for existing properties. | #### 3.2.1. Nested Properties The annotation processor automatically considers inner classes as nested properties. Rather than documenting the `ip` and `port` at the root of the namespace, we could create a sub-namespace for it. Consider the updated example: ``` import org.springframework.boot.context.properties.ConfigurationProperties; @ConfigurationProperties(prefix = "my.server") public class MyServerProperties { private String name; private Host host; // getters/setters ... public String getName() { return this.name; } public void setName(String name) { this.name = name; } public Host getHost() { return this.host; } public void setHost(Host host) { this.host = host; } public static class Host { private String ip; private int port; // getters/setters ... public String getIp() { return this.ip; } public void setIp(String ip) { this.ip = ip; } public int getPort() { return this.port; } public void setPort(int port) { this.port = port; } } } ``` The preceding example produces metadata information for `my.server.name`, `my.server.host.ip`, and `my.server.host.port` properties. You can use the `@NestedConfigurationProperty` annotation on a field to indicate that a regular (non-inner) class should be treated as if it were nested. | | | | --- | --- | | | This has no effect on collections and maps, as those types are automatically identified, and a single metadata property is generated for each of them. | ### 3.3. Adding Additional Metadata Spring Boot’s configuration file handling is quite flexible, and it is often the case that properties may exist that are not bound to a `@ConfigurationProperties` bean. You may also need to tune some attributes of an existing key. To support such cases and let you provide custom "hints", the annotation processor automatically merges items from `META-INF/additional-spring-configuration-metadata.json` into the main metadata file. If you refer to a property that has been detected automatically, the description, default value, and deprecation information are overridden, if specified. If the manual property declaration is not identified in the current module, it is added as a new property. The format of the `additional-spring-configuration-metadata.json` file is exactly the same as the regular `spring-configuration-metadata.json`. The additional properties file is optional. If you do not have any additional properties, do not add the file.
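For example, assuming the `MyServerProperties` class shown earlier, the following sketch of a `META-INF/additional-spring-configuration-metadata.json` file adds one property that is not bound to any `@ConfigurationProperties` bean and overrides the description of the detected `my.server.name` property. The `my.custom.mode` key and its values are purely illustrative:

```
{"properties": [
  {
    "name": "my.custom.mode",
    "type": "java.lang.String",
    "description": "Operating mode used by a component that reads the Environment directly.",
    "defaultValue": "standard"
  },
  {
    "name": "my.server.name",
    "description": "Name of the server, overriding the Javadoc-derived description."
  }
]}
```

Because `my.server.name` is already detected by the annotation processor, only its description is overridden here; the attributes that are not specified (such as the type and default value) are kept as generated.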
spring_boot Upgrading Spring Boot Upgrading Spring Boot ===================== Instructions for how to upgrade from earlier versions of Spring Boot are provided on the project [wiki](https://github.com/spring-projects/spring-boot/wiki). Follow the links in the [release notes](https://github.com/spring-projects/spring-boot/wiki#release-notes) section to find the version that you want to upgrade to. Upgrading instructions are always the first item in the release notes. If you are more than one release behind, please make sure that you also review the release notes of the versions that you jumped. 1. Upgrading from 1.x ---------------------- If you are upgrading from the `1.x` release of Spring Boot, check the [“migration guide” on the project wiki](https://github.com/spring-projects/spring-boot/wiki/Spring-Boot-2.0-Migration-Guide) that provides detailed upgrade instructions. Check also the [“release notes”](https://github.com/spring-projects/spring-boot/wiki) for a list of “new and noteworthy” features for each release. 2. Upgrading to a new feature release -------------------------------------- When upgrading to a new feature release, some properties may have been renamed or removed. Spring Boot provides a way to analyze your application’s environment and print diagnostics at startup, but also temporarily migrate properties at runtime for you. To enable that feature, add the following dependency to your project: ``` <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-properties-migrator</artifactId> <scope>runtime</scope> </dependency> ``` | | | | --- | --- | | | Properties that are added late to the environment, such as when using `@PropertySource`, will not be taken into account. | | | | | --- | --- | | | Once you finish the migration, please make sure to remove this module from your project’s dependencies. | 3. Upgrading the Spring Boot CLI --------------------------------- To upgrade an existing CLI installation, use the appropriate package manager command (for example, `brew upgrade`). If you manually installed the CLI, follow the [standard instructions](getting-started#getting-started.installing.cli.manual-installation), remembering to update your `PATH` environment variable to remove any older references. 4. What to Read Next --------------------- Once you’ve decided to upgrade your application, you can find detailed information regarding specific features in the rest of the document. Spring Boot’s documentation is specific to that version, so any information that you find in here will contain the most up-to-date changes that are in that version. spring_boot Messaging Messaging ========= The Spring Framework provides extensive support for integrating with messaging systems, from simplified use of the JMS API using `JmsTemplate` to a complete infrastructure to receive messages asynchronously. Spring AMQP provides a similar feature set for the Advanced Message Queuing Protocol. Spring Boot also provides auto-configuration options for `RabbitTemplate` and RabbitMQ. Spring WebSocket natively includes support for STOMP messaging, and Spring Boot has support for that through starters and a small amount of auto-configuration. Spring Boot also has support for Apache Kafka. 1. JMS ------- The `javax.jms.ConnectionFactory` interface provides a standard method of creating a `javax.jms.Connection` for interacting with a JMS broker. 
Although Spring needs a `ConnectionFactory` to work with JMS, you generally need not use it directly yourself and can instead rely on higher level messaging abstractions. (See the [relevant section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/integration.html#jms) of the Spring Framework reference documentation for details.) Spring Boot also auto-configures the necessary infrastructure to send and receive messages. ### 1.1. ActiveMQ Support When [ActiveMQ](https://activemq.apache.org/) is available on the classpath, Spring Boot can also configure a `ConnectionFactory`. If the broker is present, an embedded broker is automatically started and configured (provided no broker URL is specified through configuration and the embedded broker is not disabled in the configuration). | | | | --- | --- | | | If you use `spring-boot-starter-activemq`, the necessary dependencies to connect or embed an ActiveMQ instance are provided, as is the Spring infrastructure to integrate with JMS. | ActiveMQ configuration is controlled by external configuration properties in `spring.activemq.*`. By default, ActiveMQ is auto-configured to use the [VM transport](https://activemq.apache.org/vm-transport-reference.html), which starts a broker embedded in the same JVM instance. You can disable the embedded broker by configuring the `spring.activemq.in-memory` property, as shown in the following example: Properties ``` spring.activemq.in-memory=false ``` Yaml ``` spring: activemq: in-memory: false ``` The embedded broker will also be disabled if you configure the broker URL, as shown in the following example: Properties ``` spring.activemq.broker-url=tcp://192.168.1.210:9876 spring.activemq.user=admin spring.activemq.password=secret ``` Yaml ``` spring: activemq: broker-url: "tcp://192.168.1.210:9876" user: "admin" password: "secret" ``` If you want to take full control over the embedded broker, see [the ActiveMQ documentation](https://activemq.apache.org/how-do-i-embed-a-broker-inside-a-connection.html) for further information. By default, a `CachingConnectionFactory` wraps the native `ConnectionFactory` with sensible settings that you can control by external configuration properties in `spring.jms.*`: Properties ``` spring.jms.cache.session-cache-size=5 ``` Yaml ``` spring: jms: cache: session-cache-size: 5 ``` If you’d rather use native pooling, you can do so by adding a dependency to `org.messaginghub:pooled-jms` and configuring the `JmsPoolConnectionFactory` accordingly, as shown in the following example: Properties ``` spring.activemq.pool.enabled=true spring.activemq.pool.max-connections=50 ``` Yaml ``` spring: activemq: pool: enabled: true max-connections: 50 ``` | | | | --- | --- | | | See [`ActiveMQProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jms/activemq/ActiveMQProperties.java) for more of the supported options. You can also register an arbitrary number of beans that implement `ActiveMQConnectionFactoryCustomizer` for more advanced customizations. | By default, ActiveMQ creates a destination if it does not yet exist so that destinations are resolved against their provided names. ### 1.2. ActiveMQ Artemis Support Spring Boot can auto-configure a `ConnectionFactory` when it detects that [ActiveMQ Artemis](https://activemq.apache.org/components/artemis/) is available on the classpath. 
If the broker is present, an embedded broker is automatically started and configured (unless the mode property has been explicitly set). The supported modes are `embedded` (to make explicit that an embedded broker is required and that an error should occur if the broker is not available on the classpath) and `native` (to connect to a broker using the `netty` transport protocol). When the latter is configured, Spring Boot configures a `ConnectionFactory` that connects to a broker running on the local machine with the default settings. | | | | --- | --- | | | If you use `spring-boot-starter-artemis`, the necessary dependencies to connect to an existing ActiveMQ Artemis instance are provided, as well as the Spring infrastructure to integrate with JMS. Adding `org.apache.activemq:artemis-jms-server` to your application lets you use embedded mode. | ActiveMQ Artemis configuration is controlled by external configuration properties in `spring.artemis.*`. For example, you might declare the following section in `application.properties`: Properties ``` spring.artemis.mode=native spring.artemis.broker-url=tcp://192.168.1.210:9876 spring.artemis.user=admin spring.artemis.password=secret ``` Yaml ``` spring: artemis: mode: native broker-url: "tcp://192.168.1.210:9876" user: "admin" password: "secret" ``` When embedding the broker, you can choose if you want to enable persistence and list the destinations that should be made available. These can be specified as a comma-separated list to create them with the default options, or you can define bean(s) of type `org.apache.activemq.artemis.jms.server.config.JMSQueueConfiguration` or `org.apache.activemq.artemis.jms.server.config.TopicConfiguration`, for advanced queue and topic configurations, respectively. By default, a `CachingConnectionFactory` wraps the native `ConnectionFactory` with sensible settings that you can control by external configuration properties in `spring.jms.*`: Properties ``` spring.jms.cache.session-cache-size=5 ``` Yaml ``` spring: jms: cache: session-cache-size: 5 ``` If you’d rather use native pooling, you can do so by adding a dependency to `org.messaginghub:pooled-jms` and configuring the `JmsPoolConnectionFactory` accordingly, as shown in the following example: Properties ``` spring.artemis.pool.enabled=true spring.artemis.pool.max-connections=50 ``` Yaml ``` spring: artemis: pool: enabled: true max-connections: 50 ``` See [`ArtemisProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jms/artemis/ArtemisProperties.java) for more supported options. No JNDI lookup is involved, and destinations are resolved against their names, using either the `name` attribute in the Artemis configuration or the names provided through configuration. ### 1.3. Using a JNDI ConnectionFactory If you are running your application in an application server, Spring Boot tries to locate a JMS `ConnectionFactory` by using JNDI. By default, the `java:/JmsXA` and `java:/XAConnectionFactory` location are checked. You can use the `spring.jms.jndi-name` property if you need to specify an alternative location, as shown in the following example: Properties ``` spring.jms.jndi-name=java:/MyConnectionFactory ``` Yaml ``` spring: jms: jndi-name: "java:/MyConnectionFactory" ``` ### 1.4. 
Sending a Message Spring’s `JmsTemplate` is auto-configured, and you can autowire it directly into your own beans, as shown in the following example: Java ``` import org.springframework.jms.core.JmsTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final JmsTemplate jmsTemplate; public MyBean(JmsTemplate jmsTemplate) { this.jmsTemplate = jmsTemplate; } // ... public void someMethod() { this.jmsTemplate.convertAndSend("hello"); } } ``` Kotlin ``` import org.springframework.jms.core.JmsTemplate import org.springframework.stereotype.Component @Component class MyBean(private val jmsTemplate: JmsTemplate) { // ... fun someMethod() { jmsTemplate.convertAndSend("hello") } } ``` | | | | --- | --- | | | [`JmsMessagingTemplate`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/jms/core/JmsMessagingTemplate.html) can be injected in a similar manner. If a `DestinationResolver` or a `MessageConverter` bean is defined, it is associated automatically to the auto-configured `JmsTemplate`. | ### 1.5. Receiving a Message When the JMS infrastructure is present, any bean can be annotated with `@JmsListener` to create a listener endpoint. If no `JmsListenerContainerFactory` has been defined, a default one is configured automatically. If a `DestinationResolver`, a `MessageConverter`, or a `javax.jms.ExceptionListener` beans are defined, they are associated automatically with the default factory. By default, the default factory is transactional. If you run in an infrastructure where a `JtaTransactionManager` is present, it is associated to the listener container by default. If not, the `sessionTransacted` flag is enabled. In that latter scenario, you can associate your local data store transaction to the processing of an incoming message by adding `@Transactional` on your listener method (or a delegate thereof). This ensures that the incoming message is acknowledged, once the local transaction has completed. This also includes sending response messages that have been performed on the same JMS session. The following component creates a listener endpoint on the `someQueue` destination: Java ``` import org.springframework.jms.annotation.JmsListener; import org.springframework.stereotype.Component; @Component public class MyBean { @JmsListener(destination = "someQueue") public void processMessage(String content) { // ... } } ``` Kotlin ``` import org.springframework.jms.annotation.JmsListener import org.springframework.stereotype.Component @Component class MyBean { @JmsListener(destination = "someQueue") fun processMessage(content: String?) { // ... } } ``` | | | | --- | --- | | | See [the Javadoc of `@EnableJms`](https://docs.spring.io/spring-framework/docs/5.3.20/javadoc-api/org/springframework/jms/annotation/EnableJms.html) for more details. | If you need to create more `JmsListenerContainerFactory` instances or if you want to override the default, Spring Boot provides a `DefaultJmsListenerContainerFactoryConfigurer` that you can use to initialize a `DefaultJmsListenerContainerFactory` with the same settings as the one that is auto-configured. 
For instance, the following example exposes another factory that uses a specific `MessageConverter`: Java ``` import javax.jms.ConnectionFactory; import org.springframework.boot.autoconfigure.jms.DefaultJmsListenerContainerFactoryConfigurer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.jms.config.DefaultJmsListenerContainerFactory; @Configuration(proxyBeanMethods = false) public class MyJmsConfiguration { @Bean public DefaultJmsListenerContainerFactory myFactory(DefaultJmsListenerContainerFactoryConfigurer configurer) { DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory(); ConnectionFactory connectionFactory = getCustomConnectionFactory(); configurer.configure(factory, connectionFactory); factory.setMessageConverter(new MyMessageConverter()); return factory; } private ConnectionFactory getCustomConnectionFactory() { return ... } } ``` Kotlin ``` import org.springframework.boot.autoconfigure.jms.DefaultJmsListenerContainerFactoryConfigurer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.jms.config.DefaultJmsListenerContainerFactory import javax.jms.ConnectionFactory @Configuration(proxyBeanMethods = false) class MyJmsConfiguration { @Bean fun myFactory(configurer: DefaultJmsListenerContainerFactoryConfigurer): DefaultJmsListenerContainerFactory { val factory = DefaultJmsListenerContainerFactory() val connectionFactory = getCustomConnectionFactory() configurer.configure(factory, connectionFactory) factory.setMessageConverter(MyMessageConverter()) return factory } fun getCustomConnectionFactory() : ConnectionFactory? { return ... } } ``` Then you can use the factory in any `@JmsListener`-annotated method as follows: Java ``` import org.springframework.jms.annotation.JmsListener; import org.springframework.stereotype.Component; @Component public class MyBean { @JmsListener(destination = "someQueue", containerFactory = "myFactory") public void processMessage(String content) { // ... } } ``` Kotlin ``` import org.springframework.jms.annotation.JmsListener import org.springframework.stereotype.Component @Component class MyBean { @JmsListener(destination = "someQueue", containerFactory = "myFactory") fun processMessage(content: String?) { // ... } } ``` 2. AMQP -------- The Advanced Message Queuing Protocol (AMQP) is a platform-neutral, wire-level protocol for message-oriented middleware. The Spring AMQP project applies core Spring concepts to the development of AMQP-based messaging solutions. Spring Boot offers several conveniences for working with AMQP through RabbitMQ, including the `spring-boot-starter-amqp` “Starter”. ### 2.1. RabbitMQ support [RabbitMQ](https://www.rabbitmq.com/) is a lightweight, reliable, scalable, and portable message broker based on the AMQP protocol. Spring uses `RabbitMQ` to communicate through the AMQP protocol. RabbitMQ configuration is controlled by external configuration properties in `spring.rabbitmq.*`. 
For example, you might declare the following section in `application.properties`: Properties ``` spring.rabbitmq.host=localhost spring.rabbitmq.port=5672 spring.rabbitmq.username=admin spring.rabbitmq.password=secret ``` Yaml ``` spring: rabbitmq: host: "localhost" port: 5672 username: "admin" password: "secret" ``` Alternatively, you could configure the same connection using the `addresses` attribute: Properties ``` spring.rabbitmq.addresses=amqp://admin:secret@localhost ``` Yaml ``` spring: rabbitmq: addresses: "amqp://admin:secret@localhost" ``` | | | | --- | --- | | | When specifying addresses that way, the `host` and `port` properties are ignored. If the address uses the `amqps` protocol, SSL support is enabled automatically. | See [`RabbitProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/amqp/RabbitProperties.java) for more of the supported property-based configuration options. To configure lower-level details of the RabbitMQ `ConnectionFactory` that is used by Spring AMQP, define a `ConnectionFactoryCustomizer` bean. If a `ConnectionNameStrategy` bean exists in the context, it will be automatically used to name connections created by the auto-configured `CachingConnectionFactory`. | | | | --- | --- | | | See [Understanding AMQP, the protocol used by RabbitMQ](https://spring.io/blog/2010/06/14/understanding-amqp-the-protocol-used-by-rabbitmq/) for more details. | ### 2.2. Sending a Message Spring’s `AmqpTemplate` and `AmqpAdmin` are auto-configured, and you can autowire them directly into your own beans, as shown in the following example: Java ``` import org.springframework.amqp.core.AmqpAdmin; import org.springframework.amqp.core.AmqpTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final AmqpAdmin amqpAdmin; private final AmqpTemplate amqpTemplate; public MyBean(AmqpAdmin amqpAdmin, AmqpTemplate amqpTemplate) { this.amqpAdmin = amqpAdmin; this.amqpTemplate = amqpTemplate; } // ... public void someMethod() { this.amqpAdmin.getQueueInfo("someQueue"); } public void someOtherMethod() { this.amqpTemplate.convertAndSend("hello"); } } ``` Kotlin ``` import org.springframework.amqp.core.AmqpAdmin import org.springframework.amqp.core.AmqpTemplate import org.springframework.stereotype.Component @Component class MyBean(private val amqpAdmin: AmqpAdmin, private val amqpTemplate: AmqpTemplate) { // ... fun someMethod() { amqpAdmin.getQueueInfo("someQueue") } fun someOtherMethod() { amqpTemplate.convertAndSend("hello") } } ``` | | | | --- | --- | | | [`RabbitMessagingTemplate`](https://docs.spring.io/spring-amqp/docs/2.4.5/api/org/springframework/amqp/rabbit/core/RabbitMessagingTemplate.html) can be injected in a similar manner. If a `MessageConverter` bean is defined, it is associated automatically to the auto-configured `AmqpTemplate`. | If necessary, any `org.springframework.amqp.core.Queue` that is defined as a bean is automatically used to declare a corresponding queue on the RabbitMQ instance. To retry operations, you can enable retries on the `AmqpTemplate` (for example, in the event that the broker connection is lost): Properties ``` spring.rabbitmq.template.retry.enabled=true spring.rabbitmq.template.retry.initial-interval=2s ``` Yaml ``` spring: rabbitmq: template: retry: enabled: true initial-interval: "2s" ``` Retries are disabled by default. 
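Beyond enabling retries and setting the initial interval, the other `spring.rabbitmq.template.retry.*` properties control how many attempts are made and how the back-off grows between them. The following sketch is illustrative only (the values are arbitrary; see `RabbitProperties` for the complete list of retry settings):

Properties ``` spring.rabbitmq.template.retry.max-attempts=5 spring.rabbitmq.template.retry.multiplier=2 spring.rabbitmq.template.retry.max-interval=10s ``` Yaml ``` spring: rabbitmq: template: retry: max-attempts: 5 multiplier: 2 max-interval: "10s" ```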
You can also customize the `RetryTemplate` programmatically by declaring a `RabbitRetryTemplateCustomizer` bean. If you need to create more `RabbitTemplate` instances or if you want to override the default, Spring Boot provides a `RabbitTemplateConfigurer` bean that you can use to initialize a `RabbitTemplate` with the same settings as the factories used by the auto-configuration. ### 2.3. Sending a Message To A Stream To send a message to a particular stream, specify the name of the stream, as shown in the following example: Properties ``` spring.rabbitmq.stream.name=my-stream ``` Yaml ``` spring: rabbitmq: stream: name: "my-stream" ``` If a `MessageConverter`, `StreamMessageConverter`, or `ProducerCustomizer` bean is defined, it is associated automatically to the auto-configured `RabbitStreamTemplate`. If you need to create more `RabbitStreamTemplate` instances or if you want to override the default, Spring Boot provides a `RabbitStreamTemplateConfigurer` bean that you can use to initialize a `RabbitStreamTemplate` with the same settings as the factories used by the auto-configuration. ### 2.4. Receiving a Message When the Rabbit infrastructure is present, any bean can be annotated with `@RabbitListener` to create a listener endpoint. If no `RabbitListenerContainerFactory` has been defined, a default `SimpleRabbitListenerContainerFactory` is automatically configured and you can switch to a direct container using the `spring.rabbitmq.listener.type` property. If a `MessageConverter` or a `MessageRecoverer` bean is defined, it is automatically associated with the default factory. The following sample component creates a listener endpoint on the `someQueue` queue: Java ``` import org.springframework.amqp.rabbit.annotation.RabbitListener; import org.springframework.stereotype.Component; @Component public class MyBean { @RabbitListener(queues = "someQueue") public void processMessage(String content) { // ... } } ``` Kotlin ``` import org.springframework.amqp.rabbit.annotation.RabbitListener import org.springframework.stereotype.Component @Component class MyBean { @RabbitListener(queues = ["someQueue"]) fun processMessage(content: String?) { // ... } } ``` | | | | --- | --- | | | See [the Javadoc of `@EnableRabbit`](https://docs.spring.io/spring-amqp/docs/2.4.5/api/org/springframework/amqp/rabbit/annotation/EnableRabbit.html) for more details. | If you need to create more `RabbitListenerContainerFactory` instances or if you want to override the default, Spring Boot provides a `SimpleRabbitListenerContainerFactoryConfigurer` and a `DirectRabbitListenerContainerFactoryConfigurer` that you can use to initialize a `SimpleRabbitListenerContainerFactory` and a `DirectRabbitListenerContainerFactory` with the same settings as the factories used by the auto-configuration. | | | | --- | --- | | | It does not matter which container type you chose. Those two beans are exposed by the auto-configuration. 
| For instance, the following configuration class exposes another factory that uses a specific `MessageConverter`: Java ``` import org.springframework.amqp.rabbit.config.SimpleRabbitListenerContainerFactory; import org.springframework.amqp.rabbit.connection.ConnectionFactory; import org.springframework.boot.autoconfigure.amqp.SimpleRabbitListenerContainerFactoryConfigurer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @Configuration(proxyBeanMethods = false) public class MyRabbitConfiguration { @Bean public SimpleRabbitListenerContainerFactory myFactory(SimpleRabbitListenerContainerFactoryConfigurer configurer) { SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory(); ConnectionFactory connectionFactory = getCustomConnectionFactory(); configurer.configure(factory, connectionFactory); factory.setMessageConverter(new MyMessageConverter()); return factory; } private ConnectionFactory getCustomConnectionFactory() { return ... } } ``` Kotlin ``` import org.springframework.amqp.rabbit.config.SimpleRabbitListenerContainerFactory import org.springframework.amqp.rabbit.connection.ConnectionFactory import org.springframework.boot.autoconfigure.amqp.SimpleRabbitListenerContainerFactoryConfigurer import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration @Configuration(proxyBeanMethods = false) class MyRabbitConfiguration { @Bean fun myFactory(configurer: SimpleRabbitListenerContainerFactoryConfigurer): SimpleRabbitListenerContainerFactory { val factory = SimpleRabbitListenerContainerFactory() val connectionFactory = getCustomConnectionFactory() configurer.configure(factory, connectionFactory) factory.setMessageConverter(MyMessageConverter()) return factory } fun getCustomConnectionFactory() : ConnectionFactory? { return ... } } ``` Then you can use the factory in any `@RabbitListener`-annotated method, as follows: Java ``` import org.springframework.amqp.rabbit.annotation.RabbitListener; import org.springframework.stereotype.Component; @Component public class MyBean { @RabbitListener(queues = "someQueue", containerFactory = "myFactory") public void processMessage(String content) { // ... } } ``` Kotlin ``` import org.springframework.amqp.rabbit.annotation.RabbitListener import org.springframework.stereotype.Component @Component class MyBean { @RabbitListener(queues = ["someQueue"], containerFactory = "myFactory") fun processMessage(content: String?) { // ... } } ``` You can enable retries to handle situations where your listener throws an exception. By default, `RejectAndDontRequeueRecoverer` is used, but you can define a `MessageRecoverer` of your own. When retries are exhausted, the message is rejected and either dropped or routed to a dead-letter exchange if the broker is configured to do so. By default, retries are disabled. You can also customize the `RetryTemplate` programmatically by declaring a `RabbitRetryTemplateCustomizer` bean. | | | | --- | --- | | | By default, if retries are disabled and the listener throws an exception, the delivery is retried indefinitely. You can modify this behavior in two ways: Set the `defaultRequeueRejected` property to `false` so that zero re-deliveries are attempted or throw an `AmqpRejectAndDontRequeueException` to signal the message should be rejected. The latter is the mechanism used when retries are enabled and the maximum number of delivery attempts is reached. | 3. 
Apache Kafka Support ------------------------ [Apache Kafka](https://kafka.apache.org/) is supported by providing auto-configuration of the `spring-kafka` project. Kafka configuration is controlled by external configuration properties in `spring.kafka.*`. For example, you might declare the following section in `application.properties`: Properties ``` spring.kafka.bootstrap-servers=localhost:9092 spring.kafka.consumer.group-id=myGroup ``` Yaml ``` spring: kafka: bootstrap-servers: "localhost:9092" consumer: group-id: "myGroup" ``` | | | | --- | --- | | | To create a topic on startup, add a bean of type `NewTopic`. If the topic already exists, the bean is ignored. | See [`KafkaProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/kafka/KafkaProperties.java) for more supported options. ### 3.1. Sending a Message Spring’s `KafkaTemplate` is auto-configured, and you can autowire it directly in your own beans, as shown in the following example: Java ``` import org.springframework.kafka.core.KafkaTemplate; import org.springframework.stereotype.Component; @Component public class MyBean { private final KafkaTemplate<String, String> kafkaTemplate; public MyBean(KafkaTemplate<String, String> kafkaTemplate) { this.kafkaTemplate = kafkaTemplate; } // ... public void someMethod() { this.kafkaTemplate.send("someTopic", "Hello"); } } ``` Kotlin ``` import org.springframework.kafka.core.KafkaTemplate import org.springframework.stereotype.Component @Component class MyBean(private val kafkaTemplate: KafkaTemplate<String, String>) { // ... fun someMethod() { kafkaTemplate.send("someTopic", "Hello") } } ``` | | | | --- | --- | | | If the property `spring.kafka.producer.transaction-id-prefix` is defined, a `KafkaTransactionManager` is automatically configured. Also, if a `RecordMessageConverter` bean is defined, it is automatically associated to the auto-configured `KafkaTemplate`. | ### 3.2. Receiving a Message When the Apache Kafka infrastructure is present, any bean can be annotated with `@KafkaListener` to create a listener endpoint. If no `KafkaListenerContainerFactory` has been defined, a default one is automatically configured with keys defined in `spring.kafka.listener.*`. The following component creates a listener endpoint on the `someTopic` topic: Java ``` import org.springframework.kafka.annotation.KafkaListener; import org.springframework.stereotype.Component; @Component public class MyBean { @KafkaListener(topics = "someTopic") public void processMessage(String content) { // ... } } ``` Kotlin ``` import org.springframework.kafka.annotation.KafkaListener import org.springframework.stereotype.Component @Component class MyBean { @KafkaListener(topics = ["someTopic"]) fun processMessage(content: String?) { // ... } } ``` If a `KafkaTransactionManager` bean is defined, it is automatically associated to the container factory. Similarly, if a `RecordFilterStrategy`, `CommonErrorHandler`, `AfterRollbackProcessor` or `ConsumerAwareRebalanceListener` bean is defined, it is automatically associated to the default factory. Depending on the listener type, a `RecordMessageConverter` or `BatchMessageConverter` bean is associated to the default factory. If only a `RecordMessageConverter` bean is present for a batch listener, it is wrapped in a `BatchMessageConverter`. 
| | | | --- | --- | | | A custom `ChainedKafkaTransactionManager` must be marked `@Primary` as it usually references the auto-configured `KafkaTransactionManager` bean. | ### 3.3. Kafka Streams Spring for Apache Kafka provides a factory bean to create a `StreamsBuilder` object and manage the lifecycle of its streams. Spring Boot auto-configures the required `KafkaStreamsConfiguration` bean as long as `kafka-streams` is on the classpath and Kafka Streams is enabled by the `@EnableKafkaStreams` annotation. Enabling Kafka Streams means that the application id and bootstrap servers must be set. The former can be configured using `spring.kafka.streams.application-id`, defaulting to `spring.application.name` if not set. The latter can be set globally or specifically overridden only for streams. Several additional properties are available using dedicated properties; other arbitrary Kafka properties can be set using the `spring.kafka.streams.properties` namespace. See also [Additional Kafka Properties](#messaging.kafka.additional-properties) for more information. To use the factory bean, wire `StreamsBuilder` into your `@Bean` as shown in the following example: Java ``` import org.apache.kafka.common.serialization.Serdes; import org.apache.kafka.streams.KeyValue; import org.apache.kafka.streams.StreamsBuilder; import org.apache.kafka.streams.kstream.KStream; import org.apache.kafka.streams.kstream.Produced; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.kafka.annotation.EnableKafkaStreams; import org.springframework.kafka.support.serializer.JsonSerde; @Configuration(proxyBeanMethods = false) @EnableKafkaStreams public class MyKafkaStreamsConfiguration { @Bean public KStream<Integer, String> kStream(StreamsBuilder streamsBuilder) { KStream<Integer, String> stream = streamsBuilder.stream("ks1In"); stream.map(this::uppercaseValue).to("ks1Out", Produced.with(Serdes.Integer(), new JsonSerde<>())); return stream; } private KeyValue<Integer, String> uppercaseValue(Integer key, String value) { return new KeyValue<>(key, value.toUpperCase()); } } ``` Kotlin ``` import org.apache.kafka.common.serialization.Serdes import org.apache.kafka.streams.KeyValue import org.apache.kafka.streams.StreamsBuilder import org.apache.kafka.streams.kstream.KStream import org.apache.kafka.streams.kstream.Produced import org.springframework.context.annotation.Bean import org.springframework.context.annotation.Configuration import org.springframework.kafka.annotation.EnableKafkaStreams import org.springframework.kafka.support.serializer.JsonSerde @Configuration(proxyBeanMethods = false) @EnableKafkaStreams class MyKafkaStreamsConfiguration { @Bean fun kStream(streamsBuilder: StreamsBuilder): KStream<Int, String> { val stream = streamsBuilder.stream<Int, String>("ks1In") stream.map(this::uppercaseValue).to("ks1Out", Produced.with(Serdes.Integer(), JsonSerde())) return stream } private fun uppercaseValue(key: Int, value: String): KeyValue<Int?, String?> { return KeyValue(key, value.uppercase()) } } ``` By default, the streams managed by the `StreamBuilder` object it creates are started automatically. You can customize this behavior using the `spring.kafka.streams.auto-startup` property. ### 3.4. Additional Kafka Properties The properties supported by auto configuration are shown in the [“Integration Properties”](application-properties#appendix.application-properties.integration) section of the Appendix. 
Note that, for the most part, these properties (hyphenated or camelCase) map directly to the Apache Kafka dotted properties. See the Apache Kafka documentation for details. The first few of these properties apply to all components (producers, consumers, admins, and streams) but can be specified at the component level if you wish to use different values. Apache Kafka designates properties with an importance of HIGH, MEDIUM, or LOW. Spring Boot auto-configuration supports all HIGH importance properties, some selected MEDIUM and LOW properties, and any properties that do not have a default value. Only a subset of the properties supported by Kafka are available directly through the `KafkaProperties` class. If you wish to configure the producer or consumer with additional properties that are not directly supported, use the following properties: Properties ``` spring.kafka.properties[prop.one]=first spring.kafka.admin.properties[prop.two]=second spring.kafka.consumer.properties[prop.three]=third spring.kafka.producer.properties[prop.four]=fourth spring.kafka.streams.properties[prop.five]=fifth ``` Yaml ``` spring: kafka: properties: "[prop.one]": "first" admin: properties: "[prop.two]": "second" consumer: properties: "[prop.three]": "third" producer: properties: "[prop.four]": "fourth" streams: properties: "[prop.five]": "fifth" ``` This sets the common `prop.one` Kafka property to `first` (applies to producers, consumers and admins), the `prop.two` admin property to `second`, the `prop.three` consumer property to `third`, the `prop.four` producer property to `fourth` and the `prop.five` streams property to `fifth`. You can also configure the Spring Kafka `JsonDeserializer` as follows: Properties ``` spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer spring.kafka.consumer.properties[spring.json.value.default.type]=com.example.Invoice spring.kafka.consumer.properties[spring.json.trusted.packages]=com.example.main,com.example.another ``` Yaml ``` spring: kafka: consumer: value-deserializer: "org.springframework.kafka.support.serializer.JsonDeserializer" properties: "[spring.json.value.default.type]": "com.example.Invoice" "[spring.json.trusted.packages]": "com.example.main,com.example.another" ``` Similarly, you can disable the `JsonSerializer` default behavior of sending type information in headers: Properties ``` spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer spring.kafka.producer.properties[spring.json.add.type.headers]=false ``` Yaml ``` spring: kafka: producer: value-serializer: "org.springframework.kafka.support.serializer.JsonSerializer" properties: "[spring.json.add.type.headers]": false ``` | | | | --- | --- | | | Properties set in this way override any configuration item that Spring Boot explicitly supports. | ### 3.5. Testing with Embedded Kafka Spring for Apache Kafka provides a convenient way to test projects with an embedded Apache Kafka broker. To use this feature, annotate a test class with `@EmbeddedKafka` from the `spring-kafka-test` module. For more information, please see the Spring for Apache Kafka [reference manual](https://docs.spring.io/spring-kafka/docs/2.8.6/reference/html/#embedded-kafka-annotation). To make Spring Boot auto-configuration work with the aforementioned embedded Apache Kafka broker, you need to remap a system property for embedded broker addresses (populated by the `EmbeddedKafkaBroker`) into the Spring Boot configuration property for Apache Kafka. 
There are several ways to do that: * Provide a system property to map embedded broker addresses into `spring.kafka.bootstrap-servers` in the test class: Java ``` static { System.setProperty(EmbeddedKafkaBroker.BROKER_LIST_PROPERTY, "spring.kafka.bootstrap-servers"); } ``` Kotlin ``` init { System.setProperty(EmbeddedKafkaBroker.BROKER_LIST_PROPERTY, "spring.kafka.bootstrap-servers") } ``` * Configure a property name on the `@EmbeddedKafka` annotation: Java ``` import org.springframework.boot.test.context.SpringBootTest; import org.springframework.kafka.test.context.EmbeddedKafka; @SpringBootTest @EmbeddedKafka(topics = "someTopic", bootstrapServersProperty = "spring.kafka.bootstrap-servers") class MyTest { // ... } ``` Kotlin ``` import org.springframework.boot.test.context.SpringBootTest import org.springframework.kafka.test.context.EmbeddedKafka @SpringBootTest @EmbeddedKafka(topics = ["someTopic"], bootstrapServersProperty = "spring.kafka.bootstrap-servers") class MyTest { // ... } ``` * Use a placeholder in configuration properties: Properties ``` spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers} ``` Yaml ``` spring: kafka: bootstrap-servers: "${spring.embedded.kafka.brokers}" ``` 4. RSocket ----------- [RSocket](https://rsocket.io) is a binary protocol for use on byte stream transports. It enables symmetric interaction models through async message passing over a single connection. The `spring-messaging` module of the Spring Framework provides support for RSocket requesters and responders, both on the client and on the server side. See the [RSocket section](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html#rsocket-spring) of the Spring Framework reference for more details, including an overview of the RSocket protocol. ### 4.1. RSocket Strategies Auto-configuration Spring Boot auto-configures an `RSocketStrategies` bean that provides all the required infrastructure for encoding and decoding RSocket payloads. By default, the auto-configuration will try to configure the following (in order): 1. [CBOR](https://cbor.io/) codecs with Jackson 2. JSON codecs with Jackson The `spring-boot-starter-rsocket` starter provides both dependencies. See the [Jackson support section](features#features.json.jackson) to know more about customization possibilities. Developers can customize the `RSocketStrategies` component by creating beans that implement the `RSocketStrategiesCustomizer` interface. Note that their `@Order` is important, as it determines the order of codecs. ### 4.2. RSocket server Auto-configuration Spring Boot provides RSocket server auto-configuration. The required dependencies are provided by the `spring-boot-starter-rsocket`. Spring Boot allows exposing RSocket over WebSocket from a WebFlux server, or standing up an independent RSocket server. This depends on the type of application and its configuration. For WebFlux application (that is of type `WebApplicationType.REACTIVE`), the RSocket server will be plugged into the Web Server only if the following properties match: Properties ``` spring.rsocket.server.mapping-path=/rsocket spring.rsocket.server.transport=websocket ``` Yaml ``` spring: rsocket: server: mapping-path: "/rsocket" transport: "websocket" ``` | | | | --- | --- | | | Plugging RSocket into a web server is only supported with Reactor Netty, as RSocket itself is built with that library. | Alternatively, an RSocket TCP or websocket server is started as an independent, embedded server. 
Besides the dependency requirements, the only required configuration is to define a port for that server: Properties ``` spring.rsocket.server.port=9898 ``` Yaml ``` spring: rsocket: server: port: 9898 ``` ### 4.3. Spring Messaging RSocket support Spring Boot will auto-configure the Spring Messaging infrastructure for RSocket. This means that Spring Boot will create a `RSocketMessageHandler` bean that will handle RSocket requests to your application. ### 4.4. Calling RSocket Services with RSocketRequester Once the `RSocket` channel is established between server and client, any party can send or receive requests to the other. As a server, you can get injected with an `RSocketRequester` instance on any handler method of an RSocket `@Controller`. As a client, you need to configure and establish an RSocket connection first. Spring Boot auto-configures an `RSocketRequester.Builder` for such cases with the expected codecs and applies any `RSocketConnectorConfigurer` bean. The `RSocketRequester.Builder` instance is a prototype bean, meaning each injection point will provide you with a new instance . This is done on purpose since this builder is stateful and you should not create requesters with different setups using the same instance. The following code shows a typical example: Java ``` import reactor.core.publisher.Mono; import org.springframework.messaging.rsocket.RSocketRequester; import org.springframework.stereotype.Service; @Service public class MyService { private final RSocketRequester rsocketRequester; public MyService(RSocketRequester.Builder rsocketRequesterBuilder) { this.rsocketRequester = rsocketRequesterBuilder.tcp("example.org", 9898); } public Mono<User> someRSocketCall(String name) { return this.rsocketRequester.route("user").data(name).retrieveMono(User.class); } } ``` Kotlin ``` import org.springframework.messaging.rsocket.RSocketRequester import org.springframework.stereotype.Service import reactor.core.publisher.Mono @Service class MyService(rsocketRequesterBuilder: RSocketRequester.Builder) { private val rsocketRequester: RSocketRequester init { rsocketRequester = rsocketRequesterBuilder.tcp("example.org", 9898) } fun someRSocketCall(name: String): Mono<User> { return rsocketRequester.route("user").data(name).retrieveMono( User::class.java ) } } ``` 5. Spring Integration ---------------------- Spring Boot offers several conveniences for working with [Spring Integration](https://spring.io/projects/spring-integration), including the `spring-boot-starter-integration` “Starter”. Spring Integration provides abstractions over messaging and also other transports such as HTTP, TCP, and others. If Spring Integration is available on your classpath, it is initialized through the `@EnableIntegration` annotation. Spring Integration polling logic relies [on the auto-configured `TaskScheduler`](features#features.task-execution-and-scheduling). The default `PollerMetadata` (poll unbounded number of messages every second) can be customized with `spring.integration.poller.*` configuration properties. Spring Boot also configures some features that are triggered by the presence of additional Spring Integration modules. If `spring-integration-jmx` is also on the classpath, message processing statistics are published over JMX. 
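As a sketch of the poller customization mentioned above (the values are illustrative rather than recommendations), the default poller could be switched to a fixed delay with a bounded number of messages per poll by using the `spring.integration.poller.*` properties:

Properties ``` spring.integration.poller.fixed-delay=5s spring.integration.poller.max-messages-per-poll=10 ``` Yaml ``` spring: integration: poller: fixed-delay: "5s" max-messages-per-poll: 10 ```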
If `spring-integration-jdbc` is available, the default database schema can be created on startup, as shown in the following line: Properties ``` spring.integration.jdbc.initialize-schema=always ``` Yaml ``` spring: integration: jdbc: initialize-schema: "always" ``` If `spring-integration-rsocket` is available, developers can configure an RSocket server using `"spring.rsocket.server.*"` properties and let it use `IntegrationRSocketEndpoint` or `RSocketOutboundGateway` components to handle incoming RSocket messages. This infrastructure can handle Spring Integration RSocket channel adapters and `@MessageMapping` handlers (given `"spring.integration.rsocket.server.message-mapping-enabled"` is configured). Spring Boot can also auto-configure a `ClientRSocketConnector` using configuration properties: Properties ``` # Connecting to a RSocket server over TCP spring.integration.rsocket.client.host=example.org spring.integration.rsocket.client.port=9898 ``` Yaml ``` # Connecting to a RSocket server over TCP spring: integration: rsocket: client: host: "example.org" port: 9898 ``` Properties ``` # Connecting to a RSocket Server over WebSocket spring.integration.rsocket.client.uri=ws://example.org ``` Yaml ``` # Connecting to a RSocket Server over WebSocket spring: integration: rsocket: client: uri: "ws://example.org" ``` See the [`IntegrationAutoConfiguration`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/integration/IntegrationAutoConfiguration.java) and [`IntegrationProperties`](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/integration/IntegrationProperties.java) classes for more details. 6. WebSockets -------------- Spring Boot provides WebSockets auto-configuration for embedded Tomcat, Jetty, and Undertow. If you deploy a war file to a standalone container, Spring Boot assumes that the container is responsible for the configuration of its WebSocket support. Spring Framework provides [rich WebSocket support](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web.html#websocket) for MVC web applications that can be easily accessed through the `spring-boot-starter-websocket` module. WebSocket support is also available for [reactive web applications](https://docs.spring.io/spring-framework/docs/5.3.20/reference/html/web-reactive.html#webflux-websocket) and requires including the WebSocket API alongside `spring-boot-starter-webflux`: ``` <dependency> <groupId>javax.websocket</groupId> <artifactId>javax.websocket-api</artifactId> </dependency> ``` 7. What to Read Next --------------------- The next section describes how to enable [IO capabilities](io#io) in your application. You can read about [caching](io#io.caching), [mail](io#io.email), [validation](io#io.validation), [rest clients](io#io.rest-client) and more in this section.
spring_boot Dependency Versions Dependency Versions =================== This appendix provides details of the dependencies that are managed by Spring Boot. 1. Managed Dependency Coordinates ---------------------------------- The following table provides details of all of the dependency versions that are provided by Spring Boot in its CLI (Command Line Interface), Maven dependency management, and Gradle plugin. When you declare a dependency on one of these artifacts without declaring a version, the version listed in the table is used. | Group ID | Artifact ID | Version | | --- | --- | --- | | `antlr` | `antlr` | `2.7.7` | | `ch.qos.logback` | `logback-access` | `1.2.11` | | `ch.qos.logback` | `logback-classic` | `1.2.11` | | `ch.qos.logback` | `logback-core` | `1.2.11` | | `com.atomikos` | `transactions-jdbc` | `4.0.6` | | `com.atomikos` | `transactions-jms` | `4.0.6` | | `com.atomikos` | `transactions-jta` | `4.0.6` | | `com.couchbase.client` | `java-client` | `3.3.0` | | `com.datastax.oss` | `java-driver-core` | `4.14.1` | | `com.datastax.oss` | `java-driver-core-shaded` | `4.14.1` | | `com.datastax.oss` | `java-driver-mapper-processor` | `4.14.1` | | `com.datastax.oss` | `java-driver-mapper-runtime` | `4.14.1` | | `com.datastax.oss` | `java-driver-metrics-micrometer` | `4.14.1` | | `com.datastax.oss` | `java-driver-metrics-microprofile` | `4.14.1` | | `com.datastax.oss` | `java-driver-query-builder` | `4.14.1` | | `com.datastax.oss` | `java-driver-shaded-guava` | `25.1-jre-graal-sub-1` | | `com.datastax.oss` | `java-driver-test-infra` | `4.14.1` | | `com.datastax.oss` | `native-protocol` | `1.5.1` | | `com.fasterxml` | `classmate` | `1.5.1` | | `com.fasterxml.jackson.core` | `jackson-annotations` | `2.13.3` | | `com.fasterxml.jackson.core` | `jackson-core` | `2.13.3` | | `com.fasterxml.jackson.core` | `jackson-databind` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-avro` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-cbor` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-csv` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-ion` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-properties` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-protobuf` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-smile` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-toml` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-xml` | `2.13.3` | | `com.fasterxml.jackson.dataformat` | `jackson-dataformat-yaml` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-eclipse-collections` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-guava` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-hibernate4` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-hibernate5` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-hibernate5-jakarta` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-hppc` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-jakarta-jsonp` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-jaxrs` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-jdk8` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-joda` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-joda-money` | `2.13.3` | | 
`com.fasterxml.jackson.datatype` | `jackson-datatype-json-org` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-jsr310` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-jsr353` | `2.13.3` | | `com.fasterxml.jackson.datatype` | `jackson-datatype-pcollections` | `2.13.3` | | `com.fasterxml.jackson.jakarta.rs` | `jackson-jakarta-rs-base` | `2.13.3` | | `com.fasterxml.jackson.jakarta.rs` | `jackson-jakarta-rs-cbor-provider` | `2.13.3` | | `com.fasterxml.jackson.jakarta.rs` | `jackson-jakarta-rs-json-provider` | `2.13.3` | | `com.fasterxml.jackson.jakarta.rs` | `jackson-jakarta-rs-smile-provider` | `2.13.3` | | `com.fasterxml.jackson.jakarta.rs` | `jackson-jakarta-rs-xml-provider` | `2.13.3` | | `com.fasterxml.jackson.jakarta.rs` | `jackson-jakarta-rs-yaml-provider` | `2.13.3` | | `com.fasterxml.jackson.jaxrs` | `jackson-jaxrs-base` | `2.13.3` | | `com.fasterxml.jackson.jaxrs` | `jackson-jaxrs-cbor-provider` | `2.13.3` | | `com.fasterxml.jackson.jaxrs` | `jackson-jaxrs-json-provider` | `2.13.3` | | `com.fasterxml.jackson.jaxrs` | `jackson-jaxrs-smile-provider` | `2.13.3` | | `com.fasterxml.jackson.jaxrs` | `jackson-jaxrs-xml-provider` | `2.13.3` | | `com.fasterxml.jackson.jaxrs` | `jackson-jaxrs-yaml-provider` | `2.13.3` | | `com.fasterxml.jackson.jr` | `jackson-jr-all` | `2.13.3` | | `com.fasterxml.jackson.jr` | `jackson-jr-annotation-support` | `2.13.3` | | `com.fasterxml.jackson.jr` | `jackson-jr-objects` | `2.13.3` | | `com.fasterxml.jackson.jr` | `jackson-jr-retrofit2` | `2.13.3` | | `com.fasterxml.jackson.jr` | `jackson-jr-stree` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-afterburner` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-blackbird` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-guice` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-jakarta-xmlbind-annotations` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-jaxb-annotations` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-jsonSchema` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-kotlin` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-mrbean` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-no-ctor-deser` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-osgi` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-parameter-names` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-paranamer` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-scala_2.11` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-scala_2.12` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-scala_2.13` | `2.13.3` | | `com.fasterxml.jackson.module` | `jackson-module-scala_3` | `2.13.3` | | `com.github.ben-manes.caffeine` | `caffeine` | `2.9.3` | | `com.github.ben-manes.caffeine` | `guava` | `2.9.3` | | `com.github.ben-manes.caffeine` | `jcache` | `2.9.3` | | `com.github.ben-manes.caffeine` | `simulator` | `2.9.3` | | `com.github.mxab.thymeleaf.extras` | `thymeleaf-extras-data-attribute` | `2.0.1` | | `com.google.appengine` | `appengine-api-1.0-sdk` | `1.9.96` | | `com.google.code.gson` | `gson` | `2.9.0` | | `com.graphql-java` | `graphql-java` | `18.1` | | `com.h2database` | `h2` | `2.1.212` | | `com.hazelcast` | `hazelcast` | `5.1.1` | | `com.hazelcast` | `hazelcast-hibernate52` | `2.2.1` | | `com.hazelcast` | `hazelcast-hibernate53` | `2.2.1` | | `com.hazelcast` | `hazelcast-spring` | `5.1.1` | 
| `com.ibm.db2` | `jcc` | `11.5.7.0` | | `com.jayway.jsonpath` | `json-path` | `2.7.0` | | `com.jayway.jsonpath` | `json-path-assert` | `2.7.0` | | `com.microsoft.sqlserver` | `mssql-jdbc` | `10.2.1.jre8` | | `com.oracle.database.ha` | `ons` | `21.5.0.0` | | `com.oracle.database.ha` | `simplefan` | `21.5.0.0` | | `com.oracle.database.jdbc` | `ojdbc11` | `21.5.0.0` | | `com.oracle.database.jdbc` | `ojdbc11-production` | `21.5.0.0` | | `com.oracle.database.jdbc` | `ojdbc8` | `21.5.0.0` | | `com.oracle.database.jdbc` | `ojdbc8-production` | `21.5.0.0` | | `com.oracle.database.jdbc` | `rsi` | `21.5.0.0` | | `com.oracle.database.jdbc` | `ucp` | `21.5.0.0` | | `com.oracle.database.jdbc` | `ucp11` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc11-debug` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc11-observability-debug` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc11_g` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc11dms_g` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc8-debug` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc8-observability-debug` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc8_g` | `21.5.0.0` | | `com.oracle.database.jdbc.debug` | `ojdbc8dms_g` | `21.5.0.0` | | `com.oracle.database.nls` | `orai18n` | `21.5.0.0` | | `com.oracle.database.observability` | `dms` | `21.5.0.0` | | `com.oracle.database.observability` | `ojdbc11-observability` | `21.5.0.0` | | `com.oracle.database.observability` | `ojdbc11dms` | `21.5.0.0` | | `com.oracle.database.observability` | `ojdbc8-observability` | `21.5.0.0` | | `com.oracle.database.observability` | `ojdbc8dms` | `21.5.0.0` | | `com.oracle.database.r2dbc` | `oracle-r2dbc` | `0.4.0` | | `com.oracle.database.security` | `oraclepki` | `21.5.0.0` | | `com.oracle.database.security` | `osdt_cert` | `21.5.0.0` | | `com.oracle.database.security` | `osdt_core` | `21.5.0.0` | | `com.oracle.database.xml` | `xdb` | `21.5.0.0` | | `com.oracle.database.xml` | `xmlparserv2` | `21.5.0.0` | | `com.querydsl` | `querydsl-apt` | `5.0.0` | | `com.querydsl` | `querydsl-codegen` | `5.0.0` | | `com.querydsl` | `querydsl-codegen-utils` | `5.0.0` | | `com.querydsl` | `querydsl-collections` | `5.0.0` | | `com.querydsl` | `querydsl-core` | `5.0.0` | | `com.querydsl` | `querydsl-guava` | `5.0.0` | | `com.querydsl` | `querydsl-hibernate-search` | `5.0.0` | | `com.querydsl` | `querydsl-jdo` | `5.0.0` | | `com.querydsl` | `querydsl-jpa` | `5.0.0` | | `com.querydsl` | `querydsl-jpa-codegen` | `5.0.0` | | `com.querydsl` | `querydsl-kotlin` | `5.0.0` | | `com.querydsl` | `querydsl-kotlin-codegen` | `5.0.0` | | `com.querydsl` | `querydsl-lucene3` | `5.0.0` | | `com.querydsl` | `querydsl-lucene4` | `5.0.0` | | `com.querydsl` | `querydsl-lucene5` | `5.0.0` | | `com.querydsl` | `querydsl-mongodb` | `5.0.0` | | `com.querydsl` | `querydsl-scala` | `5.0.0` | | `com.querydsl` | `querydsl-spatial` | `5.0.0` | | `com.querydsl` | `querydsl-sql` | `5.0.0` | | `com.querydsl` | `querydsl-sql-codegen` | `5.0.0` | | `com.querydsl` | `querydsl-sql-spatial` | `5.0.0` | | `com.querydsl` | `querydsl-sql-spring` | `5.0.0` | | `com.rabbitmq` | `amqp-client` | `5.14.2` | | `com.rabbitmq` | `stream-client` | `0.5.0` | | `com.samskivert` | `jmustache` | `1.15` | | `com.sendgrid` | `sendgrid-java` | `4.9.2` | | `com.squareup.okhttp3` | `logging-interceptor` | `4.9.3` | | `com.squareup.okhttp3` | `mockwebserver` | `4.9.3` | | `com.squareup.okhttp3` | `okcurl` | `4.9.3` | | `com.squareup.okhttp3` | `okhttp` | `4.9.3` | | 
`com.squareup.okhttp3` | `okhttp-brotli` | `4.9.3` | | `com.squareup.okhttp3` | `okhttp-dnsoverhttps` | `4.9.3` | | `com.squareup.okhttp3` | `okhttp-sse` | `4.9.3` | | `com.squareup.okhttp3` | `okhttp-tls` | `4.9.3` | | `com.squareup.okhttp3` | `okhttp-urlconnection` | `4.9.3` | | `com.sun.activation` | `jakarta.activation` | `1.2.2` | | `com.sun.mail` | `jakarta.mail` | `1.6.7` | | `com.sun.xml.messaging.saaj` | `saaj-impl` | `1.5.3` | | `com.unboundid` | `unboundid-ldapsdk` | `6.0.5` | | `com.zaxxer` | `HikariCP` | `4.0.3` | | `commons-codec` | `commons-codec` | `1.15` | | `commons-pool` | `commons-pool` | `1.6` | | `de.flapdoodle.embed` | `de.flapdoodle.embed.mongo` | `3.4.5` | | `io.dropwizard.metrics` | `metrics-annotation` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-caffeine` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-caffeine3` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-collectd` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-core` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-ehcache` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-graphite` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-healthchecks` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-httpasyncclient` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-httpclient` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-httpclient5` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jakarta-servlet` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jakarta-servlets` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jcache` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jdbi` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jdbi3` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jersey2` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jersey3` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jetty10` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jetty11` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jetty9` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jmx` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-json` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-jvm` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-log4j2` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-logback` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-servlet` | `4.2.9` | | `io.dropwizard.metrics` | `metrics-servlets` | `4.2.9` | | `io.lettuce` | `lettuce-core` | `6.1.8.RELEASE` | | `io.micrometer` | `micrometer-core` | `1.9.0` | | `io.micrometer` | `micrometer-registry-appoptics` | `1.9.0` | | `io.micrometer` | `micrometer-registry-atlas` | `1.9.0` | | `io.micrometer` | `micrometer-registry-azure-monitor` | `1.9.0` | | `io.micrometer` | `micrometer-registry-cloudwatch` | `1.9.0` | | `io.micrometer` | `micrometer-registry-cloudwatch2` | `1.9.0` | | `io.micrometer` | `micrometer-registry-datadog` | `1.9.0` | | `io.micrometer` | `micrometer-registry-dynatrace` | `1.9.0` | | `io.micrometer` | `micrometer-registry-elastic` | `1.9.0` | | `io.micrometer` | `micrometer-registry-ganglia` | `1.9.0` | | `io.micrometer` | `micrometer-registry-graphite` | `1.9.0` | | `io.micrometer` | `micrometer-registry-health` | `1.9.0` | | `io.micrometer` | `micrometer-registry-humio` | `1.9.0` | | `io.micrometer` | `micrometer-registry-influx` | `1.9.0` | | `io.micrometer` | `micrometer-registry-jmx` | `1.9.0` | | `io.micrometer` | `micrometer-registry-kairos` | `1.9.0` | | `io.micrometer` | `micrometer-registry-new-relic` | `1.9.0` | | `io.micrometer` | `micrometer-registry-opentsdb` | `1.9.0` | | `io.micrometer` | `micrometer-registry-otlp` | 
`1.9.0` | | `io.micrometer` | `micrometer-registry-prometheus` | `1.9.0` | | `io.micrometer` | `micrometer-registry-signalfx` | `1.9.0` | | `io.micrometer` | `micrometer-registry-stackdriver` | `1.9.0` | | `io.micrometer` | `micrometer-registry-statsd` | `1.9.0` | | `io.micrometer` | `micrometer-registry-wavefront` | `1.9.0` | | `io.micrometer` | `micrometer-test` | `1.9.0` | | `io.netty` | `netty-all` | `4.1.77.Final` | | `io.netty` | `netty-buffer` | `4.1.77.Final` | | `io.netty` | `netty-codec` | `4.1.77.Final` | | `io.netty` | `netty-codec-dns` | `4.1.77.Final` | | `io.netty` | `netty-codec-haproxy` | `4.1.77.Final` | | `io.netty` | `netty-codec-http` | `4.1.77.Final` | | `io.netty` | `netty-codec-http2` | `4.1.77.Final` | | `io.netty` | `netty-codec-memcache` | `4.1.77.Final` | | `io.netty` | `netty-codec-mqtt` | `4.1.77.Final` | | `io.netty` | `netty-codec-redis` | `4.1.77.Final` | | `io.netty` | `netty-codec-smtp` | `4.1.77.Final` | | `io.netty` | `netty-codec-socks` | `4.1.77.Final` | | `io.netty` | `netty-codec-stomp` | `4.1.77.Final` | | `io.netty` | `netty-codec-xml` | `4.1.77.Final` | | `io.netty` | `netty-common` | `4.1.77.Final` | | `io.netty` | `netty-dev-tools` | `4.1.77.Final` | | `io.netty` | `netty-example` | `4.1.77.Final` | | `io.netty` | `netty-handler` | `4.1.77.Final` | | `io.netty` | `netty-handler-proxy` | `4.1.77.Final` | | `io.netty` | `netty-resolver` | `4.1.77.Final` | | `io.netty` | `netty-resolver-dns` | `4.1.77.Final` | | `io.netty` | `netty-resolver-dns-classes-macos` | `4.1.77.Final` | | `io.netty` | `netty-resolver-dns-native-macos` | `4.1.77.Final` | | `io.netty` | `netty-tcnative` | `2.0.52.Final` | | `io.netty` | `netty-tcnative-boringssl-static` | `2.0.52.Final` | | `io.netty` | `netty-tcnative-classes` | `2.0.52.Final` | | `io.netty` | `netty-transport` | `4.1.77.Final` | | `io.netty` | `netty-transport-classes-epoll` | `4.1.77.Final` | | `io.netty` | `netty-transport-classes-kqueue` | `4.1.77.Final` | | `io.netty` | `netty-transport-native-epoll` | `4.1.77.Final` | | `io.netty` | `netty-transport-native-kqueue` | `4.1.77.Final` | | `io.netty` | `netty-transport-native-unix-common` | `4.1.77.Final` | | `io.netty` | `netty-transport-rxtx` | `4.1.77.Final` | | `io.netty` | `netty-transport-sctp` | `4.1.77.Final` | | `io.netty` | `netty-transport-udt` | `4.1.77.Final` | | `io.projectreactor` | `reactor-core` | `3.4.18` | | `io.projectreactor` | `reactor-test` | `3.4.18` | | `io.projectreactor` | `reactor-tools` | `3.4.18` | | `io.projectreactor.addons` | `reactor-adapter` | `3.4.8` | | `io.projectreactor.addons` | `reactor-extra` | `3.4.8` | | `io.projectreactor.addons` | `reactor-pool` | `0.2.8` | | `io.projectreactor.kafka` | `reactor-kafka` | `1.3.11` | | `io.projectreactor.kotlin` | `reactor-kotlin-extensions` | `1.1.6` | | `io.projectreactor.netty` | `reactor-netty` | `1.0.19` | | `io.projectreactor.netty` | `reactor-netty-core` | `1.0.19` | | `io.projectreactor.netty` | `reactor-netty-http` | `1.0.19` | | `io.projectreactor.netty` | `reactor-netty-http-brave` | `1.0.19` | | `io.projectreactor.rabbitmq` | `reactor-rabbitmq` | `1.5.4` | | `io.prometheus` | `simpleclient` | `0.15.0` | | `io.prometheus` | `simpleclient_caffeine` | `0.15.0` | | `io.prometheus` | `simpleclient_common` | `0.15.0` | | `io.prometheus` | `simpleclient_dropwizard` | `0.15.0` | | `io.prometheus` | `simpleclient_graphite_bridge` | `0.15.0` | | `io.prometheus` | `simpleclient_guava` | `0.15.0` | | `io.prometheus` | `simpleclient_hibernate` | `0.15.0` | | `io.prometheus` | 
`simpleclient_hotspot` | `0.15.0` | | `io.prometheus` | `simpleclient_httpserver` | `0.15.0` | | `io.prometheus` | `simpleclient_jetty` | `0.15.0` | | `io.prometheus` | `simpleclient_jetty_jdk8` | `0.15.0` | | `io.prometheus` | `simpleclient_log4j` | `0.15.0` | | `io.prometheus` | `simpleclient_log4j2` | `0.15.0` | | `io.prometheus` | `simpleclient_logback` | `0.15.0` | | `io.prometheus` | `simpleclient_pushgateway` | `0.15.0` | | `io.prometheus` | `simpleclient_servlet` | `0.15.0` | | `io.prometheus` | `simpleclient_servlet_jakarta` | `0.15.0` | | `io.prometheus` | `simpleclient_spring_boot` | `0.15.0` | | `io.prometheus` | `simpleclient_spring_web` | `0.15.0` | | `io.prometheus` | `simpleclient_tracer_otel` | `0.15.0` | | `io.prometheus` | `simpleclient_tracer_otel_agent` | `0.15.0` | | `io.prometheus` | `simpleclient_vertx` | `0.15.0` | | `io.r2dbc` | `r2dbc-h2` | `0.9.1.RELEASE` | | `io.r2dbc` | `r2dbc-mssql` | `0.9.0.RELEASE` | | `io.r2dbc` | `r2dbc-pool` | `0.9.0.RELEASE` | | `io.r2dbc` | `r2dbc-proxy` | `0.9.0.RELEASE` | | `io.r2dbc` | `r2dbc-spi` | `0.9.1.RELEASE` | | `io.reactivex` | `rxjava` | `1.3.8` | | `io.reactivex` | `rxjava-reactive-streams` | `1.2.1` | | `io.reactivex.rxjava2` | `rxjava` | `2.2.21` | | `io.rest-assured` | `json-path` | `4.5.1` | | `io.rest-assured` | `json-schema-validator` | `4.5.1` | | `io.rest-assured` | `rest-assured` | `4.5.1` | | `io.rest-assured` | `scala-support` | `4.5.1` | | `io.rest-assured` | `spring-mock-mvc` | `4.5.1` | | `io.rest-assured` | `spring-web-test-client` | `4.5.1` | | `io.rest-assured` | `xml-path` | `4.5.1` | | `io.rsocket` | `rsocket-core` | `1.1.2` | | `io.rsocket` | `rsocket-load-balancer` | `1.1.2` | | `io.rsocket` | `rsocket-micrometer` | `1.1.2` | | `io.rsocket` | `rsocket-test` | `1.1.2` | | `io.rsocket` | `rsocket-transport-local` | `1.1.2` | | `io.rsocket` | `rsocket-transport-netty` | `1.1.2` | | `io.spring.gradle` | `dependency-management-plugin` | `1.0.11.RELEASE` | | `io.undertow` | `undertow-core` | `2.2.17.Final` | | `io.undertow` | `undertow-servlet` | `2.2.17.Final` | | `io.undertow` | `undertow-websockets-jsr` | `2.2.17.Final` | | `jakarta.activation` | `jakarta.activation-api` | `1.2.2` | | `jakarta.annotation` | `jakarta.annotation-api` | `1.3.5` | | `jakarta.jms` | `jakarta.jms-api` | `2.0.3` | | `jakarta.json` | `jakarta.json-api` | `1.1.6` | | `jakarta.json.bind` | `jakarta.json.bind-api` | `1.0.2` | | `jakarta.mail` | `jakarta.mail-api` | `1.6.7` | | `jakarta.management.j2ee` | `jakarta.management.j2ee-api` | `1.1.4` | | `jakarta.persistence` | `jakarta.persistence-api` | `2.2.3` | | `jakarta.servlet` | `jakarta.servlet-api` | `4.0.4` | | `jakarta.servlet.jsp.jstl` | `jakarta.servlet.jsp.jstl-api` | `1.2.7` | | `jakarta.transaction` | `jakarta.transaction-api` | `1.3.3` | | `jakarta.validation` | `jakarta.validation-api` | `2.0.2` | | `jakarta.websocket` | `jakarta.websocket-api` | `1.1.2` | | `jakarta.ws.rs` | `jakarta.ws.rs-api` | `2.1.6` | | `jakarta.xml.bind` | `jakarta.xml.bind-api` | `2.3.3` | | `jakarta.xml.soap` | `jakarta.xml.soap-api` | `1.4.2` | | `jakarta.xml.ws` | `jakarta.xml.ws-api` | `2.3.3` | | `javax.activation` | `javax.activation-api` | `1.2.0` | | `javax.annotation` | `javax.annotation-api` | `1.3.2` | | `javax.cache` | `cache-api` | `1.1.1` | | `javax.jms` | `javax.jms-api` | `2.0.1` | | `javax.json` | `javax.json-api` | `1.1.4` | | `javax.json.bind` | `javax.json.bind-api` | `1.0` | | `javax.mail` | `javax.mail-api` | `1.6.2` | | `javax.money` | `money-api` | `1.1` | | 
`javax.persistence` | `javax.persistence-api` | `2.2` | | `javax.servlet` | `javax.servlet-api` | `4.0.1` | | `javax.servlet` | `jstl` | `1.2` | | `javax.transaction` | `javax.transaction-api` | `1.3` | | `javax.validation` | `validation-api` | `2.0.1.Final` | | `javax.websocket` | `javax.websocket-api` | `1.1` | | `javax.xml.bind` | `jaxb-api` | `2.3.1` | | `javax.xml.ws` | `jaxws-api` | `2.3.1` | | `jaxen` | `jaxen` | `1.2.0` | | `junit` | `junit` | `4.13.2` | | `mysql` | `mysql-connector-java` | `8.0.29` | | `net.bytebuddy` | `byte-buddy` | `1.12.10` | | `net.bytebuddy` | `byte-buddy-agent` | `1.12.10` | | `net.minidev` | `json-smart` | `2.4.8` | | `net.sf.ehcache` | `ehcache` | `2.10.9.2` | | `net.sourceforge.htmlunit` | `htmlunit` | `2.60.0` | | `net.sourceforge.jtds` | `jtds` | `1.3.1` | | `net.sourceforge.nekohtml` | `nekohtml` | `1.9.22` | | `nz.net.ultraq.thymeleaf` | `thymeleaf-layout-dialect` | `3.0.0` | | `org.apache.activemq` | `activemq-amqp` | `5.16.5` | | `org.apache.activemq` | `activemq-blueprint` | `5.16.5` | | `org.apache.activemq` | `activemq-broker` | `5.16.5` | | `org.apache.activemq` | `activemq-camel` | `5.16.5` | | `org.apache.activemq` | `activemq-client` | `5.16.5` | | `org.apache.activemq` | `activemq-console` | `5.16.5` | | `org.apache.activemq` | `activemq-http` | `5.16.5` | | `org.apache.activemq` | `activemq-jaas` | `5.16.5` | | `org.apache.activemq` | `activemq-jdbc-store` | `5.16.5` | | `org.apache.activemq` | `activemq-jms-pool` | `5.16.5` | | `org.apache.activemq` | `activemq-kahadb-store` | `5.16.5` | | `org.apache.activemq` | `activemq-karaf` | `5.16.5` | | `org.apache.activemq` | `activemq-leveldb-store` | `5.16.5` | | `org.apache.activemq` | `activemq-log4j-appender` | `5.16.5` | | `org.apache.activemq` | `activemq-mqtt` | `5.16.5` | | `org.apache.activemq` | `activemq-openwire-generator` | `5.16.5` | | `org.apache.activemq` | `activemq-openwire-legacy` | `5.16.5` | | `org.apache.activemq` | `activemq-osgi` | `5.16.5` | | `org.apache.activemq` | `activemq-partition` | `5.16.5` | | `org.apache.activemq` | `activemq-pool` | `5.16.5` | | `org.apache.activemq` | `activemq-ra` | `5.16.5` | | `org.apache.activemq` | `activemq-run` | `5.16.5` | | `org.apache.activemq` | `activemq-runtime-config` | `5.16.5` | | `org.apache.activemq` | `activemq-shiro` | `5.16.5` | | `org.apache.activemq` | `activemq-spring` | `5.16.5` | | `org.apache.activemq` | `activemq-stomp` | `5.16.5` | | `org.apache.activemq` | `activemq-web` | `5.16.5` | | `org.apache.activemq` | `artemis-amqp-protocol` | `2.19.1` | | `org.apache.activemq` | `artemis-commons` | `2.19.1` | | `org.apache.activemq` | `artemis-core-client` | `2.19.1` | | `org.apache.activemq` | `artemis-jdbc-store` | `2.19.1` | | `org.apache.activemq` | `artemis-jms-client` | `2.19.1` | | `org.apache.activemq` | `artemis-jms-server` | `2.19.1` | | `org.apache.activemq` | `artemis-journal` | `2.19.1` | | `org.apache.activemq` | `artemis-quorum-api` | `2.19.1` | | `org.apache.activemq` | `artemis-selector` | `2.19.1` | | `org.apache.activemq` | `artemis-server` | `2.19.1` | | `org.apache.activemq` | `artemis-service-extensions` | `2.19.1` | | `org.apache.commons` | `commons-dbcp2` | `2.9.0` | | `org.apache.commons` | `commons-lang3` | `3.12.0` | | `org.apache.commons` | `commons-pool2` | `2.11.1` | | `org.apache.derby` | `derby` | `10.14.2.0` | | `org.apache.derby` | `derbyclient` | `10.14.2.0` | | `org.apache.httpcomponents` | `fluent-hc` | `4.5.13` | | `org.apache.httpcomponents` | `httpasyncclient` | `4.1.5` | | 
`org.apache.httpcomponents` | `httpclient` | `4.5.13` | | `org.apache.httpcomponents` | `httpclient-cache` | `4.5.13` | | `org.apache.httpcomponents` | `httpclient-osgi` | `4.5.13` | | `org.apache.httpcomponents` | `httpclient-win` | `4.5.13` | | `org.apache.httpcomponents` | `httpcore` | `4.4.15` | | `org.apache.httpcomponents` | `httpcore-nio` | `4.4.15` | | `org.apache.httpcomponents` | `httpmime` | `4.5.13` | | `org.apache.httpcomponents.client5` | `httpclient5` | `5.1.3` | | `org.apache.httpcomponents.client5` | `httpclient5-cache` | `5.1.3` | | `org.apache.httpcomponents.client5` | `httpclient5-fluent` | `5.1.3` | | `org.apache.httpcomponents.client5` | `httpclient5-win` | `5.1.3` | | `org.apache.httpcomponents.core5` | `httpcore5` | `5.1.3` | | `org.apache.httpcomponents.core5` | `httpcore5-h2` | `5.1.3` | | `org.apache.httpcomponents.core5` | `httpcore5-reactive` | `5.1.3` | | `org.apache.johnzon` | `johnzon-core` | `1.2.18` | | `org.apache.johnzon` | `johnzon-jaxrs` | `1.2.18` | | `org.apache.johnzon` | `johnzon-jsonb` | `1.2.18` | | `org.apache.johnzon` | `johnzon-jsonb-extras` | `1.2.18` | | `org.apache.johnzon` | `johnzon-jsonschema` | `1.2.18` | | `org.apache.johnzon` | `johnzon-mapper` | `1.2.18` | | `org.apache.johnzon` | `johnzon-websocket` | `1.2.18` | | `org.apache.kafka` | `connect` | `3.1.1` | | `org.apache.kafka` | `connect-api` | `3.1.1` | | `org.apache.kafka` | `connect-basic-auth-extension` | `3.1.1` | | `org.apache.kafka` | `connect-file` | `3.1.1` | | `org.apache.kafka` | `connect-json` | `3.1.1` | | `org.apache.kafka` | `connect-mirror` | `3.1.1` | | `org.apache.kafka` | `connect-mirror-client` | `3.1.1` | | `org.apache.kafka` | `connect-runtime` | `3.1.1` | | `org.apache.kafka` | `connect-transforms` | `3.1.1` | | `org.apache.kafka` | `generator` | `3.1.1` | | `org.apache.kafka` | `kafka-clients` | `3.1.1` | | `org.apache.kafka` | `kafka-log4j-appender` | `3.1.1` | | `org.apache.kafka` | `kafka-metadata` | `3.1.1` | | `org.apache.kafka` | `kafka-raft` | `3.1.1` | | `org.apache.kafka` | `kafka-server-common` | `3.1.1` | | `org.apache.kafka` | `kafka-shell` | `3.1.1` | | `org.apache.kafka` | `kafka-storage` | `3.1.1` | | `org.apache.kafka` | `kafka-storage-api` | `3.1.1` | | `org.apache.kafka` | `kafka-streams` | `3.1.1` | | `org.apache.kafka` | `kafka-streams-scala_2.12` | `3.1.1` | | `org.apache.kafka` | `kafka-streams-scala_2.13` | `3.1.1` | | `org.apache.kafka` | `kafka-streams-test-utils` | `3.1.1` | | `org.apache.kafka` | `kafka-tools` | `3.1.1` | | `org.apache.kafka` | `kafka_2.12` | `3.1.1` | | `org.apache.kafka` | `kafka_2.13` | `3.1.1` | | `org.apache.kafka` | `trogdor` | `3.1.1` | | `org.apache.logging.log4j` | `log4j-1.2-api` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-api` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-appserver` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-cassandra` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-core` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-couchdb` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-docker` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-flume-ng` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-iostreams` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-jcl` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-jmx-gui` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-jpa` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-jpl` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-jul` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-kubernetes` | `2.17.2` | | 
`org.apache.logging.log4j` | `log4j-layout-template-json` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-liquibase` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-mongodb3` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-mongodb4` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-slf4j-impl` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-slf4j18-impl` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-spring-boot` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-spring-cloud-config-client` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-taglib` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-to-slf4j` | `2.17.2` | | `org.apache.logging.log4j` | `log4j-web` | `2.17.2` | | `org.apache.solr` | `solr-analysis-extras` | `8.11.1` | | `org.apache.solr` | `solr-analytics` | `8.11.1` | | `org.apache.solr` | `solr-cell` | `8.11.1` | | `org.apache.solr` | `solr-core` | `8.11.1` | | `org.apache.solr` | `solr-dataimporthandler` | `8.11.1` | | `org.apache.solr` | `solr-dataimporthandler-extras` | `8.11.1` | | `org.apache.solr` | `solr-gcs-repository` | `8.11.1` | | `org.apache.solr` | `solr-jaegertracer-configurator` | `8.11.1` | | `org.apache.solr` | `solr-langid` | `8.11.1` | | `org.apache.solr` | `solr-ltr` | `8.11.1` | | `org.apache.solr` | `solr-prometheus-exporter` | `8.11.1` | | `org.apache.solr` | `solr-s3-repository` | `8.11.1` | | `org.apache.solr` | `solr-solrj` | `8.11.1` | | `org.apache.solr` | `solr-test-framework` | `8.11.1` | | `org.apache.solr` | `solr-velocity` | `8.11.1` | | `org.apache.tomcat` | `tomcat-annotations-api` | `9.0.63` | | `org.apache.tomcat` | `tomcat-jdbc` | `9.0.63` | | `org.apache.tomcat` | `tomcat-jsp-api` | `9.0.63` | | `org.apache.tomcat.embed` | `tomcat-embed-core` | `9.0.63` | | `org.apache.tomcat.embed` | `tomcat-embed-el` | `9.0.63` | | `org.apache.tomcat.embed` | `tomcat-embed-jasper` | `9.0.63` | | `org.apache.tomcat.embed` | `tomcat-embed-websocket` | `9.0.63` | | `org.aspectj` | `aspectjrt` | `1.9.7` | | `org.aspectj` | `aspectjtools` | `1.9.7` | | `org.aspectj` | `aspectjweaver` | `1.9.7` | | `org.assertj` | `assertj-core` | `3.22.0` | | `org.awaitility` | `awaitility` | `4.2.0` | | `org.awaitility` | `awaitility-groovy` | `4.2.0` | | `org.awaitility` | `awaitility-kotlin` | `4.2.0` | | `org.awaitility` | `awaitility-scala` | `4.2.0` | | `org.cache2k` | `cache2k-api` | `2.6.1.Final` | | `org.cache2k` | `cache2k-config` | `2.6.1.Final` | | `org.cache2k` | `cache2k-core` | `2.6.1.Final` | | `org.cache2k` | `cache2k-jcache` | `2.6.1.Final` | | `org.cache2k` | `cache2k-micrometer` | `2.6.1.Final` | | `org.cache2k` | `cache2k-spring` | `2.6.1.Final` | | `org.codehaus.groovy` | `groovy` | `3.0.10` | | `org.codehaus.groovy` | `groovy-ant` | `3.0.10` | | `org.codehaus.groovy` | `groovy-astbuilder` | `3.0.10` | | `org.codehaus.groovy` | `groovy-bsf` | `3.0.10` | | `org.codehaus.groovy` | `groovy-cli-commons` | `3.0.10` | | `org.codehaus.groovy` | `groovy-cli-picocli` | `3.0.10` | | `org.codehaus.groovy` | `groovy-console` | `3.0.10` | | `org.codehaus.groovy` | `groovy-datetime` | `3.0.10` | | `org.codehaus.groovy` | `groovy-dateutil` | `3.0.10` | | `org.codehaus.groovy` | `groovy-docgenerator` | `3.0.10` | | `org.codehaus.groovy` | `groovy-groovydoc` | `3.0.10` | | `org.codehaus.groovy` | `groovy-groovysh` | `3.0.10` | | `org.codehaus.groovy` | `groovy-jaxb` | `3.0.10` | | `org.codehaus.groovy` | `groovy-jmx` | `3.0.10` | | `org.codehaus.groovy` | `groovy-json` | `3.0.10` | | `org.codehaus.groovy` | `groovy-jsr223` | `3.0.10` | | 
`org.codehaus.groovy` | `groovy-macro` | `3.0.10` | | `org.codehaus.groovy` | `groovy-nio` | `3.0.10` | | `org.codehaus.groovy` | `groovy-servlet` | `3.0.10` | | `org.codehaus.groovy` | `groovy-sql` | `3.0.10` | | `org.codehaus.groovy` | `groovy-swing` | `3.0.10` | | `org.codehaus.groovy` | `groovy-templates` | `3.0.10` | | `org.codehaus.groovy` | `groovy-test` | `3.0.10` | | `org.codehaus.groovy` | `groovy-test-junit5` | `3.0.10` | | `org.codehaus.groovy` | `groovy-testng` | `3.0.10` | | `org.codehaus.groovy` | `groovy-xml` | `3.0.10` | | `org.codehaus.groovy` | `groovy-yaml` | `3.0.10` | | `org.codehaus.janino` | `commons-compiler` | `3.1.7` | | `org.codehaus.janino` | `commons-compiler-jdk` | `3.1.7` | | `org.codehaus.janino` | `janino` | `3.1.7` | | `org.eclipse.jetty` | `apache-jsp` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `apache-jstl` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `infinispan-common` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `infinispan-embedded-query` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `infinispan-remote-query` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-client` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-conscrypt-client` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-conscrypt-server` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-java-client` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-java-server` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-openjdk8-client` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-openjdk8-server` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-alpn-server` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-annotations` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-ant` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-client` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-continuation` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-deploy` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-distribution` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-hazelcast` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-home` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-http` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-http-spi` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-io` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-jaas` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-jaspi` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-jmx` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-jndi` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-nosql` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-openid` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-plus` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-proxy` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-quickstart` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-reactive-httpclient` | `1.1.11` | | `org.eclipse.jetty` | `jetty-rewrite` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-security` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-server` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-servlet` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-servlets` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-spring` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-unixsocket` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-util` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-util-ajax` | 
`9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-webapp` | `9.4.46.v20220331` | | `org.eclipse.jetty` | `jetty-xml` | `9.4.46.v20220331` | | `org.eclipse.jetty.fcgi` | `fcgi-client` | `9.4.46.v20220331` | | `org.eclipse.jetty.fcgi` | `fcgi-server` | `9.4.46.v20220331` | | `org.eclipse.jetty.gcloud` | `jetty-gcloud-session-manager` | `9.4.46.v20220331` | | `org.eclipse.jetty.http2` | `http2-client` | `9.4.46.v20220331` | | `org.eclipse.jetty.http2` | `http2-common` | `9.4.46.v20220331` | | `org.eclipse.jetty.http2` | `http2-hpack` | `9.4.46.v20220331` | | `org.eclipse.jetty.http2` | `http2-http-client-transport` | `9.4.46.v20220331` | | `org.eclipse.jetty.http2` | `http2-server` | `9.4.46.v20220331` | | `org.eclipse.jetty.memcached` | `jetty-memcached-sessions` | `9.4.46.v20220331` | | `org.eclipse.jetty.orbit` | `javax.servlet.jsp` | `2.2.0.v201112011158` | | `org.eclipse.jetty.osgi` | `jetty-httpservice` | `9.4.46.v20220331` | | `org.eclipse.jetty.osgi` | `jetty-osgi-boot` | `9.4.46.v20220331` | | `org.eclipse.jetty.osgi` | `jetty-osgi-boot-jsp` | `9.4.46.v20220331` | | `org.eclipse.jetty.osgi` | `jetty-osgi-boot-warurl` | `9.4.46.v20220331` | | `org.eclipse.jetty.websocket` | `javax-websocket-client-impl` | `9.4.46.v20220331` | | `org.eclipse.jetty.websocket` | `javax-websocket-server-impl` | `9.4.46.v20220331` | | `org.eclipse.jetty.websocket` | `websocket-api` | `9.4.46.v20220331` | | `org.eclipse.jetty.websocket` | `websocket-client` | `9.4.46.v20220331` | | `org.eclipse.jetty.websocket` | `websocket-common` | `9.4.46.v20220331` | | `org.eclipse.jetty.websocket` | `websocket-server` | `9.4.46.v20220331` | | `org.eclipse.jetty.websocket` | `websocket-servlet` | `9.4.46.v20220331` | | `org.ehcache` | `ehcache` | `3.10.0` | | `org.ehcache` | `ehcache-clustered` | `3.10.0` | | `org.ehcache` | `ehcache-transactions` | `3.10.0` | | `org.elasticsearch` | `elasticsearch` | `7.17.3` | | `org.elasticsearch.client` | `elasticsearch-rest-client` | `7.17.3` | | `org.elasticsearch.client` | `elasticsearch-rest-client-sniffer` | `7.17.3` | | `org.elasticsearch.client` | `elasticsearch-rest-high-level-client` | `7.17.3` | | `org.elasticsearch.client` | `transport` | `7.17.3` | | `org.elasticsearch.distribution.integ-test-zip` | `elasticsearch` | `7.17.3` | | `org.elasticsearch.plugin` | `transport-netty4-client` | `7.17.3` | | `org.firebirdsql.jdbc` | `jaybird` | `4.0.6.java8` | | `org.firebirdsql.jdbc` | `jaybird-jdk18` | `4.0.6.java8` | | `org.flywaydb` | `flyway-core` | `8.5.11` | | `org.flywaydb` | `flyway-firebird` | `8.5.11` | | `org.flywaydb` | `flyway-mysql` | `8.5.11` | | `org.flywaydb` | `flyway-sqlserver` | `8.5.11` | | `org.freemarker` | `freemarker` | `2.3.31` | | `org.glassfish` | `jakarta.el` | `3.0.4` | | `org.glassfish.jaxb` | `codemodel` | `2.3.6` | | `org.glassfish.jaxb` | `codemodel-annotation-compiler` | `2.3.6` | | `org.glassfish.jaxb` | `jaxb-jxc` | `2.3.6` | | `org.glassfish.jaxb` | `jaxb-runtime` | `2.3.6` | | `org.glassfish.jaxb` | `jaxb-xjc` | `2.3.6` | | `org.glassfish.jaxb` | `txw2` | `2.3.6` | | `org.glassfish.jaxb` | `txwc2` | `2.3.6` | | `org.glassfish.jaxb` | `xsom` | `2.3.6` | | `org.glassfish.jersey.bundles` | `jaxrs-ri` | `2.35` | | `org.glassfish.jersey.connectors` | `jersey-apache-connector` | `2.35` | | `org.glassfish.jersey.connectors` | `jersey-grizzly-connector` | `2.35` | | `org.glassfish.jersey.connectors` | `jersey-helidon-connector` | `2.35` | | `org.glassfish.jersey.connectors` | `jersey-jdk-connector` | `2.35` | | `org.glassfish.jersey.connectors` | 
`jersey-jetty-connector` | `2.35` | | `org.glassfish.jersey.connectors` | `jersey-netty-connector` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-grizzly2-http` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-grizzly2-servlet` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-jdk-http` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-jetty-http` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-jetty-servlet` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-netty-http` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-servlet` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-servlet-core` | `2.35` | | `org.glassfish.jersey.containers` | `jersey-container-simple-http` | `2.35` | | `org.glassfish.jersey.containers.glassfish` | `jersey-gf-ejb` | `2.35` | | `org.glassfish.jersey.core` | `jersey-client` | `2.35` | | `org.glassfish.jersey.core` | `jersey-common` | `2.35` | | `org.glassfish.jersey.core` | `jersey-server` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-bean-validation` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-declarative-linking` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-entity-filtering` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-metainf-services` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-mvc` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-mvc-bean-validation` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-mvc-freemarker` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-mvc-jsp` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-mvc-mustache` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-proxy-client` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-servlet-portability` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-spring4` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-spring5` | `2.35` | | `org.glassfish.jersey.ext` | `jersey-wadl-doclet` | `2.35` | | `org.glassfish.jersey.ext.cdi` | `jersey-cdi-rs-inject` | `2.35` | | `org.glassfish.jersey.ext.cdi` | `jersey-cdi1x` | `2.35` | | `org.glassfish.jersey.ext.cdi` | `jersey-cdi1x-ban-custom-hk2-binding` | `2.35` | | `org.glassfish.jersey.ext.cdi` | `jersey-cdi1x-servlet` | `2.35` | | `org.glassfish.jersey.ext.cdi` | `jersey-cdi1x-transaction` | `2.35` | | `org.glassfish.jersey.ext.cdi` | `jersey-cdi1x-validation` | `2.35` | | `org.glassfish.jersey.ext.cdi` | `jersey-weld2-se` | `2.35` | | `org.glassfish.jersey.ext.microprofile` | `jersey-mp-config` | `2.35` | | `org.glassfish.jersey.ext.microprofile` | `jersey-mp-rest-client` | `2.35` | | `org.glassfish.jersey.ext.rx` | `jersey-rx-client-guava` | `2.35` | | `org.glassfish.jersey.ext.rx` | `jersey-rx-client-rxjava` | `2.35` | | `org.glassfish.jersey.ext.rx` | `jersey-rx-client-rxjava2` | `2.35` | | `org.glassfish.jersey.inject` | `jersey-cdi2-se` | `2.35` | | `org.glassfish.jersey.inject` | `jersey-hk2` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-jaxb` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-json-binding` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-json-jackson` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-json-jettison` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-json-processing` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-kryo` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-moxy` | `2.35` | | `org.glassfish.jersey.media` | `jersey-media-multipart` | `2.35` | | `org.glassfish.jersey.media` | 
`jersey-media-sse` | `2.35` | | `org.glassfish.jersey.security` | `oauth1-client` | `2.35` | | `org.glassfish.jersey.security` | `oauth1-server` | `2.35` | | `org.glassfish.jersey.security` | `oauth1-signature` | `2.35` | | `org.glassfish.jersey.security` | `oauth2-client` | `2.35` | | `org.glassfish.jersey.test-framework` | `jersey-test-framework-core` | `2.35` | | `org.glassfish.jersey.test-framework` | `jersey-test-framework-util` | `2.35` | | `org.glassfish.jersey.test-framework.providers` | `jersey-test-framework-provider-bundle` | `2.35` | | `org.glassfish.jersey.test-framework.providers` | `jersey-test-framework-provider-external` | `2.35` | | `org.glassfish.jersey.test-framework.providers` | `jersey-test-framework-provider-grizzly2` | `2.35` | | `org.glassfish.jersey.test-framework.providers` | `jersey-test-framework-provider-inmemory` | `2.35` | | `org.glassfish.jersey.test-framework.providers` | `jersey-test-framework-provider-jdk-http` | `2.35` | | `org.glassfish.jersey.test-framework.providers` | `jersey-test-framework-provider-jetty` | `2.35` | | `org.glassfish.jersey.test-framework.providers` | `jersey-test-framework-provider-simple` | `2.35` | | `org.glassfish.web` | `jakarta.servlet.jsp.jstl` | `1.2.6` | | `org.hamcrest` | `hamcrest` | `2.2` | | `org.hamcrest` | `hamcrest-core` | `2.2` | | `org.hamcrest` | `hamcrest-library` | `2.2` | | `org.hibernate` | `hibernate-c3p0` | `5.6.9.Final` | | `org.hibernate` | `hibernate-core` | `5.6.9.Final` | | `org.hibernate` | `hibernate-ehcache` | `5.6.9.Final` | | `org.hibernate` | `hibernate-entitymanager` | `5.6.9.Final` | | `org.hibernate` | `hibernate-envers` | `5.6.9.Final` | | `org.hibernate` | `hibernate-hikaricp` | `5.6.9.Final` | | `org.hibernate` | `hibernate-java8` | `5.6.9.Final` | | `org.hibernate` | `hibernate-jcache` | `5.6.9.Final` | | `org.hibernate` | `hibernate-jpamodelgen` | `5.6.9.Final` | | `org.hibernate` | `hibernate-micrometer` | `5.6.9.Final` | | `org.hibernate` | `hibernate-proxool` | `5.6.9.Final` | | `org.hibernate` | `hibernate-spatial` | `5.6.9.Final` | | `org.hibernate` | `hibernate-testing` | `5.6.9.Final` | | `org.hibernate` | `hibernate-vibur` | `5.6.9.Final` | | `org.hibernate.validator` | `hibernate-validator` | `6.2.3.Final` | | `org.hibernate.validator` | `hibernate-validator-annotation-processor` | `6.2.3.Final` | | `org.hsqldb` | `hsqldb` | `2.5.2` | | `org.infinispan` | `infinispan-anchored-keys` | `13.0.10.Final` | | `org.infinispan` | `infinispan-api` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cachestore-jdbc` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cachestore-jpa` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cachestore-remote` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cachestore-rocksdb` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cachestore-sql` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cdi-common` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cdi-embedded` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cdi-remote` | `13.0.10.Final` | | `org.infinispan` | `infinispan-checkstyle` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cli-client` | `13.0.10.Final` | | `org.infinispan` | `infinispan-client-hotrod` | `13.0.10.Final` | | `org.infinispan` | `infinispan-client-rest` | `13.0.10.Final` | | `org.infinispan` | `infinispan-cloudevents-integration` | `13.0.10.Final` | | `org.infinispan` | `infinispan-clustered-counter` | `13.0.10.Final` | | `org.infinispan` | `infinispan-clustered-lock` | `13.0.10.Final` | | 
`org.infinispan` | `infinispan-commons` | `13.0.10.Final` | | `org.infinispan` | `infinispan-commons-test` | `13.0.10.Final` | | `org.infinispan` | `infinispan-component-annotations` | `13.0.10.Final` | | `org.infinispan` | `infinispan-component-processor` | `13.0.10.Final` | | `org.infinispan` | `infinispan-console` | `0.15.5.Final` | | `org.infinispan` | `infinispan-core` | `13.0.10.Final` | | `org.infinispan` | `infinispan-extended-statistics` | `13.0.10.Final` | | `org.infinispan` | `infinispan-hibernate-cache-commons` | `13.0.10.Final` | | `org.infinispan` | `infinispan-hibernate-cache-spi` | `13.0.10.Final` | | `org.infinispan` | `infinispan-hibernate-cache-v53` | `13.0.10.Final` | | `org.infinispan` | `infinispan-jboss-marshalling` | `13.0.10.Final` | | `org.infinispan` | `infinispan-jcache` | `13.0.10.Final` | | `org.infinispan` | `infinispan-jcache-commons` | `13.0.10.Final` | | `org.infinispan` | `infinispan-jcache-remote` | `13.0.10.Final` | | `org.infinispan` | `infinispan-key-value-store-client` | `13.0.10.Final` | | `org.infinispan` | `infinispan-marshaller-kryo` | `13.0.10.Final` | | `org.infinispan` | `infinispan-marshaller-kryo-bundle` | `13.0.10.Final` | | `org.infinispan` | `infinispan-marshaller-protostuff` | `13.0.10.Final` | | `org.infinispan` | `infinispan-marshaller-protostuff-bundle` | `13.0.10.Final` | | `org.infinispan` | `infinispan-multimap` | `13.0.10.Final` | | `org.infinispan` | `infinispan-objectfilter` | `13.0.10.Final` | | `org.infinispan` | `infinispan-query` | `13.0.10.Final` | | `org.infinispan` | `infinispan-query-core` | `13.0.10.Final` | | `org.infinispan` | `infinispan-query-dsl` | `13.0.10.Final` | | `org.infinispan` | `infinispan-remote-query-client` | `13.0.10.Final` | | `org.infinispan` | `infinispan-remote-query-server` | `13.0.10.Final` | | `org.infinispan` | `infinispan-scripting` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-core` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-hotrod` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-memcached` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-rest` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-router` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-runtime` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-testdriver-core` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-testdriver-junit4` | `13.0.10.Final` | | `org.infinispan` | `infinispan-server-testdriver-junit5` | `13.0.10.Final` | | `org.infinispan` | `infinispan-spring-boot-starter-embedded` | `13.0.10.Final` | | `org.infinispan` | `infinispan-spring-boot-starter-remote` | `13.0.10.Final` | | `org.infinispan` | `infinispan-spring5-common` | `13.0.10.Final` | | `org.infinispan` | `infinispan-spring5-embedded` | `13.0.10.Final` | | `org.infinispan` | `infinispan-spring5-remote` | `13.0.10.Final` | | `org.infinispan` | `infinispan-tasks` | `13.0.10.Final` | | `org.infinispan` | `infinispan-tasks-api` | `13.0.10.Final` | | `org.infinispan` | `infinispan-tools` | `13.0.10.Final` | | `org.infinispan.protostream` | `protostream` | `4.4.3.Final` | | `org.infinispan.protostream` | `protostream-processor` | `4.4.3.Final` | | `org.infinispan.protostream` | `protostream-types` | `4.4.3.Final` | | `org.influxdb` | `influxdb-java` | `2.22` | | `org.jboss.logging` | `jboss-logging` | `3.4.3.Final` | | `org.jdom` | `jdom2` | `2.0.6.1` | | `org.jetbrains.kotlin` | `kotlin-compiler` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-compiler-embeddable` | 
`1.6.21` | | `org.jetbrains.kotlin` | `kotlin-daemon-client` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-main-kts` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-osgi-bundle` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-reflect` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-script-runtime` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-script-util` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-scripting-common` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-scripting-ide-services` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-scripting-jvm` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-scripting-jvm-host` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-stdlib` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-stdlib-common` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-stdlib-jdk7` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-stdlib-jdk8` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-stdlib-js` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-test` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-test-annotations-common` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-test-common` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-test-js` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-test-junit` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-test-junit5` | `1.6.21` | | `org.jetbrains.kotlin` | `kotlin-test-testng` | `1.6.21` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-android` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-core` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-core-jvm` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-debug` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-guava` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-javafx` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-jdk8` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-jdk9` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-play-services` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-reactive` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-reactor` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-rx2` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-rx3` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-slf4j` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-swing` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-test` | `1.6.1` | | `org.jetbrains.kotlinx` | `kotlinx-coroutines-test-jvm` | `1.6.1` | | `org.jolokia` | `jolokia-core` | `1.7.1` | | `org.jooq` | `jooq` | `3.14.15` | | `org.jooq` | `jooq-codegen` | `3.14.15` | | `org.jooq` | `jooq-kotlin` | `3.14.15` | | `org.jooq` | `jooq-meta` | `3.14.15` | | `org.junit.jupiter` | `junit-jupiter` | `5.8.2` | | `org.junit.jupiter` | `junit-jupiter-api` | `5.8.2` | | `org.junit.jupiter` | `junit-jupiter-engine` | `5.8.2` | | `org.junit.jupiter` | `junit-jupiter-migrationsupport` | `5.8.2` | | `org.junit.jupiter` | `junit-jupiter-params` | `5.8.2` | | `org.junit.platform` | `junit-platform-commons` | `1.8.2` | | `org.junit.platform` | `junit-platform-console` | `1.8.2` | | `org.junit.platform` | `junit-platform-engine` | `1.8.2` | | `org.junit.platform` | `junit-platform-jfr` | `1.8.2` | | `org.junit.platform` | `junit-platform-launcher` | `1.8.2` | | `org.junit.platform` | `junit-platform-reporting` | `1.8.2` | | `org.junit.platform` | `junit-platform-runner` | `1.8.2` | | `org.junit.platform` | `junit-platform-suite` | `1.8.2` | | `org.junit.platform` | 
`junit-platform-suite-api` | `1.8.2` | | `org.junit.platform` | `junit-platform-suite-commons` | `1.8.2` | | `org.junit.platform` | `junit-platform-suite-engine` | `1.8.2` | | `org.junit.platform` | `junit-platform-testkit` | `1.8.2` | | `org.junit.vintage` | `junit-vintage-engine` | `5.8.2` | | `org.jvnet.mimepull` | `mimepull` | `1.10.0` | | `org.liquibase` | `liquibase-cdi` | `4.9.1` | | `org.liquibase` | `liquibase-core` | `4.9.1` | | `org.mariadb` | `r2dbc-mariadb` | `1.1.1-rc` | | `org.mariadb.jdbc` | `mariadb-java-client` | `3.0.4` | | `org.messaginghub` | `pooled-jms` | `1.2.4` | | `org.mockito` | `mockito-android` | `4.5.1` | | `org.mockito` | `mockito-core` | `4.5.1` | | `org.mockito` | `mockito-errorprone` | `4.5.1` | | `org.mockito` | `mockito-inline` | `4.5.1` | | `org.mockito` | `mockito-junit-jupiter` | `4.5.1` | | `org.mockito` | `mockito-proxy` | `4.5.1` | | `org.mongodb` | `bson` | `4.6.0` | | `org.mongodb` | `mongodb-driver-core` | `4.6.0` | | `org.mongodb` | `mongodb-driver-legacy` | `4.6.0` | | `org.mongodb` | `mongodb-driver-reactivestreams` | `4.6.0` | | `org.mongodb` | `mongodb-driver-sync` | `4.6.0` | | `org.mortbay.jasper` | `apache-el` | `9.0.52` | | `org.neo4j.driver` | `neo4j-java-driver` | `4.4.5` | | `org.postgresql` | `postgresql` | `42.3.5` | | `org.postgresql` | `r2dbc-postgresql` | `0.9.1.RELEASE` | | `org.projectlombok` | `lombok` | `1.18.24` | | `org.quartz-scheduler` | `quartz` | `2.3.2` | | `org.quartz-scheduler` | `quartz-jobs` | `2.3.2` | | `org.reactivestreams` | `reactive-streams` | `1.0.3` | | `org.seleniumhq.selenium` | `htmlunit-driver` | `3.61.0` | | `org.seleniumhq.selenium` | `selenium-api` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-chrome-driver` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-edge-driver` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-firefox-driver` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-ie-driver` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-java` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-opera-driver` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-remote-driver` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-safari-driver` | `4.1.4` | | `org.seleniumhq.selenium` | `selenium-support` | `4.1.4` | | `org.skyscreamer` | `jsonassert` | `1.5.0` | | `org.slf4j` | `jcl-over-slf4j` | `1.7.36` | | `org.slf4j` | `jul-to-slf4j` | `1.7.36` | | `org.slf4j` | `log4j-over-slf4j` | `1.7.36` | | `org.slf4j` | `slf4j-api` | `1.7.36` | | `org.slf4j` | `slf4j-ext` | `1.7.36` | | `org.slf4j` | `slf4j-jcl` | `1.7.36` | | `org.slf4j` | `slf4j-jdk14` | `1.7.36` | | `org.slf4j` | `slf4j-log4j12` | `1.7.36` | | `org.slf4j` | `slf4j-nop` | `1.7.36` | | `org.slf4j` | `slf4j-simple` | `1.7.36` | | `org.springframework` | `spring-aop` | `5.3.20` | | `org.springframework` | `spring-aspects` | `5.3.20` | | `org.springframework` | `spring-beans` | `5.3.20` | | `org.springframework` | `spring-context` | `5.3.20` | | `org.springframework` | `spring-context-indexer` | `5.3.20` | | `org.springframework` | `spring-context-support` | `5.3.20` | | `org.springframework` | `spring-core` | `5.3.20` | | `org.springframework` | `spring-expression` | `5.3.20` | | `org.springframework` | `spring-instrument` | `5.3.20` | | `org.springframework` | `spring-jcl` | `5.3.20` | | `org.springframework` | `spring-jdbc` | `5.3.20` | | `org.springframework` | `spring-jms` | `5.3.20` | | `org.springframework` | `spring-messaging` | `5.3.20` | | `org.springframework` | `spring-orm` | `5.3.20` | | `org.springframework` | 
`spring-oxm` | `5.3.20` | | `org.springframework` | `spring-r2dbc` | `5.3.20` | | `org.springframework` | `spring-test` | `5.3.20` | | `org.springframework` | `spring-tx` | `5.3.20` | | `org.springframework` | `spring-web` | `5.3.20` | | `org.springframework` | `spring-webflux` | `5.3.20` | | `org.springframework` | `spring-webmvc` | `5.3.20` | | `org.springframework` | `spring-websocket` | `5.3.20` | | `org.springframework.amqp` | `spring-amqp` | `2.4.5` | | `org.springframework.amqp` | `spring-rabbit` | `2.4.5` | | `org.springframework.amqp` | `spring-rabbit-junit` | `2.4.5` | | `org.springframework.amqp` | `spring-rabbit-stream` | `2.4.5` | | `org.springframework.amqp` | `spring-rabbit-test` | `2.4.5` | | `org.springframework.batch` | `spring-batch-core` | `4.3.6` | | `org.springframework.batch` | `spring-batch-infrastructure` | `4.3.6` | | `org.springframework.batch` | `spring-batch-integration` | `4.3.6` | | `org.springframework.batch` | `spring-batch-test` | `4.3.6` | | `org.springframework.boot` | `spring-boot` | `2.7.0` | | `org.springframework.boot` | `spring-boot-actuator` | `2.7.0` | | `org.springframework.boot` | `spring-boot-actuator-autoconfigure` | `2.7.0` | | `org.springframework.boot` | `spring-boot-autoconfigure` | `2.7.0` | | `org.springframework.boot` | `spring-boot-autoconfigure-processor` | `2.7.0` | | `org.springframework.boot` | `spring-boot-buildpack-platform` | `2.7.0` | | `org.springframework.boot` | `spring-boot-configuration-metadata` | `2.7.0` | | `org.springframework.boot` | `spring-boot-configuration-processor` | `2.7.0` | | `org.springframework.boot` | `spring-boot-devtools` | `2.7.0` | | `org.springframework.boot` | `spring-boot-jarmode-layertools` | `2.7.0` | | `org.springframework.boot` | `spring-boot-loader` | `2.7.0` | | `org.springframework.boot` | `spring-boot-loader-tools` | `2.7.0` | | `org.springframework.boot` | `spring-boot-properties-migrator` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-activemq` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-actuator` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-amqp` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-aop` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-artemis` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-batch` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-cache` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-cassandra` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-cassandra-reactive` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-couchbase` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-couchbase-reactive` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-elasticsearch` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-jdbc` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-jpa` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-ldap` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-mongodb` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-mongodb-reactive` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-neo4j` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-r2dbc` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-redis` | `2.7.0` | | 
`org.springframework.boot` | `spring-boot-starter-data-redis-reactive` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-data-rest` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-freemarker` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-graphql` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-groovy-templates` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-hateoas` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-integration` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-jdbc` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-jersey` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-jetty` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-jooq` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-json` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-jta-atomikos` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-log4j2` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-logging` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-mail` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-mustache` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-oauth2-client` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-oauth2-resource-server` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-quartz` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-reactor-netty` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-rsocket` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-security` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-test` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-thymeleaf` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-tomcat` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-undertow` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-validation` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-web` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-web-services` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-webflux` | `2.7.0` | | `org.springframework.boot` | `spring-boot-starter-websocket` | `2.7.0` | | `org.springframework.boot` | `spring-boot-test` | `2.7.0` | | `org.springframework.boot` | `spring-boot-test-autoconfigure` | `2.7.0` | | `org.springframework.data` | `spring-data-cassandra` | `3.4.0` | | `org.springframework.data` | `spring-data-commons` | `2.7.0` | | `org.springframework.data` | `spring-data-couchbase` | `4.4.0` | | `org.springframework.data` | `spring-data-elasticsearch` | `4.4.0` | | `org.springframework.data` | `spring-data-envers` | `2.7.0` | | `org.springframework.data` | `spring-data-geode` | `2.7.0` | | `org.springframework.data` | `spring-data-jdbc` | `2.4.0` | | `org.springframework.data` | `spring-data-jpa` | `2.7.0` | | `org.springframework.data` | `spring-data-keyvalue` | `2.7.0` | | `org.springframework.data` | `spring-data-ldap` | `2.7.0` | | `org.springframework.data` | `spring-data-mongodb` | `3.4.0` | | `org.springframework.data` | `spring-data-neo4j` | `6.3.0` | | `org.springframework.data` | `spring-data-r2dbc` | `1.5.0` | | `org.springframework.data` | `spring-data-redis` | `2.7.0` | | `org.springframework.data` | `spring-data-relational` | `2.4.0` | | `org.springframework.data` | `spring-data-rest-core` | 
`3.7.0` | | `org.springframework.data` | `spring-data-rest-hal-explorer` | `3.7.0` | | `org.springframework.data` | `spring-data-rest-webmvc` | `3.7.0` | | `org.springframework.graphql` | `spring-graphql` | `1.0.0` | | `org.springframework.graphql` | `spring-graphql-test` | `1.0.0` | | `org.springframework.hateoas` | `spring-hateoas` | `1.5.0` | | `org.springframework.integration` | `spring-integration-amqp` | `5.5.12` | | `org.springframework.integration` | `spring-integration-core` | `5.5.12` | | `org.springframework.integration` | `spring-integration-event` | `5.5.12` | | `org.springframework.integration` | `spring-integration-feed` | `5.5.12` | | `org.springframework.integration` | `spring-integration-file` | `5.5.12` | | `org.springframework.integration` | `spring-integration-ftp` | `5.5.12` | | `org.springframework.integration` | `spring-integration-gemfire` | `5.5.12` | | `org.springframework.integration` | `spring-integration-groovy` | `5.5.12` | | `org.springframework.integration` | `spring-integration-http` | `5.5.12` | | `org.springframework.integration` | `spring-integration-ip` | `5.5.12` | | `org.springframework.integration` | `spring-integration-jdbc` | `5.5.12` | | `org.springframework.integration` | `spring-integration-jms` | `5.5.12` | | `org.springframework.integration` | `spring-integration-jmx` | `5.5.12` | | `org.springframework.integration` | `spring-integration-jpa` | `5.5.12` | | `org.springframework.integration` | `spring-integration-kafka` | `5.5.12` | | `org.springframework.integration` | `spring-integration-mail` | `5.5.12` | | `org.springframework.integration` | `spring-integration-mongodb` | `5.5.12` | | `org.springframework.integration` | `spring-integration-mqtt` | `5.5.12` | | `org.springframework.integration` | `spring-integration-r2dbc` | `5.5.12` | | `org.springframework.integration` | `spring-integration-redis` | `5.5.12` | | `org.springframework.integration` | `spring-integration-rmi` | `5.5.12` | | `org.springframework.integration` | `spring-integration-rsocket` | `5.5.12` | | `org.springframework.integration` | `spring-integration-scripting` | `5.5.12` | | `org.springframework.integration` | `spring-integration-security` | `5.5.12` | | `org.springframework.integration` | `spring-integration-sftp` | `5.5.12` | | `org.springframework.integration` | `spring-integration-stomp` | `5.5.12` | | `org.springframework.integration` | `spring-integration-stream` | `5.5.12` | | `org.springframework.integration` | `spring-integration-syslog` | `5.5.12` | | `org.springframework.integration` | `spring-integration-test` | `5.5.12` | | `org.springframework.integration` | `spring-integration-test-support` | `5.5.12` | | `org.springframework.integration` | `spring-integration-webflux` | `5.5.12` | | `org.springframework.integration` | `spring-integration-websocket` | `5.5.12` | | `org.springframework.integration` | `spring-integration-ws` | `5.5.12` | | `org.springframework.integration` | `spring-integration-xml` | `5.5.12` | | `org.springframework.integration` | `spring-integration-xmpp` | `5.5.12` | | `org.springframework.integration` | `spring-integration-zeromq` | `5.5.12` | | `org.springframework.integration` | `spring-integration-zookeeper` | `5.5.12` | | `org.springframework.kafka` | `spring-kafka` | `2.8.6` | | `org.springframework.kafka` | `spring-kafka-test` | `2.8.6` | | `org.springframework.ldap` | `spring-ldap-core` | `2.4.0` | | `org.springframework.ldap` | `spring-ldap-core-tiger` | `2.4.0` | | `org.springframework.ldap` | `spring-ldap-ldif-batch` | 
`2.4.0` | | `org.springframework.ldap` | `spring-ldap-ldif-core` | `2.4.0` | | `org.springframework.ldap` | `spring-ldap-odm` | `2.4.0` | | `org.springframework.ldap` | `spring-ldap-test` | `2.4.0` | | `org.springframework.restdocs` | `spring-restdocs-asciidoctor` | `2.0.6.RELEASE` | | `org.springframework.restdocs` | `spring-restdocs-core` | `2.0.6.RELEASE` | | `org.springframework.restdocs` | `spring-restdocs-mockmvc` | `2.0.6.RELEASE` | | `org.springframework.restdocs` | `spring-restdocs-restassured` | `2.0.6.RELEASE` | | `org.springframework.restdocs` | `spring-restdocs-webtestclient` | `2.0.6.RELEASE` | | `org.springframework.retry` | `spring-retry` | `1.3.3` | | `org.springframework.security` | `spring-security-acl` | `5.7.1` | | `org.springframework.security` | `spring-security-aspects` | `5.7.1` | | `org.springframework.security` | `spring-security-cas` | `5.7.1` | | `org.springframework.security` | `spring-security-config` | `5.7.1` | | `org.springframework.security` | `spring-security-core` | `5.7.1` | | `org.springframework.security` | `spring-security-crypto` | `5.7.1` | | `org.springframework.security` | `spring-security-data` | `5.7.1` | | `org.springframework.security` | `spring-security-ldap` | `5.7.1` | | `org.springframework.security` | `spring-security-messaging` | `5.7.1` | | `org.springframework.security` | `spring-security-oauth2-client` | `5.7.1` | | `org.springframework.security` | `spring-security-oauth2-core` | `5.7.1` | | `org.springframework.security` | `spring-security-oauth2-jose` | `5.7.1` | | `org.springframework.security` | `spring-security-oauth2-resource-server` | `5.7.1` | | `org.springframework.security` | `spring-security-openid` | `5.7.1` | | `org.springframework.security` | `spring-security-remoting` | `5.7.1` | | `org.springframework.security` | `spring-security-rsocket` | `5.7.1` | | `org.springframework.security` | `spring-security-saml2-service-provider` | `5.7.1` | | `org.springframework.security` | `spring-security-taglibs` | `5.7.1` | | `org.springframework.security` | `spring-security-test` | `5.7.1` | | `org.springframework.security` | `spring-security-web` | `5.7.1` | | `org.springframework.session` | `spring-session-core` | `2.7.0` | | `org.springframework.session` | `spring-session-data-geode` | `2.7.0` | | `org.springframework.session` | `spring-session-data-mongodb` | `2.7.0` | | `org.springframework.session` | `spring-session-data-redis` | `2.7.0` | | `org.springframework.session` | `spring-session-hazelcast` | `2.7.0` | | `org.springframework.session` | `spring-session-jdbc` | `2.7.0` | | `org.springframework.ws` | `spring-ws-core` | `3.1.3` | | `org.springframework.ws` | `spring-ws-security` | `3.1.3` | | `org.springframework.ws` | `spring-ws-support` | `3.1.3` | | `org.springframework.ws` | `spring-ws-test` | `3.1.3` | | `org.springframework.ws` | `spring-xml` | `3.1.3` | | `org.testcontainers` | `azure` | `1.16.2` | | `org.testcontainers` | `cassandra` | `1.16.2` | | `org.testcontainers` | `clickhouse` | `1.16.2` | | `org.testcontainers` | `cockroachdb` | `1.16.2` | | `org.testcontainers` | `couchbase` | `1.16.2` | | `org.testcontainers` | `database-commons` | `1.16.2` | | `org.testcontainers` | `db2` | `1.16.2` | | `org.testcontainers` | `dynalite` | `1.16.2` | | `org.testcontainers` | `elasticsearch` | `1.16.2` | | `org.testcontainers` | `gcloud` | `1.16.2` | | `org.testcontainers` | `influxdb` | `1.16.2` | | `org.testcontainers` | `jdbc` | `1.16.2` | | `org.testcontainers` | `junit-jupiter` | `1.16.2` | | `org.testcontainers` | 
`kafka` | `1.16.2` | | `org.testcontainers` | `localstack` | `1.16.2` | | `org.testcontainers` | `mariadb` | `1.16.2` | | `org.testcontainers` | `mockserver` | `1.16.2` | | `org.testcontainers` | `mongodb` | `1.16.2` | | `org.testcontainers` | `mssqlserver` | `1.16.2` | | `org.testcontainers` | `mysql` | `1.16.2` | | `org.testcontainers` | `neo4j` | `1.16.2` | | `org.testcontainers` | `nginx` | `1.16.2` | | `org.testcontainers` | `oracle-xe` | `1.16.2` | | `org.testcontainers` | `orientdb` | `1.16.2` | | `org.testcontainers` | `postgresql` | `1.16.2` | | `org.testcontainers` | `presto` | `1.16.2` | | `org.testcontainers` | `pulsar` | `1.16.2` | | `org.testcontainers` | `r2dbc` | `1.16.2` | | `org.testcontainers` | `rabbitmq` | `1.16.2` | | `org.testcontainers` | `selenium` | `1.16.2` | | `org.testcontainers` | `solr` | `1.16.2` | | `org.testcontainers` | `spock` | `1.16.2` | | `org.testcontainers` | `testcontainers` | `1.16.2` | | `org.testcontainers` | `toxiproxy` | `1.16.2` | | `org.testcontainers` | `trino` | `1.16.2` | | `org.testcontainers` | `vault` | `1.16.2` | | `org.thymeleaf` | `thymeleaf` | `3.0.15.RELEASE` | | `org.thymeleaf` | `thymeleaf-spring5` | `3.0.15.RELEASE` | | `org.thymeleaf.extras` | `thymeleaf-extras-java8time` | `3.0.4.RELEASE` | | `org.thymeleaf.extras` | `thymeleaf-extras-springsecurity5` | `3.0.4.RELEASE` | | `org.webjars` | `webjars-locator-core` | `0.50` | | `org.xerial` | `sqlite-jdbc` | `3.36.0.3` | | `org.xmlunit` | `xmlunit-assertj` | `2.9.0` | | `org.xmlunit` | `xmlunit-core` | `2.9.0` | | `org.xmlunit` | `xmlunit-legacy` | `2.9.0` | | `org.xmlunit` | `xmlunit-matchers` | `2.9.0` | | `org.xmlunit` | `xmlunit-placeholders` | `2.9.0` | | `org.yaml` | `snakeyaml` | `1.30` | | `redis.clients` | `jedis` | `3.8.0` | | `wsdl4j` | `wsdl4j` | `1.6.3` | 2. Version Properties ---------------------- The following table provides all properties that can be used to override the versions managed by Spring Boot. Browse the [`spring-boot-dependencies` build.gradle](https://github.com/spring-projects/spring-boot/tree/v2.7.0/spring-boot-project/spring-boot-dependencies/build.gradle) for a complete list of dependencies. You can learn how to customize these versions in your application in the [Build Tool Plugins documentation](build-tool-plugins#build-tool-plugins). 
| Library | Version Property | | --- | --- | | `ActiveMQ` | `activemq.version` | | `ANTLR2` | `antlr2.version` | | `AppEngine SDK` | `appengine-sdk.version` | | `Artemis` | `artemis.version` | | `AspectJ` | `aspectj.version` | | `AssertJ` | `assertj.version` | | `Atomikos` | `atomikos.version` | | `Awaitility` | `awaitility.version` | | `Build Helper Maven Plugin` | `build-helper-maven-plugin.version` | | `Byte Buddy` | `byte-buddy.version` | | `cache2k` | `cache2k.version` | | `Caffeine` | `caffeine.version` | | `Cassandra Driver` | `cassandra-driver.version` | | `Classmate` | `classmate.version` | | `Commons Codec` | `commons-codec.version` | | `Commons DBCP2` | `commons-dbcp2.version` | | `Commons Lang3` | `commons-lang3.version` | | `Commons Pool` | `commons-pool.version` | | `Commons Pool2` | `commons-pool2.version` | | `Couchbase Client` | `couchbase-client.version` | | `DB2 JDBC` | `db2-jdbc.version` | | `Dependency Management Plugin` | `dependency-management-plugin.version` | | `Derby` | `derby.version` | | `Dropwizard Metrics` | `dropwizard-metrics.version` | | `Ehcache` | `ehcache.version` | | `Ehcache3` | `ehcache3.version` | | `Elasticsearch` | `elasticsearch.version` | | `Embedded Mongo` | `embedded-mongo.version` | | `Flyway` | `flyway.version` | | `FreeMarker` | `freemarker.version` | | `Git Commit ID Plugin` | `git-commit-id-plugin.version` | | `Glassfish EL` | `glassfish-el.version` | | `Glassfish JAXB` | `glassfish-jaxb.version` | | `Glassfish JSTL` | `glassfish-jstl.version` | | `GraphQL Java` | `graphql-java.version` | | `Groovy` | `groovy.version` | | `Gson` | `gson.version` | | `H2` | `h2.version` | | `Hamcrest` | `hamcrest.version` | | `Hazelcast` | `hazelcast.version` | | `Hazelcast Hibernate5` | `hazelcast-hibernate5.version` | | `Hibernate` | `hibernate.version` | | `Hibernate Validator` | `hibernate-validator.version` | | `HikariCP` | `hikaricp.version` | | `HSQLDB` | `hsqldb.version` | | `HtmlUnit` | `htmlunit.version` | | `HttpAsyncClient` | `httpasyncclient.version` | | `HttpClient` | `httpclient.version` | | `HttpClient5` | `httpclient5.version` | | `HttpCore` | `httpcore.version` | | `HttpCore5` | `httpcore5.version` | | `Infinispan` | `infinispan.version` | | `InfluxDB Java` | `influxdb-java.version` | | `Jackson Bom` | `jackson-bom.version` | | `Jakarta Activation` | `jakarta-activation.version` | | `Jakarta Annotation` | `jakarta-annotation.version` | | `Jakarta JMS` | `jakarta-jms.version` | | `Jakarta Json` | `jakarta-json.version` | | `Jakarta Json Bind` | `jakarta-json-bind.version` | | `Jakarta Mail` | `jakarta-mail.version` | | `Jakarta Management` | `jakarta-management.version` | | `Jakarta Persistence` | `jakarta-persistence.version` | | `Jakarta Servlet` | `jakarta-servlet.version` | | `Jakarta Servlet JSP JSTL` | `jakarta-servlet-jsp-jstl.version` | | `Jakarta Transaction` | `jakarta-transaction.version` | | `Jakarta Validation` | `jakarta-validation.version` | | `Jakarta WebSocket` | `jakarta-websocket.version` | | `Jakarta WS RS` | `jakarta-ws-rs.version` | | `Jakarta XML Bind` | `jakarta-xml-bind.version` | | `Jakarta XML SOAP` | `jakarta-xml-soap.version` | | `Jakarta XML WS` | `jakarta-xml-ws.version` | | `Janino` | `janino.version` | | `Javax Activation` | `javax-activation.version` | | `Javax Annotation` | `javax-annotation.version` | | `Javax Cache` | `javax-cache.version` | | `Javax JAXB` | `javax-jaxb.version` | | `Javax JAXWS` | `javax-jaxws.version` | | `Javax JMS` | `javax-jms.version` | | `Javax Json` | `javax-json.version` | | 
`Javax JsonB` | `javax-jsonb.version` | | `Javax Mail` | `javax-mail.version` | | `Javax Money` | `javax-money.version` | | `Javax Persistence` | `javax-persistence.version` | | `Javax Transaction` | `javax-transaction.version` | | `Javax Validation` | `javax-validation.version` | | `Javax WebSocket` | `javax-websocket.version` | | `Jaxen` | `jaxen.version` | | `Jaybird` | `jaybird.version` | | `JBoss Logging` | `jboss-logging.version` | | `JDOM2` | `jdom2.version` | | `Jedis` | `jedis.version` | | `Jersey` | `jersey.version` | | `Jetty` | `jetty.version` | | `Jetty EL` | `jetty-el.version` | | `Jetty JSP` | `jetty-jsp.version` | | `Jetty Reactive HTTPClient` | `jetty-reactive-httpclient.version` | | `JMustache` | `jmustache.version` | | `Johnzon` | `johnzon.version` | | `Jolokia` | `jolokia.version` | | `jOOQ` | `jooq.version` | | `Json Path` | `json-path.version` | | `Json-smart` | `json-smart.version` | | `JsonAssert` | `jsonassert.version` | | `JSTL` | `jstl.version` | | `JTDS` | `jtds.version` | | `JUnit` | `junit.version` | | `JUnit Jupiter` | `junit-jupiter.version` | | `Kafka` | `kafka.version` | | `Kotlin` | `kotlin.version` | | `Kotlin Coroutines` | `kotlin-coroutines.version` | | `Lettuce` | `lettuce.version` | | `Liquibase` | `liquibase.version` | | `Log4j2` | `log4j2.version` | | `Logback` | `logback.version` | | `Lombok` | `lombok.version` | | `MariaDB` | `mariadb.version` | | `Maven AntRun Plugin` | `maven-antrun-plugin.version` | | `Maven Assembly Plugin` | `maven-assembly-plugin.version` | | `Maven Clean Plugin` | `maven-clean-plugin.version` | | `Maven Compiler Plugin` | `maven-compiler-plugin.version` | | `Maven Dependency Plugin` | `maven-dependency-plugin.version` | | `Maven Deploy Plugin` | `maven-deploy-plugin.version` | | `Maven Enforcer Plugin` | `maven-enforcer-plugin.version` | | `Maven Failsafe Plugin` | `maven-failsafe-plugin.version` | | `Maven Help Plugin` | `maven-help-plugin.version` | | `Maven Install Plugin` | `maven-install-plugin.version` | | `Maven Invoker Plugin` | `maven-invoker-plugin.version` | | `Maven Jar Plugin` | `maven-jar-plugin.version` | | `Maven Javadoc Plugin` | `maven-javadoc-plugin.version` | | `Maven Resources Plugin` | `maven-resources-plugin.version` | | `Maven Shade Plugin` | `maven-shade-plugin.version` | | `Maven Source Plugin` | `maven-source-plugin.version` | | `Maven Surefire Plugin` | `maven-surefire-plugin.version` | | `Maven War Plugin` | `maven-war-plugin.version` | | `Micrometer` | `micrometer.version` | | `MIMEPull` | `mimepull.version` | | `Mockito` | `mockito.version` | | `MongoDB` | `mongodb.version` | | `MSSQL JDBC` | `mssql-jdbc.version` | | `MySQL` | `mysql.version` | | `NekoHTML` | `nekohtml.version` | | `Neo4j Java Driver` | `neo4j-java-driver.version` | | `Netty` | `netty.version` | | `OkHttp` | `okhttp.version` | | `Oracle Database` | `oracle-database.version` | | `Pooled JMS` | `pooled-jms.version` | | `Postgresql` | `postgresql.version` | | `Prometheus Client` | `prometheus-client.version` | | `Quartz` | `quartz.version` | | `QueryDSL` | `querydsl.version` | | `R2DBC Bom` | `r2dbc-bom.version` | | `Rabbit AMQP Client` | `rabbit-amqp-client.version` | | `Rabbit Stream Client` | `rabbit-stream-client.version` | | `Reactive Streams` | `reactive-streams.version` | | `Reactor Bom` | `reactor-bom.version` | | `REST Assured` | `rest-assured.version` | | `RSocket` | `rsocket.version` | | `RxJava` | `rxjava.version` | | `RxJava Adapter` | `rxjava-adapter.version` | | `RxJava2` | `rxjava2.version` | | `SAAJ Impl` | 
`saaj-impl.version` | | `Selenium` | `selenium.version` | | `Selenium HtmlUnit` | `selenium-htmlunit.version` | | `SendGrid` | `sendgrid.version` | | `Servlet API` | `servlet-api.version` | | `SLF4J` | `slf4j.version` | | `SnakeYAML` | `snakeyaml.version` | | `Solr` | `solr.version` | | `Spring AMQP` | `spring-amqp.version` | | `Spring Batch` | `spring-batch.version` | | `Spring Data Bom` | `spring-data-bom.version` | | `Spring Framework` | `spring-framework.version` | | `Spring GraphQL` | `spring-graphql.version` | | `Spring HATEOAS` | `spring-hateoas.version` | | `Spring Integration` | `spring-integration.version` | | `Spring Kafka` | `spring-kafka.version` | | `Spring LDAP` | `spring-ldap.version` | | `Spring RESTDocs` | `spring-restdocs.version` | | `Spring Retry` | `spring-retry.version` | | `Spring Security` | `spring-security.version` | | `Spring Session Bom` | `spring-session-bom.version` | | `Spring WS` | `spring-ws.version` | | `SQLite JDBC` | `sqlite-jdbc.version` | | `Sun Mail` | `sun-mail.version` | | `Thymeleaf` | `thymeleaf.version` | | `Thymeleaf Extras Data Attribute` | `thymeleaf-extras-data-attribute.version` | | `Thymeleaf Extras Java8Time` | `thymeleaf-extras-java8time.version` | | `Thymeleaf Extras SpringSecurity` | `thymeleaf-extras-springsecurity.version` | | `Thymeleaf Layout Dialect` | `thymeleaf-layout-dialect.version` | | `Tomcat` | `tomcat.version` | | `UnboundID LDAPSDK` | `unboundid-ldapsdk.version` | | `Undertow` | `undertow.version` | | `Versions Maven Plugin` | `versions-maven-plugin.version` | | `WebJars Locator Core` | `webjars-locator-core.version` | | `WSDL4j` | `wsdl4j.version` | | `XML Maven Plugin` | `xml-maven-plugin.version` | | `XmlUnit2` | `xmlunit2.version` |
fish Interactive use Interactive use =============== Fish prides itself on being really nice to use interactively. That’s down to a few features we’ll explain in the next few sections. Fish is used by giving commands in the fish language, see [The Fish Language](language#language) for information on that. Help ---- Fish has an extensive help system. Use the [help](cmds/help) command to obtain help on a specific subject or command. For instance, writing `help syntax` displays the [syntax section](language#syntax) of this documentation. Fish also has man pages for its commands, and translates the help pages to man pages. For example, `man set` will show the documentation for `set` as a man page. Help on a specific builtin can also be obtained with the `-h` parameter. For instance, to obtain help on the [fg](cmds/fg) builtin, either type `fg -h` or `help fg`. The main page can be viewed via `help index` (or just `help`) or `man fish-doc`. The tutorial can be viewed with `help tutorial` or `man fish-tutorial`. Autosuggestions --------------- fish suggests commands as you type, based on [command history](#history-search), completions, and valid file paths. As you type commands, you will see a suggestion offered after the cursor, in a muted gray color (which can be changed with the `fish_color_autosuggestion` variable). To accept the autosuggestion (replacing the command line contents), press `→` or `Control`+`F`. To accept the first suggested word, press `Alt`+`→` or `Alt`+`F`. If the autosuggestion is not what you want, just ignore it: it won’t execute unless you accept it. Autosuggestions are a powerful way to quickly summon frequently entered commands, by typing the first few characters. They are also an efficient technique for navigating through directory hierarchies. If you don’t like autosuggestions, you can disable them by setting `$fish_autosuggestion_enabled` to 0: ``` set -g fish_autosuggestion_enabled 0 ``` Tab Completion -------------- Tab completion is a time saving feature of any modern shell. When you type `Tab`, fish tries to guess the rest of the word under the cursor. If it finds just one possibility, it inserts it. If it finds more, it inserts the longest unambiguous part and then opens a menu (the “pager”) that you can navigate to find what you’re looking for. The pager can be navigated with the arrow keys, `Page Up` / `Page Down`, `Tab` or `Shift`+`Tab`. Pressing `Control`+`S` (the `pager-toggle-search` binding - `/` in vi-mode) opens up a search menu that you can use to filter the list. Fish provides some general purpose completions, like for commands, variable names, usernames or files. It also provides a large number of program specific scripted completions. Most of these completions are simple options like the `-l` option for `ls`, but a lot are more advanced. For example: * `man` and `whatis` show the installed manual pages as completions. * `make` uses targets in the Makefile in the current directory as completions. * `mount` uses mount points specified in fstab as completions. * `apt`, `rpm` and `yum` show installed or installable packages. You can also write your own completions or install some you got from someone else. For that, see [Writing your own completions](completions#completion-own). Completion scripts are loaded on demand, just like [functions are](language#syntax-function-autoloading). The difference is the `$fish_complete_path` [list](language#variables-lists) is used instead of `$fish_function_path`.
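To get a feel for what a scripted completion looks like, here is a minimal sketch for a hypothetical `mytool` command (the command and its options are invented purely for illustration; see the `complete` builtin for the full details):

```
# Completions for a hypothetical "mytool" command.
# -c names the command, -s/-l give a short/long option, -d adds a description.
complete -c mytool -s h -l help -d "Show help"
complete -c mytool -s v -l verbose -d "Print more output"
# -x disables file completion for this option, -a lists the allowed values.
complete -c mytool -l color -x -a "auto always never" -d "When to use color"
```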
Typically you can drop new completions in ~/.config/fish/completions/name-of-command.fish and fish will find them automatically. Syntax highlighting ------------------- Fish interprets the command line as it is typed and uses syntax highlighting to provide feedback. The most important feedback is the detection of potential errors. By default, errors are marked red. Detected errors include: * Non-existing commands. * Reading from or appending to a non-existing file. * Incorrect use of output redirects * Mismatched parenthesis To customize the syntax highlighting, you can set the environment variables listed in the [Variables for changing highlighting colors](#variables-color) section. Fish also provides pre-made color themes you can pick with [fish\_config](cmds/fish_config). Running just `fish_config` opens a browser interface, or you can use `fish_config theme` in the terminal. For example, to disable nearly all coloring: ``` fish_config theme choose none ``` Or, to see all themes, right in your terminal: ``` fish_config theme show ``` ### Syntax highlighting variables The colors used by fish for syntax highlighting can be configured by changing the values of various variables. The value of these variables can be one of the colors accepted by the [set\_color](cmds/set_color) command. The modifier switches accepted by `set_color` like `--bold`, `--dim`, `--italics`, `--reverse` and `--underline` are also accepted. Example: to make errors highlighted and red, use: ``` set fish_color_error red --bold ``` The following variables are available to change the highlighting colors in fish: | Variable | Meaning | | --- | --- | | `fish_color_normal` | default color | | `fish_color_command` | commands like echo | | `fish_color_keyword` | keywords like if - this falls back on the command color if unset | | `fish_color_quote` | quoted text like `"abc"` | | `fish_color_redirection` | IO redirections like >/dev/null | | `fish_color_end` | process separators like `;` and `&` | | `fish_color_error` | syntax errors | | `fish_color_param` | ordinary command parameters | | `fish_color_valid_path` | parameters that are filenames (if the file exists) | | `fish_color_option` | options starting with “-”, up to the first “--” parameter | | `fish_color_comment` | comments like ‘# important’ | | `fish_color_selection` | selected text in vi visual mode | | `fish_color_operator` | parameter expansion operators like `*` and `~` | | `fish_color_escape` | character escapes like `\n` and `\x70` | | `fish_color_autosuggestion` | autosuggestions (the proposed rest of a command) | | `fish_color_cwd` | the current working directory in the default prompt | | `fish_color_cwd_root` | the current working directory in the default prompt for the root user | | `fish_color_user` | the username in the default prompt | | `fish_color_host` | the hostname in the default prompt | | `fish_color_host_remote` | the hostname in the default prompt for remote sessions (like ssh) | | `fish_color_status` | the last command’s nonzero exit code in the default prompt | | `fish_color_cancel` | the ‘^C’ indicator on a canceled command | | `fish_color_search_match` | history search matches and selected pager items (background only) | If a variable isn’t set or is empty, fish usually tries `$fish_color_normal`, except for: * `$fish_color_keyword`, where it tries `$fish_color_command` first. * `$fish_color_option`, where it tries `$fish_color_param` first. 
* For `$fish_color_valid_path`, if that doesn’t have a color, but only modifiers, it adds those to the color that would otherwise be used, like `$fish_color_param`. But if valid paths have a color, it uses that and adds in modifiers from the other color. ### Pager color variables fish will sometimes present a list of choices in a table, called the pager. Example: to set the background of each pager row, use: ``` set fish_pager_color_background --background=white ``` To have black text on alternating white and gray backgrounds: ``` set fish_pager_color_prefix black set fish_pager_color_completion black set fish_pager_color_description black set fish_pager_color_background --background=white set fish_pager_color_secondary_background --background=brwhite ``` Variables affecting the pager colors: | Variable | Meaning | | --- | --- | | `fish_pager_color_progress` | the progress bar at the bottom left corner | | `fish_pager_color_background` | the background color of a line | | `fish_pager_color_prefix` | the prefix string, i.e. the string that is to be completed | | `fish_pager_color_completion` | the completion itself, i.e. the proposed rest of the string | | `fish_pager_color_description` | the completion description | | `fish_pager_color_selected_background` | background of the selected completion | | `fish_pager_color_selected_prefix` | prefix of the selected completion | | `fish_pager_color_selected_completion` | suffix of the selected completion | | `fish_pager_color_selected_description` | description of the selected completion | | `fish_pager_color_secondary_background` | background of every second unselected completion | | `fish_pager_color_secondary_prefix` | prefix of every second unselected completion | | `fish_pager_color_secondary_completion` | suffix of every second unselected completion | | `fish_pager_color_secondary_description` | description of every second unselected completion | When the secondary or selected variables aren’t set or are empty, the normal variables are used, except for `$fish_pager_color_selected_background`, where the background of `$fish_color_search_match` is tried first. Abbreviations ------------- To avoid needless typing, a frequently-run command like `git checkout` can be abbreviated to `gco` using the [abbr](cmds/abbr) command. ``` abbr -a gco git checkout ``` After entering `gco` and pressing `Space` or `Enter`, a `gco` in command position will turn into `git checkout` in the command line. If you want to use a literal `gco` sometimes, use `Control`+`Space` [[1]](#id5). This is a lot more powerful, for example you can make going up a number of directories easier with this: ``` function multicd echo cd (string repeat -n (math (string length -- $argv[1]) - 1) ../) end abbr --add dotdot --regex '^\.\.+$' --function multicd ``` Now, `..` transforms to `cd ../`, while `...` turns into `cd ../../` and `....` expands to `cd ../../../`. The advantage over aliases is that you can see the actual command before using it, add to it or change it, and the actual command will be stored in history. Programmable title ------------------ When using most virtual terminals, it is possible to set the message displayed in the titlebar of the terminal window. This can be done automatically in fish by defining the [fish\_title](cmds/fish_title) function. The [fish\_title](cmds/fish_title) function is executed before and after a new command is executed or put into the foreground and the output is used as a titlebar message. 
The [status current-command](cmds/status) builtin will always return the name of the job to be put into the foreground (or `fish` if control is returning to the shell) when the [fish\_prompt](cmds/fish_prompt) function is called. The first argument to fish\_title will contain the most recently executed foreground command as a string. The default fish title shows the hostname if connected via ssh, the currently running command (unless it is fish) and the current working directory. All of this is shortened to not make the tab too wide. Examples: To show the last command and working directory in the title: ``` function fish_title # `prompt_pwd` shortens the title. This helps prevent tabs from becoming very wide. echo $argv[1] (prompt_pwd) pwd end ``` Programmable prompt ------------------- When it is fish’s turn to ask for input (like after it started or the command ended), it will show a prompt. It does this by running the [fish\_prompt](cmds/fish_prompt) and [fish\_right\_prompt](cmds/fish_right_prompt) functions. The output of the former is displayed on the left and the latter’s output on the right side of the terminal. The output of [fish\_mode\_prompt](cmds/fish_mode_prompt) will be prepended on the left, though the default function only does this when in [vi-mode](#vi-mode). Configurable greeting --------------------- When it is started interactively, fish tries to run the [fish\_greeting](cmds/fish_greeting) function. The default fish\_greeting prints a simple greeting. You can change its text by changing the `$fish_greeting` variable. Private mode ------------ If `$fish_private_mode` is set to a non-empty value, commands will not be written to the history file on disk. You can also launch with `fish --private` (or `fish -P` for short). This both hides old history and prevents writing history to disk. This is useful to avoid leaking personal information (e.g. for screencasts) or when dealing with sensitive information. You can query the variable `fish_private_mode` (`if test -n "$fish_private_mode" ...`) if you would like to respect the user’s wish for privacy and alter the behavior of your own fish scripts. Command line editor ------------------- The fish editor features copy and paste, a [searchable history](#history-search) and many editor functions that can be bound to special keyboard shortcuts. Like bash and other shells, fish includes two sets of keyboard shortcuts (or key bindings): one inspired by the Emacs text editor, and one by the Vi text editor. The default editing mode is Emacs. You can switch to Vi mode by running `fish_vi_key_bindings` and switch back with `fish_default_key_bindings`. You can also make your own key bindings by creating a function and setting the `fish_key_bindings` variable to its name. For example: ``` function fish_hybrid_key_bindings --description \ "Vi-style bindings that inherit emacs-style bindings in all modes" for mode in default insert visual fish_default_key_bindings -M $mode end fish_vi_key_bindings --no-erase end set -g fish_key_bindings fish_hybrid_key_bindings ``` While the key bindings included with fish include many of the shortcuts popular from the respective text editors, they are not a complete implementation. They include a shortcut to open the current command line in your preferred editor (`Alt`+`E` by default) if you need the full power of your editor. 
### Shared bindings Some bindings are common across Emacs and Vi mode, because they aren’t text editing bindings, or because what Vi/Vim does for a particular key doesn’t make sense for a shell. * `Tab` [completes](#tab-completion) the current token. `Shift`+`Tab` completes the current token and starts the pager’s search mode. `Tab` is the same as `Control`+`I`. * `←` (Left) and `→` (Right) move the cursor left or right by one character. If the cursor is already at the end of the line, and an autosuggestion is available, `→` accepts the autosuggestion. * `Enter` executes the current commandline or inserts a newline if it’s not complete yet (e.g. a `)` or `end` is missing). * `Alt`+`Enter` inserts a newline at the cursor position. * `Alt`+`←` and `Alt`+`→` move the cursor one word left or right (to the next space or punctuation mark), or moves forward/backward in the directory history if the command line is empty. If the cursor is already at the end of the line, and an autosuggestion is available, `Alt`+`→` (or `Alt`+`F`) accepts the first word in the suggestion. * `Control`+`←` and `Control`+`→` move the cursor one word left or right. These accept one word of the autosuggestion - the part they’d move over. * `Shift`+`←` and `Shift`+`→` move the cursor one word left or right, without stopping on punctuation. These accept one big word of the autosuggestion. * `↑` (Up) and `↓` (Down) (or `Control`+`P` and `Control`+`N` for emacs aficionados) search the command history for the previous/next command containing the string that was specified on the commandline before the search was started. If the commandline was empty when the search started, all commands match. See the [history](#history-search) section for more information on history searching. * `Alt`+`↑` and `Alt`+`↓` search the command history for the previous/next token containing the token under the cursor before the search was started. If the commandline was not on a token when the search started, all tokens match. See the [history](#history-search) section for more information on history searching. * `Control`+`C` interrupt/kill whatever is running (SIGINT). * `Control`+`D` delete one character to the right of the cursor. If the command line is empty, `Control`+`D` will exit fish. * `Control`+`U` removes contents from the beginning of line to the cursor (moving it to the [killring](#killring)). * `Control`+`L` clears and repaints the screen. * `Control`+`W` removes the previous path component (everything up to the previous “/”, “:” or “@”) (moving it to the [Copy and paste (Kill Ring)](#killring)). * `Control`+`X` copies the current buffer to the system’s clipboard, `Control`+`V` inserts the clipboard contents. (see [fish\_clipboard\_copy](cmds/fish_clipboard_copy) and [fish\_clipboard\_paste](cmds/fish_clipboard_paste)) * `Alt`+`D` moves the next word to the [Copy and paste (Kill Ring)](#killring). * `Alt`+`H` (or `F1`) shows the manual page for the current command, if one exists. * `Alt`+`L` lists the contents of the current directory, unless the cursor is over a directory argument, in which case the contents of that directory will be listed. * `Alt`+`O` opens the file at the cursor in a pager. * `Alt`+`P` adds the string `&| less;` to the end of the job under the cursor. The result is that the output of the command will be paged. * `Alt`+`W` prints a short description of the command under the cursor. * `Alt`+`E` edit the current command line in an external editor. 
The editor is chosen from the first available of the `$VISUAL` or `$EDITOR` variables. * `Alt`+`V` Same as `Alt`+`E`. * `Alt`+`S` Prepends `sudo` to the current commandline. If the commandline is empty, prepend `sudo` to the last commandline. * `Control`+`Space` Inserts a space without expanding an [abbreviation](#abbreviations). For vi-mode this only applies to insert-mode. ### Emacs mode commands To enable emacs mode, use `fish_default_key_bindings`. This is also the default. * `Home` or `Control`+`A` moves the cursor to the beginning of the line. * `End` or `Control`+`E` moves to the end of line. If the cursor is already at the end of the line, and an autosuggestion is available, `End` or `Control`+`E` accepts the autosuggestion. * `Control`+`B`, `Control`+`F` move the cursor one character left or right or accept the autosuggestion just like the `←` (Left) and `→` (Right) shared bindings (which are available as well). * `Control`+`N`, `Control`+`P` move the cursor up/down or through history, like the up and down arrow shared bindings. * `Delete` or `Backspace` removes one character forwards or backwards respectively. This also goes for `Control`+`H`, which is indistinguishable from backspace. * `Alt`+`Backspace` removes one word backwards. * `Alt`+`<` moves to the beginning of the commandline, `Alt`+`>` moves to the end. * `Control`+`K` deletes from the cursor to the end of line (moving it to the [Copy and paste (Kill Ring)](#killring)). * `Alt`+`C` capitalizes the current word. * `Alt`+`U` makes the current word uppercase. * `Control`+`T` transposes the last two characters. * `Alt`+`T` transposes the last two words. * `Control`+`Z`, `Control`+`\_` (`Control`+`/` on some terminals) undo the most recent edit of the line. * `Alt`+`/` reverts the most recent undo. * `Control`+`R` opens the history in a pager. This will show history entries matching the search, a few at a time. Pressing `Control`+`R` again will search older entries, pressing `Control`+`S` (that otherwise toggles pager search) will go to newer entries. The search bar will always be selected. You can change these key bindings using the [bind](cmds/bind) builtin. ### Vi mode commands Vi mode allows for the use of Vi-like commands at the prompt. Initially, [insert mode](#vi-mode-insert) is active. `Escape` enters [command mode](#vi-mode-command). The commands available in command, insert and visual mode are described below. Vi mode shares [some bindings](#shared-binds) with [Emacs mode](#emacs-mode). To enable vi mode, use `fish_vi_key_bindings`. It is also possible to add all emacs-mode bindings to vi-mode by using something like: ``` function fish_user_key_bindings # Execute this once per mode that emacs bindings should be used in fish_default_key_bindings -M insert # Then execute the vi-bindings so they take precedence when there's a conflict. # Without --no-erase fish_vi_key_bindings will default to # resetting all bindings. # The argument specifies the initial mode (insert, "default" or visual). fish_vi_key_bindings --no-erase insert end ``` When in vi-mode, the [fish\_mode\_prompt](cmds/fish_mode_prompt) function will display a mode indicator to the left of the prompt. To disable this feature, override it with an empty function. To display the mode elsewhere (like in your right prompt), use the output of the `fish_default_mode_prompt` function. When a binding switches the mode, it will repaint the mode-prompt if it exists, and the rest of the prompt only if it doesn’t. 
So if you want a mode-indicator in your `fish_prompt`, you need to erase `fish_mode_prompt` e.g. by adding an empty file at `~/.config/fish/functions/fish_mode_prompt.fish`. (Bindings that change the mode are supposed to call the `repaint-mode` bind function, see [bind](cmds/bind)) The `fish_vi_cursor` function will be used to change the cursor’s shape depending on the mode in supported terminals. The following snippet can be used to manually configure cursors after enabling vi-mode: ``` # Emulates vim's cursor shape behavior # Set the normal and visual mode cursors to a block set fish_cursor_default block # Set the insert mode cursor to a line set fish_cursor_insert line # Set the replace mode cursor to an underscore set fish_cursor_replace_one underscore # The following variable can be used to configure cursor shape in # visual mode, but due to fish_cursor_default, is redundant here set fish_cursor_visual block ``` Additionally, `blink` can be added after each of the cursor shape parameters to set a blinking cursor in the specified shape. If the cursor shape does not appear to be changing after setting the above variables, it’s likely your terminal emulator does not support the capabilities necessary to do this. It may also be the case, however, that `fish_vi_cursor` has not detected your terminal’s features correctly (for example, if you are using `tmux`). If this is the case, you can force `fish_vi_cursor` to set the cursor shape by setting `$fish_vi_force_cursor` in `config.fish`. You’ll have to restart fish for any changes to take effect. If cursor shape setting remains broken after this, it’s almost certainly an issue with your terminal emulator, and not fish. #### Command mode Command mode is also known as normal mode. * `h` moves the cursor left. * `l` moves the cursor right. * `k` and `j` search the command history for the previous/next command containing the string that was specified on the commandline before the search was started. If the commandline was empty when the search started, all commands match. See the [history](#history-search) section for more information on history searching. In multi-line commands, they move the cursor up and down respectively. * `i` enters [insert mode](#vi-mode-insert) at the current cursor position. * `Shift`+`R` enters [insert mode](#vi-mode-insert) at the beginning of the line. * `v` enters [visual mode](#vi-mode-visual) at the current cursor position. * `a` enters [insert mode](#vi-mode-insert) after the current cursor position. * `Shift`+`A` enters [insert mode](#vi-mode-insert) at the end of the line. * `0` (zero) moves the cursor to beginning of line (remaining in command mode). * `d`+`d` deletes the current line and moves it to the [Copy and paste (Kill Ring)](#killring). * `Shift`+`D` deletes text after the current cursor position and moves it to the [Copy and paste (Kill Ring)](#killring). * `p` pastes text from the [Copy and paste (Kill Ring)](#killring). * `u` undoes the most recent edit of the command line. * `[` and `]` search the command history for the previous/next token containing the token under the cursor before the search was started. See the [history](#history-search) section for more information on history searching. * `/` opens the history in a pager. This will show history entries matching the search, a few at a time. Pressing it again will search older entries, pressing `Control`+`S` (that otherwise toggles pager search) will go to newer entries. The search bar will always be selected. 
* `Backspace` moves the cursor left. #### Insert mode * `Escape` enters [command mode](#vi-mode-command). * `Backspace` removes one character to the left. #### Visual mode * `←` (Left) and `→` (Right) extend the selection backward/forward by one character. * `h` moves the cursor left. * `l` moves the cursor right. * `k` moves the cursor up. * `j` moves the cursor down. * `b` and `w` extend the selection backward/forward by one word. * `d` and `x` move the selection to the [Copy and paste (Kill Ring)](#killring) and enter [command mode](#vi-mode-command). * `Escape` and `Control`+`C` enter [command mode](#vi-mode-command). * `c` and `s` remove the selection and switch to insert mode. * `X` moves the entire line to the [Copy and paste (Kill Ring)](#killring), and enters [command mode](#vi-mode-command). * `y` copies the selection to the [Copy and paste (Kill Ring)](#killring), and enters [command mode](#vi-mode-command). * `~` toggles the case (upper/lower) on the selection, and enters [command mode](#vi-mode-command). * `"\*y` copies the selection to the clipboard, and enters [command mode](#vi-mode-command). ### Custom bindings In addition to the standard bindings listed here, you can also define your own with [bind](cmds/bind): ``` # Just clear the commandline on control-c bind \cc 'commandline -r ""' ``` Put `bind` statements into [config.fish](language#configuration) or a function called `fish_user_key_bindings`. The key sequence (the `\cc`) here depends on your setup, in particular the terminal. To find out what the terminal sends use [fish\_key\_reader](cmds/fish_key_reader): ``` > fish_key_reader # pressing control-c Press a key: Press [ctrl-C] again to exit bind \cC 'do something' > fish_key_reader # pressing the right-arrow Press a key: bind \e\[C 'do something' ``` Note that some key combinations are indistinguishable or unbindable. For instance control-i *is the same* as the tab key. This is a terminal limitation that fish can’t do anything about. Also, `Escape` is the same thing as `Alt` in a terminal. To distinguish between pressing `Escape` and then another key, and pressing `Alt` and that key (or an escape sequence the key sends), fish waits for a certain time after seeing an escape character. This is configurable via the `fish_escape_delay_ms` variable. If you want to be able to press `Escape` and then a character and have it count as `Alt`+that character, set it to a higher value, e.g.: ``` set -g fish_escape_delay_ms 100 ``` ### Copy and paste (Kill Ring) Fish uses an Emacs-style kill ring for copy and paste functionality. For example, use `Control`+`K` (`kill-line`) to cut from the current cursor position to the end of the line. The string that is cut (a.k.a. killed in emacs-ese) is inserted into a list of kills, called the kill ring. To paste the latest value from the kill ring (emacs calls this “yanking”) use `Control`+`Y` (the `yank` input function). After pasting, use `Alt`+`Y` (`yank-pop`) to rotate to the previous kill. Copy and paste from outside are also supported, both via the `Control`+`X` / `Control`+`V` bindings (the `fish_clipboard_copy` and `fish_clipboard_paste` functions [[2]](#id8)) and via the terminal’s paste function, for which fish enables “Bracketed Paste Mode”, so it can tell a paste from manually entered text. In addition, when pasting inside single quotes, pasted single quotes and backslashes are automatically escaped so that the result can be used as a single token simply by closing the quote after. 
Kill ring entries are stored in the `fish_killring` variable. The commands `begin-selection` and `end-selection` (unbound by default; used for selection in vi visual mode) control text selection together with cursor movement commands that extend the current selection. The variable [`fish_cursor_selection_mode`](language#envvar-fish_cursor_selection_mode) can be used to configure if that selection should include the character under the cursor (`inclusive`) or not (`exclusive`). The default is `exclusive`, which works well with any cursor shape. For vi mode, and particularly for the `block` or `underscore` cursor shapes you may prefer `inclusive`. ### Multiline editing The fish commandline editor can be used to work on commands that are several lines long. There are three ways to make a command span more than a single line: * Pressing the `Enter` key while a block of commands is unclosed, such as when one or more block commands such as `for`, `begin` or `if` do not have a corresponding [end](cmds/end) command. * Pressing `Alt`+`Enter` instead of pressing the `Enter` key. * By inserting a backslash (`\`) character before pressing the `Enter` key, escaping the newline. The fish commandline editor works exactly the same in single line mode and in multiline mode. To move between lines use the left and right arrow keys and other such keyboard shortcuts. ### Searchable command history After a command has been executed, it is remembered in the history list. Any duplicate history items are automatically removed. By pressing the up and down keys, you can search forwards and backwards in the history. If the current command line is not empty when starting a history search, only the commands containing the string entered into the command line are shown. By pressing `Alt`+`↑` and `Alt`+`↓`, a history search is also performed, but instead of searching for a complete commandline, each commandline is broken into separate elements just like it would be before execution, and the history is searched for an element matching that under the cursor. For more complicated searches, you can press `Ctrl`+`R` to open a pager that allows you to search the history. It shows a limited number of entries in one page; press `Ctrl`+`R` [[3]](#id11) again to move to the next page and `Ctrl`+`S` [[4]](#id12) to move to the previous page. You can change the text to refine your search. History searches are case-insensitive unless the search string contains an uppercase character. You can stop a search to edit your search string by pressing `Esc` or `Page Down`. Prefixing the commandline with a space will prevent the entire line from being stored in the history. It will still be available for recall until the next command is executed, but will not be stored on disk. This is to allow you to fix misspellings and such. The command history is stored in the file `~/.local/share/fish/fish_history` (or `$XDG_DATA_HOME/fish/fish_history` if that variable is set) by default. However, you can set the `fish_history` environment variable to change the name of the history session (resulting in a `<session>_history` file); both before starting the shell and while the shell is running. See the [history](cmds/history) command for other manipulations. Examples: To search for previous entries containing the word ‘make’, type `make` in the console and press the up key. If the commandline reads `cd m`, place the cursor over the `m` character and press `Alt`+`↑` to search for previously typed words containing ‘m’.
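Returning to the `fish_history` variable mentioned above, here is a minimal sketch of pointing the current shell at its own history session (the session name `work` is arbitrary):

```
# Commands from this shell are now saved to work_history
# instead of the default fish_history file.
set -g fish_history work
```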
Navigating directories ---------------------- Navigating directories is usually done with the [cd](cmds/cd) command, but fish offers some advanced features as well. The current working directory can be displayed with the [pwd](cmds/pwd) command, or the `$PWD` [special variable](language#variables-special). Usually your prompt already does this. ### Directory history Fish automatically keeps a trail of the recently visited directories with [cd](cmds/cd) by storing this history in the `dirprev` and `dirnext` variables. Several commands are provided to interact with this directory history: * [dirh](cmds/dirh) prints the history * [cdh](cmds/cdh) displays a prompt to quickly navigate the history * [prevd](cmds/prevd) moves backward through the history. It is bound to `Alt`+`←` * [nextd](cmds/nextd) moves forward through the history. It is bound to `Alt`+`→` ### Directory stack Another set of commands, usually also available in other shells like bash, deals with the directory stack. Stack handling is not automatic and needs explicit calls to the following commands: * [dirs](cmds/dirs) prints the stack * [pushd](cmds/pushd) adds a directory on top of the stack and makes it the current working directory * [popd](cmds/popd) removes the directory on top of the stack and changes the current working directory
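The following short session sketches how the directory history and the directory stack fit together (the directories are arbitrary):

```
> cd /tmp
> cd /var/log
> prevd          # back to /tmp in the directory history
> nextd          # and forward to /var/log again
> pushd /etc     # put /etc on top of the stack and make it the current directory
> dirs           # print the stack
> popd           # remove /etc from the stack: back in /var/log
```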
fish Tutorial Tutorial ======== Why fish? --------- Fish is a fully-equipped command line shell (like bash or zsh) that is smart and user-friendly. Fish supports powerful features like syntax highlighting, autosuggestions, and tab completions that just work, with nothing to learn or configure. If you want to make your command line more productive, more useful, and more fun, without learning a bunch of arcane syntax and configuration options, then fish might be just what you’re looking for! Getting started --------------- Once installed, just type `fish` into your current shell to try it out! You will be greeted by the standard fish prompt, which means you are all set up and can start using fish: ``` > fish Welcome to fish, the friendly interactive shell Type help for instructions on how to use fish you@hostname ~> ``` This prompt that you see above is the fish default prompt: it shows your username, hostname, and working directory. * To change this prompt, see [how to change your prompt](interactive#prompt). * To switch to fish permanently, see [Default Shell](index#default-shell). From now on, we’ll pretend your prompt is just a `>` to save space. Learning fish ------------- This tutorial assumes a basic understanding of command line shells and Unix commands, and that you have a working copy of fish. If you have a strong understanding of other shells, and want to know what fish does differently, search for the magic phrase *unlike other shells*, which is used to call out important differences. Or, if you want a quick overview of the differences to other shells like Bash, see [Fish For Bash Users](fish_for_bash_users#fish-for-bash-users). For the full, detailed description of how to use fish interactively, see [Interactive Use](interactive#interactive). For a comprehensive description of fish’s scripting language, see [The Fish Language](language#language). Running Commands ---------------- Fish runs commands like other shells: you type a command, followed by its arguments. Spaces are separators: ``` > echo hello world hello world ``` This runs the command `echo` with the arguments `hello` and `world`. In this case that’s the same as one argument `hello world`, but in many cases it’s not. If you need to pass an argument that includes a space, you can [escape](language#escapes) with a backslash, or [quote](language#quotes) it using single or double quotes: ``` > mkdir My\ Files # Makes a directory called "My Files", with a space in the name > cp ~/Some\ File 'My Files' # Copies a file called "Some File" in the home directory to "My Files" > ls "My Files" Some File ``` Getting Help ------------ Run `help` to open fish’s help in a web browser, and `man` with the page (like `fish-language`) to open it in a man page. You can also ask for help with a specific command, for example, `help set` to open it in a web browser, or `man set` to see it in the terminal. ``` > man set set - handle shell variables Synopsis... ``` To open this section, use `help getting-help`. Fish works by running commands, which are often also installed on your computer. Usually these commands also provide help in the man system, so you can get help for them there. Try `man ls` to get help on your computer’s `ls` command. Syntax Highlighting ------------------- You’ll quickly notice that fish performs syntax highlighting as you type. Invalid commands are colored red by default: ``` > /bin/mkd ``` A command may be invalid because it does not exist, or refers to a file that you cannot execute.
When the command becomes valid, it is shown in a different color: ``` > /bin/mkdir ``` Valid file paths are underlined as you type them: ``` > cat ~/somefi ``` This tells you that there exists a file that starts with `somefi`, which is useful feedback as you type. These colors, and many more, can be changed by running `fish_config`, or by modifying [color variables](interactive#variables-color) directly. For example, if you want to disable (almost) all coloring: ``` fish_config theme choose none ``` This picks the “none” theme. To see all themes: ``` fish_config theme show ``` Just running `fish_config` will open up a browser interface that allows you to pick from the available themes. Wildcards --------- Fish supports the familiar wildcard `*`. To list all JPEG files: ``` > ls *.jpg lena.jpg meena.jpg santa maria.jpg ``` You can include multiple wildcards: ``` > ls l*.p* lena.png lesson.pdf ``` The recursive wildcard `**` searches directories recursively: ``` > ls /var/**.log /var/log/system.log /var/run/sntp.log ``` If that directory traversal is taking a long time, you can `Control`+`C` out of it. For more, see [Wildcards](language#expand-wildcard). Pipes and Redirections ---------------------- You can pipe between commands with the usual vertical bar: ``` > echo hello world | wc 1 2 12 ``` stdin and stdout can be redirected via the familiar `<` and `>`. stderr is redirected with a `2>`. ``` > grep fish < /etc/shells > ~/output.txt 2> ~/errors.txt ``` To redirect stdout and stderr into one file, you can use `&>`: ``` > make &> make_output.txt ``` For more, see [Input and output redirections](language#redirects) and [Pipes](language#pipes). Autosuggestions --------------- As you type fish will suggest commands to the right of the cursor, in gray. For example: ``` > /bin/hostname ``` It knows about paths and options: ``` > grep --ignore-case ``` And history too. Type a command once, and you can re-summon it by just typing a few letters: ``` > rsync -avze ssh . [email protected]:/some/long/path/doo/dee/doo/dee/doo ``` To accept the autosuggestion, hit `→` (right arrow) or `Control`+`F`. To accept a single word of the autosuggestion, `Alt`+`→` (right arrow). If the autosuggestion is not what you want, just ignore it. If you don’t like autosuggestions, you can disable them by setting `$fish_autosuggestion_enabled` to 0: ``` set -g fish_autosuggestion_enabled 0 ``` Tab Completions --------------- A rich set of tab completions work “out of the box”. Press `Tab` and fish will attempt to complete the command, argument, or path: ``` > /priTab => /private/ ``` If there’s more than one possibility, it will list them: ``` > ~/stuff/sTab ~/stuff/script.sh (command) ~/stuff/sources/ (directory) ``` Hit tab again to cycle through the possibilities. The part in parentheses there (that “command” and “directory”) is the completion description. It’s just a short hint to explain what kind of argument it is. fish can also complete many commands, like git branches: ``` > git merge prTab => git merge prompt\_designer > git checkout bTab builtin_list_io_merge (Branch) builtin_set_color (Branch) busted_events (Tag) ``` Try hitting tab and see what fish can do! 
Variables --------- Like other shells, a dollar sign followed by a variable name is replaced with the value of that variable: ``` > echo My home directory is $HOME My home directory is /home/tutorial ``` This is known as variable substitution, and it also happens in double quotes, but not single quotes: ``` > echo "My current directory is $PWD" My current directory is /home/tutorial > echo 'My current directory is $PWD' My current directory is $PWD ``` Unlike other shells, fish has an ordinary command to set variables: `set`, which takes a variable name, and then its value. ``` > set name 'Mister Noodle' > echo $name Mister Noodle ``` (Notice the quotes: without them, `Mister` and `Noodle` would have been separate arguments, and `$name` would have been made into a list of two elements.) Unlike other shells, variables are not further split after substitution: ``` > mkdir $name > ls Mister Noodle ``` In bash, this would have created two directories “Mister” and “Noodle”. In fish, it created only one: the variable had the value “Mister Noodle”, so that is the argument that was passed to `mkdir`, spaces and all. You can erase (or “delete”) a variable with `-e` or `--erase` ``` > set -e MyVariable > env | grep MyVariable (no output) ``` For more, see [Variable expansion](language#expand-variable). Exports (Shell Variables) ------------------------- Sometimes you need to have a variable available to an external command, often as a setting. For example many programs like `git` or `man` read the `$PAGER` variable to figure out your preferred pager (the program that lets you scroll text). Other variables used like this include `$BROWSER`, `$LANG` (to configure your language) and `$PATH`. You’ll note these are written in ALLCAPS, but that’s just a convention. To give a variable to an external command, it needs to be “exported”. This is done with a flag to `set`, either `--export` or just `-x`. ``` > set -x MyVariable SomeValue > env | grep MyVariable MyVariable=SomeValue ``` It can also be unexported with `--unexport` or `-u`. This works the other way around as well! If fish is started by something else, it inherits that parents exported variables. So if your terminal emulator starts fish, and it exports `$LANG` set to `en_US.UTF-8`, fish will receive that setting. And whatever started your terminal emulator also gave *it* some variables that it will then pass on unless it specifically decides not to. This is how fish usually receives the values for things like `$LANG`, `$PATH` and `$TERM`, without you having to specify them again. Exported variables can be local or global or universal - “exported” is not a [scope](language#variables-scope)! Usually you’d make them global via `set -gx MyVariable SomeValue`. For more, see [Exporting variables](language#variables-export). Lists ----- The `set` command above used quotes to ensure that `Mister Noodle` was one argument. If it had been two arguments, then `name` would have been a list of length 2. In fact, all variables in fish are really lists, that can contain any number of values, or none at all. Some variables, like `$PWD`, only have one value. By convention, we talk about that variable’s value, but we really mean its first (and only) value. Other variables, like `$PATH`, really do have multiple values. During variable expansion, the variable expands to become multiple arguments: ``` > echo $PATH /usr/bin /bin /usr/sbin /sbin /usr/local/bin ``` Variables whose name ends in “PATH” are automatically split on colons to become lists. 
They are joined using colons when exported to subcommands. This is for compatibility with other tools, which expect $PATH to use colons. You can also explicitly add this quirk to a variable with `set --path`, or remove it with `set --unpath`. Lists cannot contain other lists: there is no recursion. A variable is a list of strings, full stop. Get the length of a list with `count`: ``` > count $PATH 5 ``` You can append (or prepend) to a list by setting the list to itself, with some additional arguments. Here we append /usr/local/bin to $PATH: ``` > set PATH $PATH /usr/local/bin ``` You can access individual elements with square brackets. Indexing starts at 1 from the beginning, and -1 from the end: ``` > echo $PATH /usr/bin /bin /usr/sbin /sbin /usr/local/bin > echo $PATH[1] /usr/bin > echo $PATH[-1] /usr/local/bin ``` You can also access ranges of elements, known as “slices”: ``` > echo $PATH[1..2] /usr/bin /bin > echo $PATH[-1..2] /usr/local/bin /sbin /usr/sbin /bin ``` You can iterate over a list (or a slice) with a for loop: ``` for val in $PATH echo "entry: $val" end # Will print: # entry: /usr/bin/ # entry: /bin # entry: /usr/sbin # entry: /sbin # entry: /usr/local/bin ``` Lists adjacent to other lists or strings are expanded as [cartesian products](language#cartesian-product) unless quoted (see [Variable expansion](language#expand-variable)): ``` > set a 1 2 3 > set 1 a b c > echo $a$1 1a 2a 3a 1b 2b 3b 1c 2c 3c > echo $a" banana" 1 banana 2 banana 3 banana > echo "$a banana" 1 2 3 banana ``` This is similar to [Brace expansion](language#expand-brace). For more, see [Lists](language#variables-lists). Command Substitutions --------------------- Command substitutions use the output of one command as an argument to another. Unlike other shells, fish does not use backticks `` for command substitutions. Instead, it uses parentheses with or without a dollar: ``` > echo In (pwd), running $(uname) In /home/tutorial, running FreeBSD ``` A common idiom is to capture the output of a command in a variable: ``` > set os (uname) > echo $os Linux ``` Command substitutions without a dollar are not expanded within quotes, so the version with a dollar is simpler: ``` > touch "testing_$(date +%s).txt" > ls *.txt testing_1360099791.txt ``` Unlike other shells, fish does not split command substitutions on any whitespace (like spaces or tabs), only newlines. Usually this is a big help because unix commands operate on a line-by-line basis. Sometimes it can be an issue with commands like `pkg-config` that print what is meant to be multiple arguments on a single line. To split it on spaces too, use `string split`. ``` > printf '%s\n' (pkg-config --libs gio-2.0) -lgio-2.0 -lgobject-2.0 -lglib-2.0 > printf '%s\n' (pkg-config --libs gio-2.0 | string split -n " ") -lgio-2.0 -lgobject-2.0 -lglib-2.0 ``` If you need a command substitutions output as one argument, without any splits, use quoted command substitution: ``` > echo "first line second line" > myfile > set myfile "$(cat myfile)" > printf '|%s|' $myfile |first line second line| ``` For more, see [Command substitution](language#expand-command-substitution). Separating Commands (Semicolon) ------------------------------- Like other shells, fish allows multiple commands either on separate lines or the same line. To write them on the same line, use the semicolon (“;”). That means the following two examples are equivalent: ``` echo fish; echo chips # or echo fish echo chips ``` This is useful interactively to enter multiple commands. 
In a script it’s easier to read if the commands are on separate lines. Exit Status ----------- When a command exits, it returns a status code as a non-negative integer (that’s a whole number >= 0). Unlike other shells, fish stores the exit status of the last command in `$status` instead of `$?`. ``` > false > echo $status 1 ``` This indicates how the command fared - 0 usually means success, while the others signify kinds of failure. For instance fish’s `set --query` returns the number of variables it queried that weren’t set - `set --query PATH` usually returns 0, `set --query arglbargl boogagoogoo` usually returns 2. There is also a `$pipestatus` list variable for the exit statuses [[1]](#id3) of processes in a pipe. For more, see [The status variable](language#variables-status). Combiners (And, Or, Not) ------------------------ fish supports the familiar `&&` and `||` to combine commands, and `!` to negate them: ``` > ./configure && make && sudo make install ``` Here, `make` is only executed if `./configure` succeeds (returns 0), and `sudo make install` is only executed if both `./configure` and `make` succeed. fish also supports [and](cmds/and), [or](cmds/or), and [not](cmds/not). The first two are job modifiers and have lower precedence. Example usage: ``` > cp file1 file1_bak && cp file2 file2_bak; and echo "Backup successful"; or echo "Backup failed" Backup failed ``` As mentioned in [the section on the semicolon](#tut-semicolon), this can also be written in multiple lines, like so: ``` cp file1 file1_bak && cp file2 file2_bak and echo "Backup successful" or echo "Backup failed" ``` Conditionals (If, Else, Switch) ------------------------------- Use [if](cmds/if) and [else](cmds/else) to conditionally execute code, based on the exit status of a command. ``` if grep fish /etc/shells echo Found fish else if grep bash /etc/shells echo Found bash else echo Got nothing end ``` To compare strings or numbers or check file properties (whether a file exists or is writeable and such), use [test](cmds/test), like ``` if test "$fish" = "flounder" echo FLOUNDER end # or if test "$number" -gt 5 echo $number is greater than five else echo $number is five or less end # or # This test is true if the path /etc/hosts exists # - it could be a file or directory or symlink (or possibly something else). if test -e /etc/hosts echo We most likely have a hosts file else echo We do not have a hosts file end ``` [Combiners](#tut-combiners) can also be used to make more complex conditions, like ``` if command -sq fish; and grep fish /etc/shells echo fish is installed and configured end ``` For even more complex conditions, use [begin](cmds/begin) and [end](cmds/end) to group parts of them. There is also a [switch](cmds/switch) command: ``` switch (uname) case Linux echo Hi Tux! case Darwin echo Hi Hexley! case FreeBSD NetBSD DragonFly echo Hi Beastie! case '*' echo Hi, stranger! end ``` As you see, [case](cmds/case) does not fall through, and can accept multiple arguments or (quoted) wildcards. For more, see [Conditions](language#syntax-conditional). Functions --------- A fish function is a list of commands, which may optionally take arguments. Unlike other shells, arguments are not passed in “numbered variables” like `$1`, but instead in a single list `$argv`. To create a function, use the [function](cmds/function) builtin: ``` function say_hello echo Hello $argv end say_hello # prints: Hello say_hello everybody! # prints: Hello everybody! 
``` Unlike other shells, fish does not have aliases or special prompt syntax. Functions take their place. [[2]](#id5) You can list the names of all functions with the [functions](cmds/functions) builtin (note the plural!). fish starts out with a number of functions: ``` > functions N_, abbr, alias, bg, cd, cdh, contains_seq, dirh, dirs, disown, down-or-search, edit_command_buffer, export, fg, fish_add_path, fish_breakpoint_prompt, fish_clipboard_copy, fish_clipboard_paste, fish_config, fish_default_key_bindings, fish_default_mode_prompt, fish_git_prompt, fish_hg_prompt, fish_hybrid_key_bindings, fish_indent, fish_is_root_user, fish_job_summary, fish_key_reader, fish_md5, fish_mode_prompt, fish_npm_helper, fish_opt, fish_print_git_action, fish_print_hg_root, fish_prompt, fish_sigtrap_handler, fish_svn_prompt, fish_title, fish_update_completions, fish_vcs_prompt, fish_vi_cursor, fish_vi_key_bindings, funced, funcsave, grep, help, history, hostname, isatty, kill, la, ll, ls, man, nextd, open, popd, prevd, prompt_hostname, prompt_pwd, psub, pushd, realpath, seq, setenv, suspend, trap, type, umask, up-or-search, vared, wait ``` You can see the source for any function by passing its name to `functions`: ``` > functions ls function ls --description 'List contents of directory' command ls -G $argv end ``` For more, see [Functions](language#syntax-function). Loops ----- While loops: ``` while true echo "Loop forever" end # Prints: # Loop forever # Loop forever # Loop forever # yes, this really will loop forever. Unless you abort it with ctrl-c. ``` For loops can be used to iterate over a list. For example, a list of files: ``` for file in *.txt cp $file $file.bak end ``` Iterating over a list of numbers can be done with `seq`: ``` for x in (seq 5) touch file_$x.txt end ``` For more, see [Loops and blocks](language#syntax-loops-and-blocks). Prompt ------ Unlike other shells, there is no prompt variable like `PS1`. To display your prompt, fish executes the [fish\_prompt](cmds/fish_prompt) function and uses its output as the prompt. And if it exists, fish also executes the [fish\_right\_prompt](cmds/fish_right_prompt) function and uses its output as the right prompt. You can define your own prompt from the command line: ``` > function fish_prompt; echo "New Prompt % "; end New Prompt % _ ``` Then, if you are happy with it, you can save it to disk by typing `funcsave fish_prompt`. This saves the prompt in `~/.config/fish/functions/fish_prompt.fish`. (Or, if you want, you can create that file manually from the start.) Multiple lines are OK. Colors can be set via [set\_color](cmds/set_color), passing it named ANSI colors, or hex RGB values: ``` function fish_prompt set_color purple date "+%m/%d/%y" set_color F00 echo (pwd) '>' (set_color normal) end ``` This prompt would look like: ``` 02/06/13 /home/tutorial > _ ``` You can choose among some sample prompts by running `fish_config` for a web UI or `fish_config prompt` for a simpler version inside your terminal. $PATH ----- `$PATH` is an environment variable containing the directories that fish searches for commands. Unlike other shells, $PATH is a [list](#tut-lists), not a colon-delimited string. Fish takes care to set `$PATH` to a default, but typically it is just inherited from fish’s parent process and is set to a value that makes sense for the system - see [Exports](#tut-exports). 
To prepend /usr/local/bin and /usr/sbin to `$PATH`, you can write: ``` > set PATH /usr/local/bin /usr/sbin $PATH ``` To remove /usr/local/bin from `$PATH`, you can write: ``` > set PATH (string match -v /usr/local/bin $PATH) ``` For compatibility with other shells and external commands, $PATH is a [path variable](language#variables-path), and so will be joined with colons (not spaces) when you quote it: ``` > echo "$PATH" /usr/local/sbin:/usr/local/bin:/usr/bin ``` and it will be exported like that, and when fish starts it splits the $PATH it receives into a list on colon. You can do so directly in `config.fish`, like you might do in other shells with `.profile`. See [this example](#path-example). A faster way is to use the [fish\_add\_path](cmds/fish_add_path) function, which adds given directories to the path if they aren’t already included. It does this by modifying the `$fish_user_paths` [universal variable](#tut-universal), which is automatically prepended to `$PATH`. For example, to permanently add `/usr/local/bin` to your `$PATH`, you could write: ``` > fish_add_path /usr/local/bin ``` The advantage is that you don’t have to go mucking around in files: just run this once at the command line, and it will affect the current session and all future instances too. You can also add this line to [config.fish](#tut-config), as it only adds the component if necessary. Or you can modify $fish\_user\_paths yourself, but you should be careful *not* to append to it unconditionally in config.fish, or it will grow longer and longer. Startup (Where’s .bashrc?) -------------------------- Fish starts by executing commands in `~/.config/fish/config.fish`. You can create it if it does not exist. It is possible to directly create functions and variables in `config.fish` file, using the commands shown above. For example: ``` > cat ~/.config/fish/config.fish set -x PATH $PATH /sbin/ function ll ls -lh $argv end ``` However, it is more common and efficient to use autoloading functions and universal variables. If you want to organize your configuration, fish also reads commands in .fish files in `~/.config/fish/conf.d/`. See [Configuration Files](language#configuration) for the details. Autoloading Functions --------------------- When fish encounters a command, it attempts to autoload a function for that command, by looking for a file with the name of that command in `~/.config/fish/functions/`. For example, if you wanted to have a function `ll`, you would add a text file `ll.fish` to `~/.config/fish/functions`: ``` > cat ~/.config/fish/functions/ll.fish function ll ls -lh $argv end ``` This is the preferred way to define your prompt as well: ``` > cat ~/.config/fish/functions/fish_prompt.fish function fish_prompt echo (pwd) "> " end ``` See the documentation for [funced](cmds/funced) and [funcsave](cmds/funcsave) for ways to create these files automatically, and [$fish\_function\_path](language#syntax-function-autoloading) to control their location. Universal Variables ------------------- A universal variable is a variable whose value is shared across all instances of fish, now and in the future – even after a reboot. You can make a variable universal with `set -U`: ``` > set -U EDITOR vim ``` Now in another shell: ``` > echo $EDITOR vim ``` Ready for more? 
--------------- If you want to learn more about fish, there is [lots of detailed documentation](index#intro), the [official Gitter channel](https://gitter.im/fish-shell/fish-shell), an [official mailing list](https://lists.sourceforge.net/lists/listinfo/fish-users), and the [GitHub page](https://github.com/fish-shell/fish-shell/).
fish Writing your own completions Writing your own completions ============================ To specify a completion, use the `complete` command. `complete` takes as a parameter the name of the command to specify a completion for. For example, to add a completion for the program `myprog`, one would start the completion command with `complete -c myprog ...` To provide a list of possible completions for myprog, use the `-a` switch. If `myprog` accepts the arguments start and stop, this can be specified as `complete -c myprog -a 'start stop'`. The argument to the `-a` switch is always a single string. At completion time, it will be tokenized on spaces and tabs, and variable expansion, command substitution and other forms of parameter expansion will take place. `fish` has a special syntax to support specifying switches accepted by a command. The switches `-s`, `-l` and `-o` are used to specify a short switch (single character, such as `-l`), a gnu style long switch (such as `--color`) and an old-style long switch (like `-shuffle`), respectively. If the command ‘myprog’ has an option ‘-o’ which can also be written as `--output`, and which can take an additional value of either ‘yes’ or ‘no’, this can be specified by writing: ``` complete -c myprog -s o -l output -a "yes no" ``` There are also special switches for specifying that a switch requires an argument, to disable filename completion, to create completions that are only available in some combinations, etc.. For a complete description of the various switches accepted by the `complete` command, see the documentation for the [complete](cmds/complete) builtin, or write `complete --help` inside the `fish` shell. As a more comprehensive example, here’s a commented excerpt of the completions for systemd’s `timedatectl`: ``` # All subcommands that timedatectl knows - this is useful for later. set -l commands status set-time set-timezone list-timezones set-local-rtc set-ntp # Disable file completions for the entire command # because it does not take files anywhere # Note that this can be undone by using "-F". # # File completions also need to be disabled # if you want to have more control over what files are offered # (e.g. just directories, or just files ending in ".mp3"). complete -c timedatectl -f # This line offers the subcommands # -"status", # -"set-timezone", # -"set-time" # -"list-timezones" # if no subcommand has been given so far. # # The `-n`/`--condition` option takes script as a string, which it executes. # If it returns true, the completion is offered. # Here the condition is the `__fish_seen_subcommands_from` helper function. # If returns true if any of the given commands is used on the commandline, # as determined by a simple heuristic. # For more complex uses, you can write your own function. # See e.g. the git completions for an example. # complete -c timedatectl -n "not __fish_seen_subcommand_from $commands" \ -a "status set-time set-timezone list-timezones" # If the "set-timezone" subcommand is used, # offer the output of `timedatectl list-timezones` as completions. # Each line of output is used as a separate candidate, # and anything after a tab is taken as the description. # It's often useful to transform command output with `string` into that form. complete -c timedatectl -n "__fish_seen_subcommand_from set-timezone" \ -a "(timedatectl list-timezones)" # Completion candidates can also be described via `-d`, # which is useful if the description is constant. 
# Try to keep these short, because that means the user gets to see more at once. complete -c timedatectl -n "not __fish_seen_subcommand_from $commands" \ -a "set-local-rtc" -d "Maintain RTC in local time" # We can also limit options to certain subcommands by using conditions. complete -c timedatectl -n "__fish_seen_subcommand_from set-local-rtc" \ -l adjust-system-clock -d 'Synchronize system clock from the RTC' # These are simple options that can be used everywhere. complete -c timedatectl -s h -l help -d 'Print a short help text and exit' complete -c timedatectl -l version -d 'Print a short version string and exit' complete -c timedatectl -l no-pager -d 'Do not pipe output into a pager' ``` For examples of how to write your own complex completions, study the completions in `/usr/share/fish/completions`. (The exact path depends on your chosen installation prefix and may be slightly different) Useful functions for writing completions ---------------------------------------- `fish` ships with several functions that are very useful when writing command specific completions. Most of these functions name begins with the string `__fish_`. Such functions are internal to `fish` and their name and interface may change in future fish versions. Still, some of them may be very useful when writing completions. A few of these functions are described here. Be aware that they may be removed or changed in future versions of fish. Functions beginning with the string `__fish_print_` print a newline separated list of strings. For example, `__fish_print_filesystems` prints a list of all known file systems. Functions beginning with `__fish_complete_` print out a newline separated list of completions with descriptions. The description is separated from the completion by a tab character. * `__fish_complete_directories STRING DESCRIPTION` performs path completion on STRING, allowing only directories, and giving them the description DESCRIPTION. * `__fish_complete_path STRING DESCRIPTION` performs path completion on STRING, giving them the description DESCRIPTION. * `__fish_complete_groups` prints a list of all user groups with the groups members as description. * `__fish_complete_pids` prints a list of all processes IDs with the command name as description. * `__fish_complete_suffix SUFFIX` performs file completion but sorts files ending in SUFFIX first. This is useful in conjunction with `complete --keep-order`. * `__fish_complete_users` prints a list of all users with their full name as description. * `__fish_print_filesystems` prints a list of all known file systems. Currently, this is a static list, and not dependent on what file systems the host operating system actually understands. * `__fish_print_hostnames` prints a list of all known hostnames. This function searches the fstab for nfs servers, ssh for known hosts and checks the `/etc/hosts` file. * `__fish_print_interfaces` prints a list of all known network interfaces. * `__fish_print_packages` prints a list of all installed packages. This function currently handles Debian, rpm and Gentoo packages. Where to put completions ------------------------ Completions can be defined on the commandline or in a configuration file, but they can also be automatically loaded. Fish automatically searches through any directories in the list variable `$fish_complete_path`, and any completions defined are automatically loaded when needed. A completion file must have a filename consisting of the name of the command to complete and the suffix `.fish`. 
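For example, user completions for the hypothetical `myprog` command from earlier would go in a file called `myprog.fish`, typically `~/.config/fish/completions/myprog.fish`. This is only a sketch - adapt the switches to the real command:

```
# ~/.config/fish/completions/myprog.fish
# Autoloaded the first time completions for "myprog" are requested.
complete -c myprog -f # myprog takes no file arguments
complete -c myprog -a 'start stop' # its subcommands
complete -c myprog -s o -l output -a 'yes no' # -o/--output takes "yes" or "no"
```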
By default, Fish searches the following for completions, using the first available file that it finds: * A directory for end-users to keep their own completions, usually `~/.config/fish/completions` (controlled by the `XDG_CONFIG_HOME` environment variable); * A directory for systems administrators to install completions for all users on the system, usually `/etc/fish/completions`; * A user-specified directory for third-party vendor completions, usually `~/.local/share/fish/vendor_completions.d` (controlled by the `XDG_DATA_HOME` environment variable); * A directory for third-party software vendors to ship their own completions for their software, usually `/usr/share/fish/vendor_completions.d`; * The completions shipped with fish, usually installed in `/usr/share/fish/completions`; and * Completions automatically generated from the operating system’s manual, usually stored in `~/.local/share/fish/generated_completions`. These paths are controlled by parameters set at build, install, or run time, and may vary from the defaults listed above. This wide search may be confusing. If you are unsure, your completions probably belong in `~/.config/fish/completions`. If you have written new completions for a common Unix command, please consider sharing your work by submitting it via the instructions in [Further help and development](index#more-help) If you are developing another program and would like to ship completions with your program, install them to the “vendor” completions directory. As this path may vary from system to system, the `pkgconfig` framework should be used to discover this path with the output of `pkg-config --variable completionsdir fish`. fish Introduction Introduction ============ This is the documentation for **fish**, the **f**riendly **i**nteractive **sh**ell. A shell is a program that helps you operate your computer by starting other programs. fish offers a command-line interface focused on usability and interactive use. Some of the special features of fish are: * **Extensive UI**: [Syntax highlighting](interactive#color), [Autosuggestions](interactive#autosuggestions), [tab completion](interactive#tab-completion) and selection lists that can be navigated and filtered. * **No configuration needed**: fish is designed to be ready to use immediately, without requiring extensive configuration. * **Easy scripting**: New [functions](language#syntax-function) can be added on the fly. The syntax is easy to learn and use. This page explains how to install and set up fish and where to get more information. Where to go? ------------ If this is your first time using fish, see the [tutorial](tutorial#tutorial). If you are already familiar with other shells like bash and want to see the scripting differences, see [Fish For Bash Users](fish_for_bash_users#fish-for-bash-users). For a comprehensive overview of fish’s scripting language, see [The Fish Language](language#language). For information on using fish interactively, see [Interactive use](interactive#interactive). If you need to install fish first, read on, the rest of this document will tell you how to get, install and configure fish. Installation ------------ This section describes how to install, uninstall, start, and exit **fish**. It also explains how to make fish the default shell. ### Installation Up-to-date instructions for installing the latest version of fish are on the [fish homepage](https://fishshell.com/). 
To install the development version of fish, see the instructions on the [project’s GitHub page](https://github.com/fish-shell/fish-shell). ### Starting and Exiting Once fish has been installed, open a terminal. If fish is not the default shell: * Type **fish** to start a shell: ``` > fish ``` * Type **exit** to end the session: ``` > exit ``` ### Default Shell There are multiple ways to switch to fish (or any other shell) as your default. The simplest method is to set your terminal emulator (eg GNOME Terminal, Apple’s Terminal.app, or Konsole) to start fish directly. See its configuration and set the program to start to `/usr/local/bin/fish` (if that’s where fish is installed - substitute another location as appropriate). Alternatively, you can set fish as your login shell so that it will be started by all terminal logins, including SSH. Warning Setting fish as your login shell may cause issues, such as an incorrect [`PATH`](language#envvar-PATH). Some operating systems, including a number of Linux distributions, require the login shell to be Bourne-compatible and to read configuration from `/etc/profile`. fish may not be suitable as a login shell on these systems. To change your login shell to fish: 1. Add the shell to `/etc/shells` with: ``` > echo /usr/local/bin/fish | sudo tee -a /etc/shells ``` 2. Change your default shell with: ``` > chsh -s /usr/local/bin/fish ``` Again, substitute the path to fish for `/usr/local/bin/fish` - see `command -s fish` inside fish. To change it back to another shell, just substitute `/usr/local/bin/fish` with `/bin/bash`, `/bin/tcsh` or `/bin/zsh` as appropriate in the steps above. ### Uninstalling For uninstalling fish: see [FAQ: Uninstalling fish](faq#faq-uninstalling). ### Shebang Line Because shell scripts are written in many different languages, they need to carry information about which interpreter should be used to execute them. For this, they are expected to have a first line, the shebang line, which names the interpreter executable. A script written in **bash** would need a first line like this: ``` #!/bin/bash ``` When the shell tells the kernel to execute the file, it will use the interpreter `/bin/bash`. For a script written in another language, just replace `/bin/bash` with the interpreter for that language (for example: `/usr/bin/python` for a python script, or `/usr/local/bin/fish` for a fish script). This line is only needed when scripts are executed without specifying the interpreter. For functions inside fish or when executing a script with `fish /path/to/script`, a shebang is not required (but it doesn’t hurt!). Configuration ------------- To store configuration write it to a file called `~/.config/fish/config.fish`. `.fish` scripts in `~/.config/fish/conf.d/` are also automatically executed before `config.fish`. These files are read on the startup of every shell, whether interactive and/or if they’re login shells. Use `status --is-interactive` and `status --is-login` to do things only in interactive/login shells, respectively. This is the short version; for a full explanation, like for sysadmins or integration for developers of other software, see [Configuration files](language#configuration). If you want to see what you changed over fish’s defaults, see [fish\_delta](cmds/fish_delta). 
### Examples: To add `~/linux/bin` to PATH variable when using a login shell, add this to `~/.config/fish/config.fish` file: ``` if status --is-login set -gx PATH $PATH ~/linux/bin end ``` This is just an example; using [fish\_add\_path](cmds/fish_add_path) e.g. `fish_add_path ~/linux/bin` which only adds the path if it isn’t included yet is easier. To run commands on exit, use an [event handler](language#event) that is triggered by the exit of the shell: ``` function on_exit --on-event fish_exit echo fish is now exiting end ``` Resources --------- * The [GitHub page](https://github.com/fish-shell/fish-shell/) * The official [Gitter channel](https://gitter.im/fish-shell/fish-shell) * The official mailing list at [[email protected]](https://lists.sourceforge.net/lists/listinfo/fish-users) If you have an improvement for fish, you can submit it via the GitHub page. Other help pages ---------------- * [Introduction](#) * [Frequently asked questions](faq) * [Interactive use](interactive) * [The fish language](language) * [Commands](commands) * [Fish for bash users](fish_for_bash_users) * [Tutorial](tutorial) * [Writing your own completions](completions) * [Design](https://fishshell.com/docs/3.6/design.html) * [Release notes](https://fishshell.com/docs/3.6/relnotes.html) * [License](https://fishshell.com/docs/3.6/license.html) fish Commands Commands ======== This is a list of all the commands fish ships with. Broadly speaking, these fall into a few categories: Keywords -------- Core language keywords that make up the syntax, like * [if](cmds/if) for conditions. * [for](cmds/for) and [while](cmds/while) for loops. * [break](cmds/break) and [continue](cmds/continue) to control loops. * [function](cmds/function) to define functions. * [return](cmds/return) to return a status from a function. * [begin](cmds/begin) to begin a block and [end](cmds/end) to end any block (including ifs and loops). * [and](cmds/and), [or](cmds/or) and [not](cmds/not) to combine commands logically. * [switch](cmds/switch) and [case](cmds/case) to make multiple blocks depending on the value of a variable. * [command](cmds/command) or [builtin](cmds/builtin) to tell fish what sort of thing to execute * [time](cmds/time) to time execution * [exec](cmds/exec) tells fish to replace itself with a command. Tools ----- Builtins to do a task, like * [cd](cmds/cd) to change the current directory. * [echo](cmds/echo) or [printf](cmds/printf) to produce output. * [set\_color](cmds/set_color) to colorize output. * [set](cmds/set) to set, query or erase variables. * [read](cmds/read) to read input. * [string](cmds/string) for string manipulation. * [math](cmds/math) does arithmetic. * [argparse](cmds/argparse) to make arguments easier to handle. * [count](cmds/count) to count arguments. * [type](cmds/type) to find out what sort of thing (command, builtin or function) fish would call, or if it exists at all. * [test](cmds/test) checks conditions like if a file exists or a string is empty. * [contains](cmds/contains) to see if a list contains an entry. * [eval](cmds/eval) and [source](cmds/source) to run fish code from a string or file. * [status](cmds/status) to get shell information, like whether it’s interactive or a login shell, or which file it is currently running. * [abbr](cmds/abbr) manages [Abbreviations](interactive#abbreviations). * [bind](cmds/bind) to change bindings. * [complete](cmds/complete) manages [completions](interactive#tab-completion). * [commandline](cmds/commandline) to get or change the commandline contents. 
* [fish\_config](cmds/fish_config) to easily change fish’s configuration, like the prompt or colorscheme. * [random](cmds/random) to generate random numbers or pick from a list. Known functions --------------- Known functions are a customization point. You can change them to change how your fish behaves. This includes: * [fish\_prompt](cmds/fish_prompt) and [fish\_right\_prompt](cmds/fish_right_prompt) and [fish\_mode\_prompt](cmds/fish_mode_prompt) to print your prompt. * [fish\_command\_not\_found](cmds/fish_command_not_found) to tell fish what to do when a command is not found. * [fish\_title](cmds/fish_title) to change the terminal’s title. * [fish\_greeting](cmds/fish_greeting) to show a greeting when fish starts. Helper functions ---------------- Some helper functions, often to give you information for use in your prompt: * [fish\_git\_prompt](cmds/fish_git_prompt) and [fish\_hg\_prompt](cmds/fish_hg_prompt) to print information about the current git or mercurial repository. * [fish\_vcs\_prompt](cmds/fish_vcs_prompt) to print information for either. * [fish\_svn\_prompt](cmds/fish_svn_prompt) to print information about the current svn repository. * [fish\_status\_to\_signal](cmds/fish_status_to_signal) to give a signal name from a return status. * [prompt\_pwd](cmds/prompt_pwd) to give the current directory in a nicely formatted and shortened way. * [prompt\_login](cmds/prompt_login) to describe the current login, with user and hostname, and to explain if you are in a chroot or connected via ssh. * [prompt\_hostname](cmds/prompt_hostname) to give the hostname, shortened for use in the prompt. * [fish\_is\_root\_user](cmds/fish_is_root_user) to check if the current user is an administrator user like root. * [fish\_add\_path](cmds/fish_add_path) to easily add a path to $PATH. * [alias](cmds/alias) to quickly define wrapper functions (“aliases”). * [fish\_delta](cmds/fish_delta) to show what you have changed from the default configuration. Helper commands --------------- fish also ships some things as external commands so they can be easily called from elsewhere. This includes [fish\_indent](cmds/fish_indent) to format fish code and [fish\_key\_reader](cmds/fish_key_reader) to show you what escape sequence a keypress produces. 
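For example, `fish_indent` reads fish code on stdin and writes a formatted version to stdout. A rough sketch of a session (the exact spacing may differ between fish versions):

```
> echo 'if true;echo hello;end' | fish_indent
if true
    echo hello
end
```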
The full list ------------- And here is the full list: * [\_ - call fish’s translations](cmds/_) * [abbr - manage fish abbreviations](cmds/abbr) * [alias - create a function](cmds/alias) * [and - conditionally execute a command](cmds/and) * [argparse - parse options passed to a fish script or function](cmds/argparse) * [begin - start a new block of code](cmds/begin) * [bg - send jobs to background](cmds/bg) * [bind - handle fish key bindings](cmds/bind) * [block - temporarily block delivery of events](cmds/block) * [break - stop the current inner loop](cmds/break) * [breakpoint - launch debug mode](cmds/breakpoint) * [builtin - run a builtin command](cmds/builtin) * [case - conditionally execute a block of commands](cmds/case) * [cd - change directory](cmds/cd) * [cdh - change to a recently visited directory](cmds/cdh) * [command - run a program](cmds/command) * [commandline - set or get the current command line buffer](cmds/commandline) * [complete - edit command specific tab-completions](cmds/complete) * [contains - test if a word is present in a list](cmds/contains) * [continue - skip the remainder of the current iteration of the current inner loop](cmds/continue) * [count - count the number of elements of a list](cmds/count) * [dirh - print directory history](cmds/dirh) * [dirs - print directory stack](cmds/dirs) * [disown - remove a process from the list of jobs](cmds/disown) * [echo - display a line of text](cmds/echo) * [else - execute command if a condition is not met](cmds/else) * [emit - emit a generic event](cmds/emit) * [end - end a block of commands](cmds/end) * [eval - evaluate the specified commands](cmds/eval) * [exec - execute command in current process](cmds/exec) * [exit - exit the shell](cmds/exit) * [false - return an unsuccessful result](cmds/false) * [fg - bring job to foreground](cmds/fg) * [fish - the friendly interactive shell](cmds/fish) * [fish\_add\_path - add to the path](cmds/fish_add_path) * [fish\_breakpoint\_prompt - define the prompt when stopped at a breakpoint](cmds/fish_breakpoint_prompt) * [fish\_clipboard\_copy - copy text to the system’s clipboard](cmds/fish_clipboard_copy) * [fish\_clipboard\_paste - get text from the system’s clipboard](cmds/fish_clipboard_paste) * [fish\_command\_not\_found - what to do when a command wasn’t found](cmds/fish_command_not_found) * [fish\_config - start the web-based configuration interface](cmds/fish_config) * [fish\_delta - compare functions and completions to the default](cmds/fish_delta) * [fish\_git\_prompt - output git information for use in a prompt](cmds/fish_git_prompt) * [fish\_greeting - display a welcome message in interactive shells](cmds/fish_greeting) * [fish\_hg\_prompt - output Mercurial information for use in a prompt](cmds/fish_hg_prompt) * [fish\_indent - indenter and prettifier](cmds/fish_indent) * [fish\_is\_root\_user - check if the current user is root](cmds/fish_is_root_user) * [fish\_key\_reader - explore what characters keyboard keys send](cmds/fish_key_reader) * [fish\_mode\_prompt - define the appearance of the mode indicator](cmds/fish_mode_prompt) * [fish\_opt - create an option specification for the argparse command](cmds/fish_opt) * [fish\_prompt - define the appearance of the command line prompt](cmds/fish_prompt) * [fish\_right\_prompt - define the appearance of the right-side command line prompt](cmds/fish_right_prompt) * [fish\_status\_to\_signal - convert exit codes to human-friendly signals](cmds/fish_status_to_signal) * [fish\_svn\_prompt - output Subversion information for 
use in a prompt](cmds/fish_svn_prompt) * [fish\_title - define the terminal’s title](cmds/fish_title) * [fish\_update\_completions - update completions using manual pages](cmds/fish_update_completions) * [fish\_vcs\_prompt - output version control system information for use in a prompt](cmds/fish_vcs_prompt) * [for - perform a set of commands multiple times](cmds/for) * [funced - edit a function interactively](cmds/funced) * [funcsave - save the definition of a function to the user’s autoload directory](cmds/funcsave) * [function - create a function](cmds/function) * [functions - print or erase functions](cmds/functions) * [help - display fish documentation](cmds/help) * [history - show and manipulate command history](cmds/history) * [if - conditionally execute a command](cmds/if) * [isatty - test if a file descriptor is a terminal](cmds/isatty) * [jobs - print currently running jobs](cmds/jobs) * [math - perform mathematics calculations](cmds/math) * [nextd - move forward through directory history](cmds/nextd) * [not - negate the exit status of a job](cmds/not) * [open - open file in its default application](cmds/open) * [or - conditionally execute a command](cmds/or) * [path - manipulate and check paths](cmds/path) * [popd - move through directory stack](cmds/popd) * [prevd - move backward through directory history](cmds/prevd) * [printf - display text according to a format string](cmds/printf) * [prompt\_hostname - print the hostname, shortened for use in the prompt](cmds/prompt_hostname) * [prompt\_login - describe the login suitable for prompt](cmds/prompt_login) * [prompt\_pwd - print pwd suitable for prompt](cmds/prompt_pwd) * [psub - perform process substitution](cmds/psub) * [pushd - push directory to directory stack](cmds/pushd) * [pwd - output the current working directory](cmds/pwd) * [random - generate random number](cmds/random) * [read - read line of input into variables](cmds/read) * [realpath - convert a path to an absolute path without symlinks](cmds/realpath) * [return - stop the current inner function](cmds/return) * [set - display and change shell variables](cmds/set) * [set\_color - set the terminal color](cmds/set_color) * [source - evaluate contents of file](cmds/source) * [status - query fish runtime information](cmds/status) * [string - manipulate strings](cmds/string) * [string-collect - join strings into one](cmds/string-collect) * [string-escape - escape special characters](cmds/string-escape) * [string-join - join strings with delimiter](cmds/string-join) * [string-join0 - join strings with zero bytes](cmds/string-join0) * [string-length - print string lengths](cmds/string-length) * [string-lower - convert strings to lowercase](cmds/string-lower) * [string-match - match substrings](cmds/string-match) * [string-pad - pad strings to a fixed width](cmds/string-pad) * [string-repeat - multiply a string](cmds/string-repeat) * [string-replace - replace substrings](cmds/string-replace) * [string-shorten - shorten strings to a width, with an ellipsis](cmds/string-shorten) * [string-split - split strings by delimiter](cmds/string-split) * [string-split0 - split on zero bytes](cmds/string-split0) * [string-sub - extract substrings](cmds/string-sub) * [string-trim - remove trailing whitespace](cmds/string-trim) * [string-unescape - expand escape sequences](cmds/string-unescape) * [string-upper - convert strings to uppercase](cmds/string-upper) * [suspend - suspend the current shell](cmds/suspend) * [switch - conditionally execute a block of commands](cmds/switch) * [test 
- perform tests on files and text](cmds/test) * [time - measure how long a command or block takes](cmds/time) * [trap - perform an action when the shell receives a signal](cmds/trap) * [true - return a successful result](cmds/true) * [type - locate a command and describe its type](cmds/type) * [ulimit - set or get resource usage limits](cmds/ulimit) * [umask - set or get the file creation mode mask](cmds/umask) * [vared - interactively edit the value of an environment variable](cmds/vared) * [wait - wait for jobs to complete](cmds/wait) * [while - perform a set of commands multiple times](cmds/while)
fish Fish for bash users Fish for bash users =================== This is to give you a quick overview if you come from bash (or to a lesser extent other shells like zsh or ksh) and want to know how fish differs. Fish is intentionally not POSIX-compatible and as such some of the things you are used to work differently. Many things are similar - they both fundamentally expand commandlines to execute commands, have pipes, redirections, variables, globs, use command output in various ways. This document is there to quickly show you the differences. Command substitutions --------------------- Fish spells command substitutions as `$(command)` or `(command)`, but not ``command``. In addition, it only splits them on newlines instead of $IFS. If you want to split on something else, use [string split](cmds/string-split), [string split0](cmds/string-split) or [string collect](cmds/string-collect). If those are used as the last command in a command substitution the splits they create are carried over. So: ``` for i in (find . -print0 | string split0) ``` will correctly handle all possible filenames. Variables --------- Fish sets and erases variables with [set](cmds/set) instead of `VAR=VAL` and a variety of separate builtins like `declare` and `unset` and `export`. `set` takes options to determine the scope and exportedness of a variable: ``` # Define $PAGER *g*lobal and e*x*ported, # so this is like ``export PAGER=less`` set -gx PAGER less # Define $alocalvariable only locally, # like ``local alocalvariable=foo`` set -l alocalvariable foo ``` or to erase variables: ``` set -e PAGER ``` `VAR=VAL` statements are available as environment overrides: ``` PAGER=cat git log ``` Fish does not perform word splitting. Once a variable has been set to a value, that value stays as it is, so double-quoting variable expansions isn’t the necessity it is in bash. [[1]](#id3) For instance, here’s bash ``` > foo="bar baz" > printf '"%s"\n' $foo # will print two lines, because we didn't double-quote # this is word splitting "bar" "baz" ``` And here is fish: ``` > set foo "bar baz" > printf '"%s"\n' $foo # foo was set as one element, so it will be passed as one element, so this is one line "bar baz" ``` All variables are “arrays” (we use the term “lists”), and expanding a variable expands to all its elements, with each element as its own argument (like bash’s `"${var[@]}"`: ``` > set var "foo bar" banana > printf %s\n $var foo bar banana ``` Specific elements of a list can be selected: ``` echo $list[5..7] ``` The arguments to `set` are ordinary, so you can also set a variable to the output of a command: ``` # Set lines to all the lines in file, one element per line set lines (cat file) ``` or a mixture of literal values and output: ``` > set numbers 1 2 3 (seq 5 8) 9 > printf '%s\n' $numbers 1 2 3 5 6 7 8 9 ``` A `=` is unnecessary and unhelpful with `set` - `set foo = bar` will set the variable “foo” to two values: “=” and “bar”. `set foo=bar` will print an error. See [Shell variables](language#variables) for more. Wildcards (globs) ----------------- Fish only supports the `*` and `**` glob (and the deprecated `?` glob) as syntax. If a glob doesn’t match it fails the command (like with bash’s `failglob`) unless the command is `for`, `set` or `count` or the glob is used with an environment override (`VAR=* command`), in which case it expands to nothing (like with bash’s `nullglob` option). Globbing doesn’t happen on expanded variables, so: ``` set foo "*" echo $foo ``` will not match any files. 
There are no options to control globbing so it always behaves like that. See [Wildcards](language#expand-wildcard) for more. Quoting ------- Fish has two quoting styles: `""` and `''`. Variables are expanded in double-quotes, nothing is expanded in single-quotes. There is no `$''`, instead the sequences that would transform are transformed *when unquoted*: ``` > echo a\nb a b ``` See [Quotes](language#quotes) for more. String manipulation ------------------- Fish does not have `${foo%bar}`, `${foo#bar}` and `${foo/bar/baz}`. Instead string manipulation is done by the [string](cmds/string) builtin. For example, to replace “bar” with “baz”: ``` > string replace bar baz "bar luhrmann" baz luhrmann ``` It can also split strings: ``` > string split "," "foo,bar" foo bar ``` Match regular expressions as a replacement for `grep`: ``` > echo bababa | string match -r 'aba$' aba ``` Pad strings to a given width, with arbitrary characters: ``` > string pad -c x -w 20 "foo" xxxxxxxxxxxxxxxxxfoo ``` Make strings lower/uppercase: ``` > string lower Foo foo > string upper Foo FOO ``` repeat strings, trim strings, escape strings or print a string’s length or width (in terminal cells). Special variables ----------------- Some bash variables and their closest fish equivalent: * `$*`, `$@`, `$1` and so on: `$argv` * `$?`: `$status` * `$$`: `$fish_pid` * `$#`: No variable, instead use `count $argv` * `$!`: `$last_pid` * `$0`: `status filename` * `$-`: Mostly `status is-interactive` and `status is-login` Process substitution -------------------- Instead of `<(command)` fish uses `(command | psub)`. There is no equivalent to `>(command)`. Note that both of these are bashisms, and most things can easily be expressed without. E.g. instead of: ``` source (command | psub) ``` just use: ``` command | source ``` as fish’s [source](cmds/source) can read from stdin. Heredocs -------- Fish does not have `<<EOF` “heredocs”. Instead of ``` cat <<EOF some string some more string EOF ``` use: ``` printf %s\n "some string" "some more string" ``` or: ``` echo "some string some more string" # or if you want the quotes on separate lines: echo "\ some string some more string\ " ``` Quotes are followed across newlines. What “heredocs” do is: 1. Read/interpret the string, with special rules, up to the terminator. [[2]](#id5) 2. Write the resulting string to a temporary file. 3. Start the command the heredoc is attached to with that file as stdin. This means it is essentially the same as just reading from a pipe, so: ``` echo "foo" | cat ``` is mostly the same as ``` cat <<EOF foo EOF ``` Just like with heredocs, the command has to be prepared to read from stdin. Sometimes this requires special options to be used, often giving a filename of `-` turns it on. For example: ``` echo "xterm rxvt-unicode" | pacman --remove - # is the same as (the `-` makes pacman read arguments from stdin) pacman --remove xterm rxvt-unicode ``` and could be written in other shells as ``` # This "-" is still necessary - the heredoc is *also* passed over stdin! pacman --remove - << EOF xterm rxvt-unicode EOF ``` So heredocs really are just minor syntactical sugar that introduces a lot of special rules, which is why fish doesn’t have them. Pipes are a core concept, and are simpler and compose nicer. Test (`test`, `[`, `[[`) ------------------------ Fish has a POSIX-compatible `test` or `[` builtin. There is no `[[` and `test` does not accept `==` as a synonym for `=`. It can compare floating point numbers, however. 
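For example (a small sketch; the values are arbitrary):

```
> test 3.14 -gt 3; and echo "floating point comparisons work"
floating point comparisons work
> test "foo" = "foo"; and echo "use = (not ==) for string equality"
use = (not ==) for string equality
```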
`set -q` can be used to determine if a variable exists or has a certain number of elements (`set -q foo[2]`). Arithmetic Expansion -------------------- Fish does not have `$((i+1))` arithmetic expansion, computation is handled by [math](cmds/math): ``` math $i + 1 ``` Unlike bash’s arithmetic, it can handle floating point numbers: ``` > math 5 / 2 2.5 ``` And also has some functions, like for trigonometry: ``` > math cos 2 x pi 1 ``` You can pass arguments to `math` separately like above or in quotes. Because fish uses `()` parentheses for [command substitutions](#bash-command-substitutions), quoting is needed if you want to use them in your expression: ``` > math '(5 + 2) * 4' ``` Both `*` and `x` are valid ways to spell multiplication, but `*` needs to be quoted because it looks like a [glob](#bash-globs). Prompts ------- Fish does not use the `$PS1`, `$PS2` and so on variables. Instead the prompt is the output of the [fish\_prompt](cmds/fish_prompt) function, plus the [fish\_mode\_prompt](cmds/fish_mode_prompt) function if vi-mode is enabled and the [fish\_right\_prompt](cmds/fish_right_prompt) function for the right prompt. As an example, here’s a relatively simple bash prompt: ``` # <$HOSTNAME> <$PWD in blue> <Prompt Sign in Yellow> <Rest in default light white> PS1='\h\[\e[1;34m\]\w\[\e[m\] \[\e[1;32m\]\$\[\e[m\] ' ``` and a rough fish equivalent: ``` function fish_prompt set -l prompt_symbol '$' fish_is_root_user; and set prompt_symbol '#' echo -s (prompt_hostname) \ (set_color blue) (prompt_pwd) \ (set_color yellow) $prompt_symbol (set_color normal) end ``` This shows a few differences: * Fish provides [set\_color](cmds/set_color) to color text. It can use the 16 named colors and also RGB sequences (so you could also use `set_color 5555FF`) * Instead of introducing specific escapes like `\h` for the hostname, the prompt is simply a function. To achieve the effect of `\h`, fish provides helper functions like [prompt\_hostname](cmds/prompt_hostname), which prints a shortened version of the hostname. * Fish offers other helper functions for adding things to the prompt, like [fish\_vcs\_prompt](cmds/fish_vcs_prompt) for adding a display for common version control systems (git, mercurial, svn), and [prompt\_pwd](cmds/prompt_pwd) for showing a shortened `$PWD` (the user’s home directory becomes `~` and any path component is shortened). The default prompt is reasonably full-featured and its code can be read via `type fish_prompt`. Fish does not have `$PS2` for continuation lines, instead it leaves the lines indented to show that the commandline isn’t complete yet. Blocks and loops ---------------- Fish’s blocking constructs look a little different. They all start with a word, end in `end` and don’t have a second starting word: ``` for i in 1 2 3; do echo $i done # becomes for i in 1 2 3 echo $i end while true; do echo Weeee done # becomes while true echo Weeeeeee end { echo Hello } # becomes begin echo Hello end if true; then echo Yes I am true else echo "How is true not true?" fi # becomes if true echo Yes I am true else echo "How is true not true?" end foo() { echo foo } # becomes function foo echo foo end # (bash allows the word "function", # but this is an extension) ``` Fish does not have an `until`. Use `while not` or `while !`. Subshells --------- Bash has a feature called “subshells”, where it will start another shell process for certain things. That shell will then be independent and e.g. any changes it makes to variables won’t be visible in the main shell. 
This includes things like: ``` # A list of commands in `()` parentheses (foo; bar) | baz # Both sides of a pipe foo | while read -r bar; do # This will not be visible outside of the loop. VAR=VAL # This background process will not be, either baz & done ``` `()` subshells are often confused with `{}` grouping, which does *not* use a subshell. When you just need to group, you can use `begin; end` in fish: ``` (foo; bar) | baz # when it should really have been: { foo; bar; } | baz # becomes begin; foo; bar; end | baz ``` The pipe will simply be run in the same process, so `while read` loops can set variables outside: ``` foo | while read bar set -g VAR VAL baz & end echo $VAR # will print VAL jobs # will show "baz" ``` Subshells are also frequently confused with [command substitutions](#bash-command-substitutions), which bash writes as ``command`` or `$(command)` and fish writes as `$(command)` or `(command)`. Bash also *uses* subshells to implement them. The isolation can usually be achieved by just scoping variables (with `set -l`), but if you really do need to run your code in a new shell environment you can always use `fish -c 'your code here'` to do so explicitly. Builtins and other commands --------------------------- By now it has become apparent that fish puts much more of a focus on its builtins and external commands rather than its syntax. So here are some helpful builtins and their rough equivalent in bash: * [string](cmds/string) - this replaces most of the string transformation (`${i%foo}` et al) and can also be used instead of `grep` and `sed` and such. * [math](cmds/math) - this replaces `$((i + 1))` arithmetic and can also do floats and some simple functions (sine and friends). * [argparse](cmds/argparse) - this can handle a script’s option parsing, for which bash would probably use `getopt` (zsh provides `zparseopts`). * [count](cmds/count) can be used to count things and therefore replaces `$#` and can be used instead of `wc`. * [status](cmds/status) provides information about the shell status, e.g. if it’s interactive or what the current linenumber is. This replaces `$-` and `$BASH_LINENO` and other variables. * `seq(1)` can be used as a replacement for `{1..10}` range expansion. If your OS doesn’t ship a `seq` fish includes a replacement function. Other facilities ---------------- Bash has `set -x` or `set -o xtrace` to print all commands that are being executed. In fish, this would be enabled by setting [`fish_trace`](language#envvar-fish_trace). Or, if your intention is to *profile* how long each line of a script takes, you can use `fish --profile` - see the [page for the fish command](cmds/fish). fish The fish language The fish language ================= This document is a comprehensive overview of fish’s scripting language. For interactive features see [Interactive use](interactive#interactive). Syntax overview --------------- Shells like fish are used by giving them commands. A command is executed by writing the name of the command followed by any arguments. For example: ``` echo hello world ``` [echo](cmds/echo) command writes its arguments to the screen. In this example the output is `hello world`. Everything in fish is done with commands. There are commands for repeating other commands, commands for assigning variables, commands for treating a group of commands as a single command, etc. All of these commands follow the same basic syntax. Every program on your computer can be used as a command in fish. 
If the program file is located in one of the [`PATH`](#envvar-PATH) directories, you can just type the name of the program to use it. Otherwise the whole filename, including the directory (like `/home/me/code/checkers/checkers` or `../checkers`) is required. Here is a list of some useful commands: * [cd](cmds/cd): Change the current directory * `ls`: List files and directories * `man`: Display a manual page - try `man ls` to get help on your “ls” command, or `man mv` to get information about “mv”. * `mv`: Move (rename) files * `cp`: Copy files * [open](cmds/open): Open files with the default application associated with each filetype * `less`: Display the contents of files Commands and arguments are separated by the space character `' '`. Every command ends with either a newline (by pressing the return key) or a semicolon `;`. Multiple commands can be written on the same line by separating them with semicolons. A switch is a very common special type of argument. Switches almost always start with one or more hyphens `-` and alter the way a command operates. For example, the `ls` command usually lists the names of all files and directories in the current working directory. By using the `-l` switch, the behavior of `ls` is changed to not only display the filename, but also the size, permissions, owner, and modification time of each file. Switches differ between commands and are usually documented on a command’s manual page. There are some switches, however, that are common to most commands. For example, `--help` will usually display a help text, `--version` will usually display the command version, and `-i` will often turn on interactive prompting before taking action. Try `man your-command-here` to get information on your command’s switches. So the basic idea of fish is the same as with other unix shells: It gets a commandline, runs [expansions](#expand), and the result is then run as a command. Terminology ----------- Here we define some of the terms used on this page and throughout the rest of the fish documentation: * **Argument**: A parameter given to a command. In `echo foo`, the “foo” is an argument. * **Builtin**: A command that is implemented by the shell. Builtins are so closely tied to the operation of the shell that it is impossible to implement them as external commands. In `echo foo`, the “echo” is a builtin. * **Command**: A program that the shell can run, or more specifically an external program that the shell runs in another process. External commands are provided on your system, as executable files. In `echo foo` the “echo” is a builtin command, in `command echo foo` the “echo” is an external command, provided by a file like /bin/echo. * **Function**: A block of commands that can be called as if they were a single command. By using functions, it is possible to string together multiple simple commands into one more advanced command. * **Job**: A running pipeline or command. * **Pipeline**: A set of commands strung together so that the output of one command is the input of the next command. `echo foo | grep foo` is a pipeline. * **Redirection**: An operation that changes one of the input or output streams associated with a job. * **Switch** or **Option**: A special kind of argument that alters the behavior of a command. A switch almost always begins with one or two hyphens. In `echo -n foo` the “-n” is an option. Quotes ------ Sometimes features like [parameter expansion](#expand) and [character escapes](#escapes) get in the way. 
When that happens, you can use quotes, either single (`'`) or double (`"`). Between single quotes, fish performs no expansions. Between double quotes, fish only performs [variable expansion](#expand-variable). No other kind of expansion (including [brace expansion](#expand-brace) or parameter expansion) is performed, and escape sequences (for example, `\n`) are ignored. Within quotes, whitespace is not used to separate arguments, allowing quoted arguments to contain spaces. The only meaningful escape sequences in single quotes are `\'`, which escapes a single quote and `\\`, which escapes the backslash symbol. The only meaningful escapes in double quotes are `\"`, which escapes a double quote, `\$`, which escapes a dollar character, `\` followed by a newline, which deletes the backslash and the newline, and `\\`, which escapes the backslash symbol. Single quotes have no special meaning within double quotes and vice versa. Example: ``` rm "cumbersome filename.txt" ``` removes the file `cumbersome filename.txt`, while ``` rm cumbersome filename.txt ``` removes two files, `cumbersome` and `filename.txt`. Another example: ``` grep 'enabled)$' foo.txt ``` searches for lines ending in `enabled)` in `foo.txt` (the `$` is special to `grep`: it matches the end of the line). Escaping Characters ------------------- Some characters cannot be written directly on the command line. For these characters, so-called escape sequences are provided. These are: * `\a` represents the alert character. * `\e` represents the escape character. * `\f` represents the form feed character. * `\n` represents a newline character. * `\r` represents the carriage return character. * `\t` represents the tab character. * `\v` represents the vertical tab character. * `\xHH` or `\XHH`, where `HH` is a hexadecimal number, represents a byte of data with the specified value. For example, `\x9` is the tab character. If you are using a multibyte encoding, this can be used to enter invalid strings. Typically fish is run with the ASCII or UTF-8 encoding, so anything up to `\X7f` is an ASCII character. * `\ooo`, where `ooo` is an octal number, represents the ASCII character with the specified value. For example, `\011` is the tab character. The highest allowed value is `\177`. * `\uXXXX`, where `XXXX` is a hexadecimal number, represents the 16-bit Unicode character with the specified value. For example, `\u9` is the tab character. * `\UXXXXXXXX`, where `XXXXXXXX` is a hexadecimal number, represents the 32-bit Unicode character with the specified value. For example, `\U9` is the tab character. The highest allowed value is U10FFFF. * `\cX`, where `X` is a letter of the alphabet, represents the control sequence generated by pressing the control key and the specified letter. For example, `\ci` is the tab character Some characters have special meaning to the shell. For example, an apostrophe `'` disables expansion (see [Quotes](#quotes)). To tell the shell to treat these characters literally, escape them with a backslash. For example, the command: ``` echo \'hello world\' ``` outputs `'hello world'` (including the apostrophes), while the command: ``` echo 'hello world' ``` outputs `hello world` (without the apostrophes). In the former case the shell treats the apostrophes as literal `'` characters, while in the latter case it treats them as special expansion modifiers. The special characters and their escape sequences are: * `\` (backslash space) escapes the space character. This keeps the shell from splitting arguments on the escaped space. 
* `\$` escapes the dollar character. * `\\` escapes the backslash character. * `\*` escapes the star character. * `\?` escapes the question mark character (this is not necessary if the `qmark-noglob` [feature flag](#featureflags) is enabled). * `\~` escapes the tilde character. * `\#` escapes the hash character. * `\(` escapes the left parenthesis character. * `\)` escapes the right parenthesis character. * `\{` escapes the left curly bracket character. * `\}` escapes the right curly bracket character. * `\[` escapes the left bracket character. * `\]` escapes the right bracket character. * `\<` escapes the less than character. * `\>` escapes the more than character. * `\&` escapes the ampersand character. * `\|` escapes the vertical bar character. * `\;` escapes the semicolon character. * `\"` escapes the quote character. * `\'` escapes the apostrophe character. As a special case, `\` immediately followed by a literal new line is a “continuation” and tells fish to ignore the line break and resume input at the start of the next line (without introducing any whitespace or terminating a token). Input/Output Redirection ------------------------ Most programs use three input/output (I/O) streams: * Standard input (stdin) for reading. Defaults to reading from the keyboard. * Standard output (stdout) for writing output. Defaults to writing to the screen. * Standard error (stderr) for writing errors and warnings. Defaults to writing to the screen. Each stream has a number called the file descriptor (FD): 0 for stdin, 1 for stdout, and 2 for stderr. The destination of a stream can be changed using something called *redirection*. For example, `echo hello > output.txt`, redirects the standard output of the `echo` command to a text file. * To read standard input from a file, use `<SOURCE_FILE`. * To write standard output to a file, use `>DESTINATION`. * To write standard error to a file, use `2>DESTINATION`. [[1]](#id4) * To append standard output to a file, use `>>DESTINATION_FILE`. * To append standard error to a file, use `2>>DESTINATION_FILE`. * To not overwrite (“clobber”) an existing file, use `>?DESTINATION` or `2>?DESTINATION`. This is known as the “noclobber” redirection. `DESTINATION` can be one of the following: * A filename to write the output to. Often `>/dev/null` to silence output by writing it to the special “sinkhole” file. * An ampersand (`&`) followed by the number of another file descriptor like `&2` for standard error. The output will be written to the destination descriptor. * An ampersand followed by a minus sign (`&-`). The file descriptor will be closed. Note: This may cause the program to fail because its writes will be unsuccessful. As a convenience, the redirection `&>` can be used to direct both stdout and stderr to the same destination. For example, `echo hello &> all_output.txt` redirects both stdout and stderr to the file `all_output.txt`. This is equivalent to `echo hello > all_output.txt 2>&1`. Any arbitrary file descriptor can be used in a redirection by prefixing the redirection with the FD number. * To redirect the input of descriptor N, use `N<DESTINATION`. * To redirect the output of descriptor N, use `N>DESTINATION`. * To append the output of descriptor N to a file, use `N>>DESTINATION_FILE`. 
For example:

```
# Write `foo`'s standard error (file descriptor 2)
# to a file called "output.stderr":
foo 2> output.stderr

# if $num doesn't contain a number,
# this test will be false and print an error,
# so by ignoring the error we can be sure that we're dealing
# with a number in the "if" block:
if test "$num" -gt 2 2>/dev/null
    # do things with $num as a number greater than 2
else
    # do things if $num is <= 2 or not a number
end

# Save `make`'s output in a file:
make &>/log

# Redirections stack and can be used with blocks:
begin
    echo stdout
    echo stderr >&2 # <- this goes to stderr!
end >/dev/null # ignore stdout, so this prints "stderr"
```

It is an error to redirect a builtin, function, or block to a file descriptor above 2. However, this is supported for external commands.

Piping
------

Another way to redirect streams is a *pipe*. A pipe connects streams with each other. Usually the standard output of one command is connected with the standard input of another. This is done by separating commands with the pipe character `|`. For example:

```
cat foo.txt | head
```

The command `cat foo.txt` sends the contents of `foo.txt` to stdout. This output is provided as input for the `head` program, which prints the first 10 lines of its input.

It is possible to pipe a different output file descriptor by prepending its FD number and the output redirect symbol to the pipe. For example:

```
make fish 2>| less
```

will attempt to build `fish`, and any errors will be shown using the `less` pager. [[2]](#id6)

As a convenience, the pipe `&|` redirects both stdout and stderr to the same process. This is different from bash, which uses `|&`.

Job control
-----------

When you start a job in fish, fish itself will pause, and give control of the terminal to the program just started. Sometimes, you want to continue using the commandline, and have the job run in the background. To create a background job, append an & (ampersand) to your command. This will tell fish to run the job in the background. Background jobs are very useful when running programs that have a graphical user interface.

Example:

```
emacs &
```

will start the emacs text editor in the background. [fg](cmds/fg) can be used to bring it into the foreground again when needed.

Most programs allow you to suspend the program’s execution and return control to fish by pressing `Control`+`Z` (also referred to as `^Z`). Once back at the fish commandline, you can start other programs and do anything you want. If you then want to go back to the suspended command, use the [fg](cmds/fg) (foreground) command. If you instead want to put a suspended job into the background, use the [bg](cmds/bg) command.

To get a listing of all currently started jobs, use the [jobs](cmds/jobs) command. These listed jobs can be removed with the [disown](cmds/disown) command.

At the moment, functions cannot be started in the background. Functions that are stopped and then restarted in the background using the [bg](cmds/bg) command will not execute correctly.

If the `&` character is followed by a non-separating character, it is not interpreted as a background operator. Separating characters are whitespace and the characters `;<>&|`.

Functions
---------

Functions are programs written in the fish syntax. They group together various commands and their arguments using a single name.
For example, here’s a simple function to list directories: ``` function ll ls -l $argv end ``` The first line tells fish to define a function by the name of `ll`, so it can be used by simply writing `ll` on the commandline. The second line tells fish that the command `ls -l $argv` should be called when `ll` is invoked. [$argv](#variables-argv) is a [list variable](#variables-lists), which always contains all arguments sent to the function. In the example above, these are simply passed on to the `ls` command. The `end` on the third line ends the definition. Calling this as `ll /tmp/` will end up running `ls -l /tmp/`, which will list the contents of /tmp. This is a kind of function known as an [alias](#syntax-aliases). Fish’s prompt is also defined in a function, called [fish\_prompt](cmds/fish_prompt). It is run when the prompt is about to be displayed and its output forms the prompt: ``` function fish_prompt # A simple prompt. Displays the current directory # (which fish stores in the $PWD variable) # and then a user symbol - a '►' for a normal user and a '#' for root. set -l user_char '►' if fish_is_root_user set user_char '#' end echo (set_color yellow)$PWD (set_color purple)$user_char end ``` To edit a function, you can use [funced](cmds/funced), and to save a function [funcsave](cmds/funcsave). This will store it in a function file that fish will [autoload](#syntax-function-autoloading) when needed. The [functions](cmds/functions) builtin can show a function’s current definition (and [type](cmds/type) will also do if given a function). For more information on functions, see the documentation for the [function](cmds/function) builtin. ### Defining aliases One of the most common uses for functions is to slightly alter the behavior of an already existing command. For example, one might want to redefine the `ls` command to display colors. The switch for turning on colors on GNU systems is `--color=auto`. An alias around `ls` might look like this: ``` function ls command ls --color=auto $argv end ``` There are a few important things that need to be noted about aliases: * Always take care to add the [$argv](#variables-argv) variable to the list of parameters to the wrapped command. This makes sure that if the user specifies any additional parameters to the function, they are passed on to the underlying command. * If the alias has the same name as the aliased command, you need to prefix the call to the program with `command` to tell fish that the function should not call itself, but rather a command with the same name. If you forget to do so, the function would call itself until the end of time. Usually fish is smart enough to figure this out and will refrain from doing so (which is hopefully in your interest). To easily create a function of this form, you can use the [alias](cmds/alias) command. Unlike other shells, this just makes functions - fish has no separate concept of an “alias”, we just use the word for a simple wrapping function like this. [alias](cmds/alias) immediately creates a function. Consider using `alias --save` or [funcsave](cmds/funcsave) to save the created function into an autoload file instead of recreating the alias each time. For an alternative, try [abbreviations](interactive#abbreviations). These are words that are expanded while you type, instead of being actual functions inside the shell. ### Autoloading functions Functions can be defined on the commandline or in a configuration file, but they can also be automatically loaded. 
This has some advantages: * An autoloaded function becomes available automatically to all running shells. * If the function definition is changed, all running shells will automatically reload the altered version, after a while. * Startup time and memory usage is improved, etc. When fish needs to load a function, it searches through any directories in the [list variable](#variables-lists) `$fish_function_path` for a file with a name consisting of the name of the function plus the suffix `.fish` and loads the first it finds. For example if you try to execute something called `banana`, fish will go through all directories in $fish\_function\_path looking for a file called `banana.fish` and load the first one it finds. By default `$fish_function_path` contains the following: * A directory for users to keep their own functions, usually `~/.config/fish/functions` (controlled by the `XDG_CONFIG_HOME` environment variable). * A directory for functions for all users on the system, usually `/etc/fish/functions` (really `$__fish_sysconfdir/functions`). * Directories for other software to put their own functions. These are in the directories under `$__fish_user_data_dir` (usually `~/.local/share/fish`, controlled by the `XDG_DATA_HOME` environment variable) and in the `XDG_DATA_DIRS` environment variable, in a subdirectory called `fish/vendor_functions.d`. The default value for `XDG_DATA_DIRS` is usually `/usr/share/fish/vendor_functions.d` and `/usr/local/share/fish/vendor_functions.d`. * The functions shipped with fish, usually installed in `/usr/share/fish/functions` (really `$__fish_data_dir/functions`). If you are unsure, your functions probably belong in `~/.config/fish/functions`. As we’ve explained, autoload files are loaded *by name*, so, while you can put multiple functions into one file, the file will only be loaded automatically once you try to execute the one that shares the name. Autoloading also won’t work for [event handlers](#event), since fish cannot know that a function is supposed to be executed when an event occurs when it hasn’t yet loaded the function. See the [event handlers](#event) section for more information. If a file of the right name doesn’t define the function, fish will not read other autoload files, instead it will go on to try builtins and finally commands. This allows masking a function defined later in $fish\_function\_path, e.g. if your administrator has put something into /etc/fish/functions that you want to skip. If you are developing another program and want to install fish functions for it, install them to the “vendor” functions directory. As this path varies from system to system, you can use `pkgconfig` to discover it with the output of `pkg-config --variable functionsdir fish`. Your installation system should support a custom path to override the pkgconfig path, as other distributors may need to alter it easily. Comments -------- Anything after a `#` until the end of the line is a comment. That means it’s purely for the reader’s benefit, fish ignores it. This is useful to explain what and why you are doing something: ``` function ls # The function is called ls, # so we have to explicitly call `command ls` to avoid calling ourselves. command ls --color=auto $argv end ``` There are no multiline comments. If you want to make a comment span multiple lines, simply start each line with a `#`. Comments can also appear after a line like so: ``` set -gx EDITOR emacs # I don't like vim. 
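# A "#" that is quoted or escaped is not a comment - it is just a literal character:
echo '# this is printed, not ignored'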
```

Conditions
----------

Fish has some builtins that let you execute commands only if a specific criterion is met: [if](cmds/if), [switch](cmds/switch), [and](cmds/and) and [or](cmds/or), and also the familiar [&&/||](tutorial#tut-combiners) syntax.

The [switch](cmds/switch) command is used to execute one of possibly many blocks of commands depending on the value of a string. See the documentation for [switch](cmds/switch) for more information.

The other conditionals use the [exit status](#variables-status) of a command to decide if a command or a block of commands should be executed.

Unlike programming languages you might know, [if](cmds/if) doesn’t take a *condition* - it takes a *command*. If that command returned a successful [exit status](#variables-status) (that’s 0), the `if` branch is taken, otherwise the [else](cmds/else) branch.

To check a condition, there is the [test](cmds/test) command:

```
if test 5 -gt 2
    echo Yes, five is greater than two
end
```

Some examples:

```
# Just see if the file contains the string "fish" anywhere.
# This executes the `grep` command, which searches for a string,
# and if it finds it returns a status of 0.
# The `-q` switch stops it from printing any matches.
if grep -q fish myanimals
    echo "You have fish!"
else
    echo "You don't have fish!"
end

# $XDG_CONFIG_HOME is a standard place to store configuration.
# If it's not set applications should use ~/.config.
set -q XDG_CONFIG_HOME; and set -l configdir $XDG_CONFIG_HOME
or set -l configdir ~/.config
```

Note that combiners are *lazy* - only the part that is necessary to determine the final status is run.

Compare:

```
if sleep 2; and false
    echo 'How did I get here? This should be impossible'
end
```

and:

```
if false; and sleep 2
    echo 'How did I get here? This should be impossible'
end
```

These do essentially the same thing, but the former takes 2 seconds longer because the `sleep` always needs to run. So, in cases like these, the ordering is quite important for performance.

Or you can have a case where it is necessary to stop early:

```
if command -sq foo; and foo
```

If this went on after seeing that the command “foo” doesn’t exist, it would try to run `foo` and error because it wasn’t found!

For more, see the documentation for the builtins or the [Conditionals](tutorial#tut-conditionals) section of the tutorial.

Loops and blocks
----------------

Like most programming languages, fish also has the familiar [while](cmds/while) and [for](cmds/for) loops.

`while` works like a repeated [if](cmds/if):

```
while true
    echo Still running
    sleep 1
end
```

will print “Still running” once a second. You can abort it with ctrl-c.

`for` loops work like in other shells, which is more like python’s for-loops than e.g. C’s:

```
for file in *
    echo file: $file
end
```

will print each file in the current directory. The part after the `in` is just a list of arguments, so you can use any [expansions](#expand) there:

```
set moreanimals bird fox
for animal in {cat,}fish dog $moreanimals
    echo I like the $animal
end
```

If you need a list of numbers, you can use the `seq` command to create one:

```
for i in (seq 1 5)
    echo $i
end
```

[break](cmds/break) is available to break out of a loop, and [continue](cmds/continue) to jump to the next iteration.
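For example, here is a small sketch that uses both in one loop (the specific checks are only illustrative):

```
for file in *
    if test -d $file
        continue # skip directories, go on to the next iteration
    end
    if not test -r $file
        echo "stopping at unreadable file $file"
        break # leave the loop entirely
    end
    echo file: $file
end
```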
[Input and output redirections](#redirects) (including [pipes](#pipes)) can also be applied to loops: ``` while read -l line echo line: $line end < file ``` In addition there’s a [begin](cmds/begin) block that just groups commands together so you can redirect to a block or use a new [variable scope](#variables-scope) without any repetition: ``` begin set -l foo bar # this variable will only be available in this block! end ``` Parameter expansion ------------------- When fish is given a commandline, it expands the parameters before sending them to the command. There are multiple different kinds of expansions: * [Wildcards](#expand-wildcard), to create filenames from patterns - `*.jpg` * [Variable expansion](#expand-variable), to use the value of a variable - `$HOME` * [Command substitution](#expand-command-substitution), to use the output of another command - `$(cat /path/to/file)` * [Brace expansion](#expand-brace), to write lists with common pre- or suffixes in a shorter way `{/usr,}/bin` * [Tilde expansion](#expand-home), to turn the `~` at the beginning of paths into the path to the home directory `~/bin` Parameter expansion is limited to 524288 items. There is a limit to how many arguments the operating system allows for any command, and 524288 is far above it. This is a measure to stop the shell from hanging doing useless computation. ### Wildcards (“Globbing”) When a parameter includes an [unquoted](#quotes) `*` star (or “asterisk”) or a `?` question mark, fish uses it as a wildcard to match files. * `*` matches any number of characters (including zero) in a file name, not including `/`. * `**` matches any number of characters (including zero), and also descends into subdirectories. If `**` is a segment by itself, that segment may match zero times, for compatibility with other shells. * `?` can match any single character except `/`. This is deprecated and can be disabled via the `qmark-noglob` [feature flag](#featureflags), so `?` will just be an ordinary character. Wildcard matches are sorted case insensitively. When sorting matches containing numbers, they are naturally sorted, so that the strings ‘1’ ‘5’ and ‘12’ would be sorted like 1, 5, 12. Hidden files (where the name begins with a dot) are not considered when wildcarding unless the wildcard string has a dot in that place. Examples: * `a*` matches any files beginning with an ‘a’ in the current directory. * `**` matches any files and directories in the current directory and all of its subdirectories. * `~/.*` matches all hidden files (also known as “dotfiles”) and directories in your home directory. For most commands, if any wildcard fails to expand, the command is not executed, [$status](#variables-status) is set to nonzero, and a warning is printed. This behavior is like what bash does with `shopt -s failglob`. There are exceptions, namely [set](cmds/set) and [path](cmds/path), overriding variables in [overrides](#variables-override), [count](cmds/count) and [for](cmds/for). Their globs will instead expand to zero arguments (so the command won’t see them at all), like with `shopt -s nullglob` in bash. Examples: ``` # List the .foo files, or warns if there aren't any. ls *.foo # List the .foo files, if any. 
set foos *.foo
if count $foos >/dev/null
    ls $foos
end
```

Unlike bash (by default), fish will not pass on the literal glob character if no match was found, so for a command like `apt install` that does the matching itself, you need to add quotes:

```
apt install "ncurses-*"
```

### Variable expansion

One of the most important expansions in fish is the “variable expansion”. This is the replacing of a dollar sign (`$`) followed by a variable name with the *value* of that variable.

In the simplest case, this is just something like:

```
echo $HOME
```

which will replace `$HOME` with the home directory of the current user, and pass it to [echo](cmds/echo), which will then print it.

Some variables like `$HOME` are already set because fish sets them by default or because fish’s parent process passed them to fish when it started it. You can define your own variables by setting them with [set](cmds/set):

```
set my_directory /home/cooluser/mystuff
ls $my_directory
# shows the contents of /home/cooluser/mystuff
```

For more on how setting variables works, see [Shell variables](#variables) and the following sections.

Sometimes a variable has no value because it is undefined or empty, and it expands to nothing:

```
echo $nonexistentvariable
# Prints no output.
```

To separate a variable name from text you can encase the variable within double-quotes or braces:

```
set WORD cat
echo The plural of $WORD is "$WORD"s
# Prints "The plural of cat is cats" because $WORD is set to "cat".
echo The plural of $WORD is {$WORD}s
# ditto
```

Without the quotes or braces, fish will try to expand a variable called `$WORDs`, which may not exist. The latter syntax `{$WORD}` is a special case of [brace expansion](#expand-brace).

If $WORD here is undefined or an empty list, the “s” is not printed. However, it is printed if $WORD is the empty string (like after `set WORD ""`).

For more on shell variables, read the [Shell variables](#variables) section.

#### Quoting variables

Unlike all the other expansions, variable expansion also happens in double quoted strings. Inside double quotes (`"these"`), variables will always expand to exactly one argument. If they are empty or undefined, it will result in an empty string. If they have one element, they’ll expand to that element. If they have more than that, the elements will be joined with spaces, unless the variable is a [path variable](#variables-path) - in that case it will use a colon (`:`) instead [[3]](#id8).

Outside of double quotes, variables will expand to as many arguments as they have elements. That means an empty list will expand to nothing, a variable with one element will expand to that element, and a variable with multiple elements will expand to each of those elements separately.

If a variable expands to nothing, it will cancel out any other strings attached to it. See the [cartesian product](#cartesian-product) section for more information.

Unlike other shells, fish doesn’t do what is known as “Word Splitting”. Once a variable is set to a particular set of elements, those elements expand as themselves. They aren’t split on spaces or newlines or anything:

```
> set foo one\nthing
> echo $foo
one
thing
> printf '|%s|\n' $foo
|one
thing|
```

That means quoting isn’t the absolute necessity it is in other shells. Most of the time, not quoting a variable is correct. The exception is when you need to ensure that the variable is passed as one element, even if it might be unset or have multiple elements.
This happens often with [test](cmds/test):

```
set -l foo one two three
test -n $foo
# prints an error that it got too many arguments, because it was executed like
# test -n one two three

test -n "$foo"
# works, because it was executed like
# test -n "one two three"
```

#### Dereferencing variables

The `$` symbol can also be used multiple times, as a kind of “dereference” operator (the `*` in C or C++), like in the following code:

```
set foo a b c
set a 10; set b 20; set c 30

for i in (seq (count $$foo))
    echo $$foo[$i]
end

# Output is:
# 10
# 20
# 30
```

`$$foo[$i]` is “the value of the variable named by `$foo[$i]`”.

When using this feature together with list brackets, the brackets will be used from the inside out. `$$foo[5]` will use the fifth element of `$foo` as a variable name, instead of giving the fifth element of all the variables $foo refers to. That would instead be expressed as `$$foo[1..-1][5]` (take all elements of `$foo`, use them as variable names, then give the fifth element of those).

### Command substitution

The output of a command (or an entire [pipeline](#pipes)) can be used as the arguments to another command.

When you write a command in parentheses like `outercommand (innercommand)`, fish first runs `innercommand`, and then uses each line of its output as a separate argument to `outercommand`, which will then be executed. Unlike other shells, the value of `$IFS` is not used [[4]](#id10); fish splits on newlines.

A command substitution can have a dollar sign before the opening parenthesis like `outercommand $(innercommand)`. This variant is also allowed inside double quotes. When using double quotes, the command output is not split up by lines, but trailing empty lines are still removed.

If the output is piped to [string split or string split0](cmds/string-split) as the last step, those splits are used as they appear instead of splitting lines.

The exit status of the last run command substitution is available in the [status](#variables-status) variable if the substitution happens in the context of a [set](cmds/set) command (so `if set -l (something)` checks if `something` returned true).

To use only some lines of the output, refer to [index range expansion](#expand-index-range).

Examples:

```
# Outputs 'image.png'.
echo (basename image.jpg .jpg).png

# Convert all JPEG files in the current directory to the
# PNG format using the 'convert' program.
for i in *.jpg; convert $i (basename $i .jpg).png; end

# Set the ``data`` variable to the contents of 'data.txt'
# without splitting it into a list.
set data "$(cat data.txt)"

# Set ``$data`` to the contents of data, splitting on NUL-bytes.
set data (cat data | string split0)
```

Sometimes you want to pass the output of a command to another command that only accepts files. If it’s just one file, you can usually just pass it via a pipe, like:

```
grep fish myanimallist1 | wc -l
```

but if you need multiple or the command doesn’t read from standard input, “process substitution” is useful. Other shells allow this via `foo <(bar) <(baz)`, and fish uses the [psub](cmds/psub) command:

```
# Compare just the lines containing "fish" in two files:
diff -u (grep fish myanimallist1 | psub) (grep fish myanimallist2 | psub)
```

This creates a temporary file, stores the output of the command in that file and prints the filename, so it is given to the outer command.

Fish has a default limit of 100 MiB on the data it will read in a command substitution.
If that limit is reached the command (all of it, not just the command substitution - the outer command won’t be executed at all) fails and `$status` is set to 122. This is so command substitutions can’t cause the system to go out of memory, because typically your operating system has a much lower limit, so reading more than that would be useless and harmful. This limit can be adjusted with the `fish_read_limit` variable (`0` meaning no limit). This limit also affects the [read](cmds/read) command. ### Brace expansion Curly braces can be used to write comma-separated lists. They will be expanded with each element becoming a new parameter, with the surrounding string attached. This is useful to save on typing, and to separate a variable name from surrounding text. Examples: ``` > echo input.{c,h,txt} input.c input.h input.txt # Move all files with the suffix '.c' or '.h' to the subdirectory src. > mv *.{c,h} src/ # Make a copy of `file` at `file.bak`. > cp file{,.bak} > set -l dogs hot cool cute "good " > echo {$dogs}dog hotdog cooldog cutedog good dog ``` If there is no “,” or variable expansion between the curly braces, they will not be expanded: ``` # This {} isn't special > echo foo-{} foo-{} # This passes "HEAD@{2}" to git > git reset --hard HEAD@{2} > echo {{a,b}} {a} {b} # because the inner brace pair is expanded, but the outer isn't. ``` If after expansion there is nothing between the braces, the argument will be removed (see [the cartesian product section](#cartesian-product)): ``` > echo foo-{$undefinedvar} # Output is an empty line, just like a bare `echo`. ``` If there is nothing between a brace and a comma or two commas, it’s interpreted as an empty element: ``` > echo {,,/usr}/bin /bin /bin /usr/bin ``` To use a “,” as an element, [quote](#quotes) or [escape](#escapes) it. ### Combining lists (Cartesian Product) When lists are expanded with other parts attached, they are expanded with these parts still attached. Even if two lists are attached to each other, they are expanded in all combinations. This is referred to as the “cartesian product” (like in mathematics), and works basically like [brace expansion](#expand-brace). Examples: ``` # Brace expansion is the most familiar: # All elements in the brace combine with the parts outside of the braces >_ echo {good,bad}" apples" good apples bad apples # The same thing happens with variable expansion. >_ set -l a x y z >_ set -l b 1 2 3 # $a is {x,y,z}, $b is {1,2,3}, # so this is `echo {x,y,z}{1,2,3}` >_ echo $a$b x1 y1 z1 x2 y2 z2 x3 y3 z3 # Same thing if something is between the lists >_ echo $a"-"$b x-1 y-1 z-1 x-2 y-2 z-2 x-3 y-3 z-3 # Or a brace expansion and a variable >_ echo {x,y,z}$b x1 y1 z1 x2 y2 z2 x3 y3 z3 # A combined brace-variable expansion >_ echo {$b}word 1word 2word 3word # Special case: If $c has no elements, this expands to nothing >_ echo {$c}word # Output is an empty line ``` Sometimes this may be unwanted, especially that tokens can disappear after expansion. In those cases, you should double-quote variables - `echo "$c"word`. This also happens after [command substitution](#expand-command-substitution). To avoid tokens disappearing there, make the inner command return a trailing newline, or store the output in a variable and double-quote it. E.g. ``` >_ set b 1 2 3 >_ echo (echo x)$b x1 x2 x3 >_ echo (printf '%s' '')banana # the printf prints nothing, so this is nothing times "banana", # which is nothing. 
>_ echo (printf '%s\n' '')banana # the printf prints a newline, # so the command substitution expands to an empty string, # so this is `''banana` banana ``` This can be quite useful. For example, if you want to go through all the files in all the directories in [`PATH`](#envvar-PATH), use ``` for file in $PATH/* ``` Because [`PATH`](#envvar-PATH) is a list, this expands to all the files in all the directories in it. And if there are no directories in [`PATH`](#envvar-PATH), the right answer here is to expand to no files. ### Index range expansion Sometimes it’s necessary to access only some of the elements of a [list](#variables-lists) (all fish variables are lists), or some of the lines a [command substitution](#expand-command-substitution) outputs. Both are possible in fish by writing a set of indices in brackets, like: ``` # Make $var a list of four elements set var one two three four # Print the second: echo $var[2] # prints "two" # or print the first three: echo $var[1..3] # prints "one two three" ``` In index brackets, fish understands ranges written like `a..b` (‘a’ and ‘b’ being indices). They are expanded into a sequence of indices from a to b (so `a a+1 a+2 ... b`), going up if b is larger and going down if a is larger. Negative indices can also be used - they are taken from the end of the list, so `-1` is the last element, and `-2` the one before it. If an index doesn’t exist the range is clamped to the next possible index. If a list has 5 elements the indices go from 1 to 5, so a range of `2..16` will only go from element 2 to element 5. If the end is negative the range always goes up, so `2..-2` will go from element 2 to 4, and `2..-16` won’t go anywhere because there is no way to go from the second element to one that doesn’t exist, while going up. If the start is negative the range always goes down, so `-2..1` will go from element 4 to 1, and `-16..2` won’t go anywhere because there is no way to go from an element that doesn’t exist to the second element, while going down. A missing starting index in a range defaults to 1. This is allowed if the range is the first index expression of the sequence. Similarly, a missing ending index, defaulting to -1 is allowed for the last index range in the sequence. Multiple ranges are also possible, separated with a space. Some examples: ``` echo (seq 10)[1 2 3] # Prints: 1 2 3 # Limit the command substitution output echo (seq 10)[2..5] # Uses elements from 2 to 5 # Output is: 2 3 4 5 echo (seq 10)[7..] # Prints: 7 8 9 10 # Use overlapping ranges: echo (seq 10)[2..5 1..3] # Takes elements from 2 to 5 and then elements from 1 to 3 # Output is: 2 3 4 5 1 2 3 # Reverse output echo (seq 10)[-1..1] # Uses elements from the last output line to # the first one in reverse direction # Output is: 10 9 8 7 6 5 4 3 2 1 # The command substitution has only one line, # so these will result in empty output: echo (echo one)[2..-1] echo (echo one)[-3..1] ``` The same works when setting or expanding variables: ``` # Reverse path variable set PATH $PATH[-1..1] # or set PATH[-1..1] $PATH # Use only n last items of the PATH set n -3 echo $PATH[$n..-1] ``` Variables can be used as indices for expansion of variables, like so: ``` set index 2 set letters a b c d echo $letters[$index] # returns 'b' ``` However using variables as indices for command substitution is currently not supported, so: ``` echo (seq 5)[$index] # This won't work set sequence (seq 5) # It needs to be written on two lines like this. 
echo $sequence[$index] # returns '2' ``` When using indirect variable expansion with multiple `$` (`$$name`), you have to give all indices up to the variable you want to slice: ``` > set -l list 1 2 3 4 5 > set -l name list > echo $$name[1] 1 2 3 4 5 > echo $$name[1..-1][1..3] # or $$name[1][1..3], since $name only has one element. 1 2 3 ``` ### Home directory expansion The `~` (tilde) character at the beginning of a parameter, followed by a username, is expanded into the home directory of the specified user. A lone `~`, or a `~` followed by a slash, is expanded into the home directory of the process owner: ``` ls ~/Music # lists my music directory echo ~root # prints root's home directory, probably "/root" ``` ### Combining different expansions All of the above expansions can be combined. If several expansions result in more than one parameter, all possible combinations are created. When combining multiple parameter expansions, expansions are performed in the following order: * Command substitutions * Variable expansions * Bracket expansion * Wildcard expansion Expansions are performed from right to left, nested bracket expansions are performed from the inside and out. Example: If the current directory contains the files ‘foo’ and ‘bar’, the command `echo a(ls){1,2,3}` will output `abar1 abar2 abar3 afoo1 afoo2 afoo3`. Shell variables --------------- Variables are a way to save data and pass it around. They can be used just by the shell, or they can be “[exported](#variables-export)”, so that a copy of the variable is available to any external command the shell starts. An exported variable is referred to as an “environment variable”. To set a variable value, use the [set](cmds/set) command. A variable name can not be empty and can contain only letters, digits, and underscores. It may begin and end with any of those characters. Example: To set the variable `smurf_color` to the value `blue`, use the command `set smurf_color blue`. After a variable has been set, you can use the value of a variable in the shell through [variable expansion](#expand-variable). Example: ``` set smurf_color blue echo Smurfs are usually $smurf_color set pants_color red echo Papa smurf, who is $smurf_color, wears $pants_color pants ``` So you set a variable with `set`, and use it with a `$` and the name. ### Variable Scope There are four kinds of variables in fish: universal, global, function and local variables. * Universal variables are shared between all fish sessions a user is running on one computer. * Global variables are specific to the current fish session, and will never be erased unless explicitly requested by using `set -e`. * Function variables are specific to the currently executing function. They are erased (“go out of scope”) when the current function ends. Outside of a function, they don’t go out of scope. * Local variables are specific to the current block of commands, and automatically erased when a specific block goes out of scope. A block of commands is a series of commands that begins with one of the commands `for`, `while` , `if`, `function`, `begin` or `switch`, and ends with the command `end`. Outside of a block, this is the same as the function scope. Variables can be explicitly set to be universal with the `-U` or `--universal` switch, global with `-g` or `--global`, function-scoped with `-f` or `--function` and local to the current block with `-l` or `--local`. The scoping rules when creating or updating a variable are: * When a scope is explicitly given, it will be used. 
If a variable of the same name exists in a different scope, that variable will not be changed. * When no scope is given, but a variable of that name exists, the variable of the smallest scope will be modified. The scope will not be changed. * When no scope is given and no variable of that name exists, the variable is created in function scope if inside a function, or global scope if no function is executing. There can be many variables with the same name, but different scopes. When you [use a variable](#expand-variable), the smallest scoped variable of that name will be used. If a local variable exists, it will be used instead of the global or universal variable of the same name. Example: There are a few possible uses for different scopes. Typically inside functions you should use local scope: ``` function something set -l file /path/to/my/file if not test -e "$file" set file /path/to/my/otherfile end end # or function something if test -e /path/to/my/file set -f file /path/to/my/file else set -f file /path/to/my/otherfile end end ``` If you want to set something in config.fish, or set something in a function and have it available for the rest of the session, global scope is a good choice: ``` # Don't shorten the working directory in the prompt set -g fish_prompt_pwd_dir_length 0 # Set my preferred cursor style: function setcursors set -g fish_cursor_default block set -g fish_cursor_insert line set -g fish_cursor_visual underscore end # Set my language set -gx LANG de_DE.UTF-8 ``` If you want to set some personal customization, universal variables are nice: ``` # Typically you'd run this interactively, fish takes care of keeping it. set -U fish_color_autosuggestion 555 ``` Here is an example of local vs function-scoped variables: ``` function test-scopes begin # This is a nice local scope where all variables will die set -l pirate 'There be treasure in them thar hills' set -f captain Space, the final frontier # If no variable of that name was defined, it is function-local. set gnu "In the beginning there was nothing, which exploded" end echo $pirate # This will not output anything, since the pirate was local echo $captain # This will output the good Captain's speech since $captain had function-scope. echo $gnu # Will output Sir Terry's wisdom. end ``` When in doubt, use function-scoped variables. When you need to make a variable accessible everywhere, make it global. When you need to persistently store configuration, make it universal. When you want to use a variable only in a short block, make it local. ### Overriding variables for a single command If you want to override a variable for a single command, you can use “var=val” statements before the command: ``` # Call git status on another directory # (can also be done via `git -C somerepo status`) GIT_DIR=somerepo git status ``` Unlike other shells, fish will first set the variable and then perform other expansions on the line, so: ``` set foo banana foo=gagaga echo $foo # prints gagaga, while in other shells it might print "banana" ``` Multiple elements can be given in a [brace expansion](#expand-brace): ``` # Call bash with a reasonable default path. PATH={/usr,}/{s,}bin bash ``` Or with a [glob](#expand-wildcard): ``` # Run vlc on all mp3 files in the current directory # If no file exists it will still be run with no arguments mp3s=*.mp3 vlc $mp3s ``` Unlike other shells, this does *not* inhibit any lookup (aliases or similar). Calling a command after setting a variable override will result in the exact same command being run. 
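As a rough sketch of what that means, reusing the `ls` wrapper function from the aliases section above:

```
function ls
    command ls --color=auto $argv
end

# The override does not stop fish from finding the `ls` function -
# this still runs the wrapper, with LC_ALL set for the duration of the call.
LC_ALL=C ls
```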
This syntax is supported since fish 3.1. ### More on universal variables Universal variables are variables that are shared between all the user’s fish sessions on the computer. Fish stores many of its configuration options as universal variables. This means that in order to change fish settings, all you have to do is change the variable value once, and it will be automatically updated for all sessions, and preserved across computer reboots and login/logout. To see universal variables in action, start two fish sessions side by side, and issue the following command in one of them `set fish_color_cwd blue`. Since `fish_color_cwd` is a universal variable, the color of the current working directory listing in the prompt will instantly change to blue on both terminals. [Universal variables](#variables-universal) are stored in the file `.config/fish/fish_variables`. Do not edit this file directly, as your edits may be overwritten. Edit the variables through fish scripts or by using fish interactively instead. Do not append to universal variables in [config.fish](#configuration), because these variables will then get longer with each new shell instance. Instead, simply set them once at the command line. ### Variable scope for functions When calling a function, all current local variables temporarily disappear. This shadowing of the local scope is needed since the variable namespace would become cluttered, making it very easy to accidentally overwrite variables from another function. For example: ``` function shiver set phrase 'Shiver me timbers' end function avast set --local phrase 'Avast, mateys' # Calling the shiver function here can not # change any variables in the local scope shiver echo $phrase end avast # Outputs "Avast, mateys" ``` ### Exporting variables Variables in fish can be exported, so they will be inherited by any commands started by fish. In particular, this is necessary for variables used to configure external commands like `PAGER` or `GOPATH`, but also for variables that contain general system settings like `PATH` or `LANGUAGE`. If an external command needs to know a variable, it needs to be exported. Exported variables are also often called “environment variables”. This also applies to fish - when it starts up, it receives environment variables from its parent (usually the terminal). These typically include system configuration like [`PATH`](#envvar-PATH) and [locale variables](#variables-locale). Variables can be explicitly set to be exported with the `-x` or `--export` switch, or not exported with the `-u` or `--unexport` switch. The exporting rules when setting a variable are similar to the scoping rules for variables - when an option is passed it is respected, otherwise the variable’s existing state is used. If no option is passed and the variable didn’t exist yet it is not exported. As a naming convention, exported variables are in uppercase and unexported variables are in lowercase. For example: ``` set -gx ANDROID_HOME ~/.android # /opt/android-sdk set -gx CDPATH . ~ (test -e ~/Videos; and echo ~/Videos) set -gx EDITOR emacs -nw set -gx GOPATH ~/dev/go set -gx GTK2_RC_FILES "$XDG_CONFIG_HOME/gtk-2.0/gtkrc" set -gx LESSHISTFILE "-" ``` Note: Exporting is not a [scope](#variables-scope), but an additional state. It typically makes sense to make exported variables global as well, but local-exported variables can be useful if you need something more specific than [Overrides](#variables-override). 
They are *copied* to functions so the function can’t alter them outside, and still available to commands. Global variables are accessible to functions whether they are exported or not. ### Lists Fish can store a list (or an “array” if you wish) of multiple strings inside of a variable: ``` > set mylist first second third > printf '%s\n' $mylist # prints each element on its own line first second third ``` To access one element of a list, use the index of the element inside of square brackets, like this: ``` echo $PATH[3] ``` List indices start at 1 in fish, not 0 like in other languages. This is because it requires less subtracting of 1 and many common Unix tools like `seq` work better with it (`seq 5` prints 1 to 5, not 0 to 5). An invalid index is silently ignored resulting in no value (not even an empty string, just no argument at all). If you don’t use any brackets, all the elements of the list will be passed to the command as separate items. This means you can iterate over a list with `for`: ``` for i in $PATH echo $i is in the path end ``` This goes over every directory in [`PATH`](#envvar-PATH) separately and prints a line saying it is in the path. To create a variable `smurf`, containing the items `blue` and `small`, simply write: ``` set smurf blue small ``` It is also possible to set or erase individual elements of a list: ``` # Set smurf to be a list with the elements 'blue' and 'small' set smurf blue small # Change the second element of smurf to 'evil' set smurf[2] evil # Erase the first element set -e smurf[1] # Output 'evil' echo $smurf ``` If you specify a negative index when expanding or assigning to a list variable, the index will be taken from the *end* of the list. For example, the index -1 is the last element of the list: ``` > set fruit apple orange banana > echo $fruit[-1] banana > echo $fruit[-2..-1] orange banana > echo $fruit[-1..1] # reverses the list banana orange apple ``` As you see, you can use a range of indices, see [index range expansion](#expand-index-range) for details. All lists are one-dimensional and can’t contain other lists, although it is possible to fake nested lists using dereferencing - see [variable expansion](#expand-variable). When a list is exported as an environment variable, it is either space or colon delimited, depending on whether it is a [path variable](#variables-path): ``` > set -x smurf blue small > set -x smurf_PATH forest mushroom > env | grep smurf smurf=blue small smurf_PATH=forest:mushroom ``` Fish automatically creates lists from all environment variables whose name ends in `PATH` (like [`PATH`](#envvar-PATH), [`CDPATH`](#envvar-CDPATH) or `MANPATH`), by splitting them on colons. Other variables are not automatically split. Lists can be inspected with the [count](cmds/count) or the [contains](cmds/contains) commands: ``` > count $smurf 2 > contains blue $smurf # blue was found, so it exits with status 0 # (without printing anything) > echo $status 0 > contains -i blue $smurf 1 ``` A nice thing about lists is that they are passed to commands one element as one argument, so once you’ve set your list, you can just pass it: ``` set -l grep_args -r "my string" grep $grep_args . # will run the same as `grep -r "my string"` . ``` Unlike other shells, fish does not do “word splitting” - elements in a list stay as they are, even if they contain spaces or tabs. ### Argument Handling An important list is `$argv`, which contains the arguments to a function or script. 
For example: ``` function myfunction echo $argv[1] echo $argv[3] end ``` This function takes whatever arguments it gets and prints the first and third: ``` > myfunction first second third first third > myfunction apple cucumber banana apple banana ``` That covers the positional arguments, but commandline tools often get various options and flags, and $argv would contain them intermingled with the positional arguments. Typical unix argument handling allows short options (`-h`, also grouped like in `ls -lah`), long options (`--help`) and allows those options to take arguments (`--color=auto` or `--position anywhere` or `complete -C"git "`) as well as a `--` separator to signal the end of options. Handling all of these manually is tricky and error-prone. A more robust approach to option handling is [argparse](cmds/argparse), which checks the defined options and puts them into various variables, leaving only the positional arguments in $argv. Here’s a simple example: ``` function mybetterfunction # We tell argparse about -h/--help and -s/--second - these are short and long forms of the same option. # The "--" here is mandatory, it tells it from where to read the arguments. argparse h/help s/second -- $argv # exit if argparse failed because it found an option it didn't recognize - it will print an error or return # If -h or --help is given, we print a little help text and return if set -ql _flag_help echo "mybetterfunction [-h|--help] [-s|--second] [ARGUMENT ...]" return 0 end # If -s or --second is given, we print the second argument, # not the first and third. # (this is also available as _flag_s because of the short version) if set -ql _flag_second echo $argv[2] else echo $argv[1] echo $argv[3] end end ``` The options will be *removed* from $argv, so $argv[2] is the second *positional* argument now: ``` > mybetterfunction first -s second third second ``` For more information on argparse, like how to handle option arguments, see [the argparse documentation](cmds/argparse). ### PATH variables Path variables are a special kind of variable used to support colon-delimited path lists including [`PATH`](#envvar-PATH), [`CDPATH`](#envvar-CDPATH), `MANPATH`, `PYTHONPATH`, etc. All variables that end in “PATH” (case-sensitive) become PATH variables by default. PATH variables act as normal lists, except they are implicitly joined and split on colons. ``` set MYPATH 1 2 3 echo "$MYPATH" # 1:2:3 set MYPATH "$MYPATH:4:5" echo $MYPATH # 1 2 3 4 5 echo "$MYPATH" # 1:2:3:4:5 ``` Path variables will also be exported in the colon form, so `set -x MYPATH 1 2 3` will have external commands see it as `1:2:3`. ``` > set -gx MYPATH /bin /usr/bin /sbin > env | grep MYPATH MYPATH=/bin:/usr/bin:/sbin ``` This is for compatibility with other tools. Unix doesn’t have variables with multiple elements, the closest thing it has are colon-lists like [`PATH`](#envvar-PATH). For obvious reasons this means no element can contain a `:`. Variables can be marked or unmarked as PATH variables via the `--path` and `--unpath` options to `set`. ### Special variables You can change the settings of fish by changing the values of certain variables. `PATH` A list of directories in which to search for commands. This is a common unix variable also used by other tools. `CDPATH` A list of directories in which the [cd](cmds/cd) builtin looks for a new directory. 
Locale Variables The locale variables [`LANG`](#envvar-LANG), [`LC_ALL`](#envvar-LC_ALL), [`LC_COLLATE`](#envvar-LC_COLLATE), [`LC_CTYPE`](#envvar-LC_CTYPE), [`LC_MESSAGES`](#envvar-LC_MESSAGES), [`LC_MONETARY`](#envvar-LC_MONETARY), [`LC_NUMERIC`](#envvar-LC_NUMERIC), and [`LC_TIME`](#envvar-LC_TIME) set the language option for the shell and subprograms. See the section [Locale variables](#variables-locale) for more information. Color variables A number of variables starting with the prefixes `fish_color` and `fish_pager_color`. See [Variables for changing highlighting colors](interactive#variables-color) for more information. `fish_ambiguous_width` controls the computed width of ambiguous-width characters. This should be set to 1 if your terminal renders these characters as single-width (typical), or 2 if double-width. `fish_emoji_width` controls whether fish assumes emoji render as 2 cells or 1 cell wide. This is necessary because the correct value changed from 1 to 2 in Unicode 9, and some terminals may not be aware. Set this if you see graphical glitching related to emoji (or other “special” characters). It should usually be auto-detected. `fish_autosuggestion_enabled` controls if [Autosuggestions](interactive#autosuggestions) are enabled. Set it to 0 to disable, anything else to enable. By default they are on. `fish_handle_reflow` determines whether fish should try to repaint the commandline when the terminal resizes. In terminals that reflow text this should be disabled. Set it to 1 to enable, anything else to disable. `fish_key_bindings` the name of the function that sets up the keyboard shortcuts for the [command-line editor](interactive#editor). `fish_escape_delay_ms` sets how long fish waits for another key after seeing an escape, to distinguish pressing the escape key from the start of an escape sequence. The default is 30ms. Increasing it increases the latency but allows pressing escape instead of alt for alt+character bindings. For more information, see [the chapter in the bind documentation](cmds/bind#cmd-bind-escape). `fish_complete_path` determines where fish looks for completions. When trying to complete for a command, fish looks for files in the directories in this variable. `fish_cursor_selection_mode` controls whether the selection is inclusive or exclusive of the character under the cursor (see [Copy and Paste](interactive#killring)). `fish_function_path` determines where fish looks for functions. When fish [autoloads](#syntax-function-autoloading) a function, it will look for files in these directories. `fish_greeting` the greeting message printed on startup. This is printed by a function of the same name that can be overridden for more complicated changes (see [funced](cmds/funced)). `fish_history` the current history session name. If set, all subsequent commands within an interactive fish session will be logged to a separate file identified by the value of the variable. If unset, the default session name “fish” is used. If set to an empty string, history is not saved to disk (but is still available within the interactive session). `fish_trace` if set and not empty, will cause fish to print commands before they execute, similar to `set -x` in bash. The trace is printed to the path given by the `--debug-output` option to fish or the [`FISH_DEBUG_OUTPUT`](#envvar-FISH_DEBUG_OUTPUT) variable. It goes to stderr by default. `FISH_DEBUG` Controls which debug categories **fish** enables for output, analogous to the `--debug` option.
`FISH_DEBUG_OUTPUT` Specifies a file to direct debug output to. `fish_user_paths` a list of directories that are prepended to [`PATH`](#envvar-PATH). This can be a universal variable. `umask` the current file creation mask. The preferred way to change the umask variable is through the [umask](cmds/umask) function. An attempt to set umask to an invalid value will always fail. `BROWSER` your preferred web browser. If this variable is set, fish will use the specified browser instead of the system default browser to display the fish documentation. Fish also provides additional information through the values of certain environment variables. Most of these variables are read-only and their value can’t be changed with `set`. `_` the name of the currently running command (though this is deprecated, and the use of `status current-command` is preferred). `argv` a list of arguments to the shell or function. `argv` is only defined when inside a function call, or if fish was invoked with a list of arguments, like `fish myscript.fish foo bar`. This variable can be changed. `CMD_DURATION` the runtime of the last command in milliseconds. COLUMNS and LINES the current size of the terminal in height and width. These values are only used by fish if the operating system does not report the size of the terminal. Both variables must be set in that case otherwise a default of 80x24 will be used. They are updated when the window size changes. `fish_kill_signal` the signal that terminated the last foreground job, or 0 if the job exited normally. `fish_killring` a list of entries in fish’s [kill ring](interactive#killring) of cut text. `fish_read_limit` how many bytes fish will process with [read](cmds/read) or in a [command substitution](#expand-command-substitution). `fish_pid` the process ID (PID) of the shell. `history` a list containing the last commands that were entered. `HOME` the user’s home directory. This variable can be changed. `hostname` the machine’s hostname. `IFS` the internal field separator that is used for word splitting with the [read](cmds/read) builtin. Setting this to the empty string will also disable line splitting in [command substitution](#expand-command-substitution). This variable can be changed. `last_pid` the process ID (PID) of the last background process. `PWD` the current working directory. `pipestatus` a list of exit statuses of all processes that made up the last executed pipe. See [exit status](#variables-status). `SHLVL` the level of nesting of shells. Fish increments this in interactive shells, otherwise it simply passes it along. `status` the [exit status](#variables-status) of the last foreground job to exit. If the job was terminated through a signal, the exit status will be 128 plus the signal number. `status_generation` the “generation” count of `$status`. This will be incremented only when the previous command produced an explicit status. (For example, background jobs will not increment this). `TERM` the type of the current terminal. When fish tries to determine how the terminal works - how many colors it supports, what sequences it sends for keys and other things - it looks at this variable and the corresponding information in the terminfo database (see `man terminfo`). Note: Typically this should not be changed as the terminal sets it to the correct value. `USER` the current username. This variable can be changed. `EUID` the current effective user id, set by fish at startup. This variable can be changed. 
`version` the version of the currently running fish (also available as `FISH_VERSION` for backward compatibility). As a convention, an uppercase name is usually used for exported variables, while lowercase variables are not exported. (`CMD_DURATION` is an exception for historical reasons). This rule is not enforced by fish, but it is good coding practice to use casing to distinguish between exported and unexported variables. Fish also uses some variables internally, their name usually starting with `__fish`. These are internal and should not typically be modified directly. ### The status variable Whenever a process exits, an exit status is returned to the program that started it (usually the shell). This exit status is an integer number, which tells the calling application how the execution of the command went. In general, a zero exit status means that the command executed without problem, but a non-zero exit status means there was some form of problem. Fish stores the exit status of the last process in the last job to exit in the `status` variable. If fish encounters a problem while executing a command, the status variable may also be set to a specific value: * 0 is generally the exit status of commands if they successfully performed the requested operation. * 1 is generally the exit status of commands if they failed to perform the requested operation. * 121 is generally the exit status of commands if they were supplied with invalid arguments. * 123 means that the command was not executed because the command name contained invalid characters. * 124 means that the command was not executed because none of the wildcards in the command produced any matches. * 125 means that while an executable with the specified name was located, the operating system could not actually execute the command. * 126 means that while a file with the specified name was located, it was not executable. * 127 means that no function, builtin or command with the given name could be located. If a process exits through a signal, the exit status will be 128 plus the number of the signal. The status can be negated with [not](cmds/not) (or `!`), which is useful in a [condition](#syntax-conditional). This turns a status of 0 into 1 and any non-zero status into 0. There is also `$pipestatus`, which is a list of all `status` values of processes in a pipe. One difference is that [not](cmds/not) applies to `$status`, but not `$pipestatus`, because it loses information. For example: ``` not cat file | grep -q fish echo status is: $status pipestatus is $pipestatus ``` Here `$status` reflects the status of `grep`, which returns 0 if it found something, negated with `not` (so 1 if it found something, 0 otherwise). `$pipestatus` reflects the status of `cat` (which returns non-zero for example when it couldn’t find the file) and `grep`, without the negation. So if both `cat` and `grep` succeeded, `$status` would be 1 because of the `not`, and `$pipestatus` would be 0 and 0. It’s possible for the first command to fail while the second succeeds. One common example is when the second program quits early. For example, if you have a pipeline like: ``` cat file1 file2 | head -n 50 ``` This will tell `cat` to print two files, “file1” and “file2”, one after the other, and the `head` will then only print the first 50 lines. 
In this case you might often see this combination: ``` > cat file1 file2 | head -n 50 # 50 lines of output > echo $pipestatus 141 0 ``` Here, the “141” signifies that `cat` was killed by signal number 13 (128 + 13 == 141) - a `SIGPIPE`. You can also use [`fish_kill_signal`](#envvar-fish_kill_signal) to see the signal number. This happens because `cat` was still writing when `head` closed the pipe, so `cat` received a signal that it didn’t ignore and so it died. Whether `cat` here will see a SIGPIPE depends on how long the file is and how much it writes at once, so you might see a pipestatus of “0 0”, depending on the implementation. This is a general unix issue and not specific to fish. Some shells offer a “pipefail” option that treats a pipeline as failed if any process in it failed, and this unpredictability is a big problem for that feature. ### Locale Variables The “locale” of a program is its set of language and regional settings that depend on language and cultural convention. In UNIX, these are made up of several categories. The categories are: `LANG` This is the typical environment variable for specifying a locale. A user may set this variable to express the language they speak, their region, and a character encoding. The actual values are specific to their platform, except for special values like `C` or `POSIX`. The value of LANG is used for each category unless the variable for that category was set or LC\_ALL is set. So typically you only need to set LANG. An example value might be `en_US.UTF-8` for the American version of English and the UTF-8 encoding, or `de_AT.UTF-8` for the Austrian version of German and the UTF-8 encoding. Your operating system might have a `locale` command that you can call as `locale -a` to see a list of defined locales. A UTF-8 encoding is recommended. `LC_ALL` Overrides the [`LANG`](#envvar-LANG) environment variable and the values of the other `LC_*` variables. If this is set, none of the other variables are used for anything. Usually the other variables should be used instead. Use LC\_ALL only when you need to override something. `LC_COLLATE` This determines the rules about equivalence of cases and alphabetical ordering: collation. `LC_CTYPE` This determines classification rules, like if the type of character is an alpha, digit, and so on. Most importantly, it defines the text *encoding* - which numbers map to which characters. On modern systems, this should typically be something ending in “UTF-8”. `LC_MESSAGES` `LC_MESSAGES` determines the language in which messages are displayed. `LC_MONETARY` Determines currency, how it is formatted, and the symbols used. `LC_NUMERIC` Sets the locale for formatting numbers. `LC_TIME` Sets the locale for formatting dates and times. Builtin commands ---------------- Fish includes a number of commands in the shell directly. We call these “builtins”.
These include: * Builtins that manipulate the shell state - [cd](cmds/cd) changes directory, [set](cmds/set) sets variables * Builtins for dealing with data, like [string](cmds/string) for strings and [math](cmds/math) for numbers, [count](cmds/count) for counting lines or arguments, [path](cmds/path) for dealing with path * [status](cmds/status) for asking about the shell’s status * [printf](cmds/printf) and [echo](cmds/echo) for creating output * [test](cmds/test) for checking conditions * [argparse](cmds/argparse) for parsing function arguments * [source](cmds/source) to read a script in the current shell (so changes to variables stay) and [eval](cmds/eval) to execute a string as script * [random](cmds/random) to get random numbers or pick a random element from a list * [read](cmds/read) for reading from a pipe or the terminal For a list of all builtins, use `builtin -n`. For a list of all builtins, functions and commands shipped with fish, see the [list of commands](commands#commands). The documentation is also available by using the `--help` switch. Command lookup -------------- When fish is told to run something, it goes through multiple steps to find it. If it contains a `/`, fish tries to execute the given file, from the current directory on. If it doesn’t contain a `/`, it could be a function, builtin, or external command, and so fish goes through the full lookup. In order: 1. It tries to resolve it as a [function](#syntax-function). * If the function is already known, it uses that * If there is a file of the name with a “.fish” suffix in [`fish_function_path`](#envvar-fish_function_path), it [loads that](#syntax-function-autoloading). (If there is more than one file only the first is used) * If the function is now defined it uses that 2. It tries to resolve it as a [builtin](#builtin-overview). 3. It tries to find an executable file in [`PATH`](#envvar-PATH). * If it finds a file, it tells the kernel to run it. * If the kernel knows how to run the file (e.g. via a `#!` line - `#!/bin/sh` or `#!/usr/bin/python`), it does it. * If the kernel reports that it couldn’t run it because of a missing interpreter, and the file passes a rudimentary check, fish tells `/bin/sh` to run it. If none of these work, fish runs the function [fish\_command\_not\_found](cmds/fish_command_not_found) and sets [`status`](#envvar-status) to 127. You can use [type](cmds/type) to see how fish resolved something: ``` > type --short --all echo echo is a builtin echo is /usr/bin/echo ``` Querying for user input ----------------------- Sometimes, you want to ask the user for input, for instance to confirm something. This can be done with the [read](cmds/read) builtin. Let’s make up an example. This function will [glob](#expand-wildcard) the files in all the directories it gets as [arguments](#variables-argv), and [if](#syntax-conditional) there are [more than five](cmds/test) it will ask the user if it is supposed to show them, but only if it is connected to a terminal: ``` function show_files # This will glob on all arguments. Any non-directories will be ignored. set -l files $argv/* # If there are more than 5 files if test (count $files) -gt 5 # and both stdin (for reading input) and stdout (for writing the prompt) # are terminals and isatty stdin and isatty stdout # Keep asking until we get a valid response while read --nchars 1 -l response --prompt-str="Are you sure? 
(y/n)" or return 1 # if the read was aborted with ctrl-c/ctrl-d switch $response case y Y echo Okay # We break out of the while and go on with the function break case n N # We return from the function without printing echo Not showing return 1 case '*' # We go through the while loop and ask again echo Not valid input continue end end end # And now we print the files printf '%s\n' $files end ``` If you run this as `show_files /`, it will most likely ask you until you press Y/y or N/n. If you run this as `show_files / | cat`, it will print the files without asking. If you run this as `show_files .`, it might just print something without asking because there are fewer than five files. Shell variable and function names --------------------------------- The names given to variables and functions (so-called “identifiers”) have to follow certain rules: * A variable name cannot be empty. It can contain only letters, digits, and underscores. It may begin and end with any of those characters. * A function name cannot be empty. It may not begin with a hyphen (“-”) and may not contain a slash (“/”). All other characters, including a space, are valid. A function name also can’t be the same as a reserved keyword or essential builtin like `if` or `set`. * A bind mode name (e.g., `bind -m abc ...`) must be a valid variable name. Other things have other restrictions. For instance, what is allowed for file names depends on your system, but at the very least they cannot contain a “/” (because that is the path separator) or NULL byte (because that is how UNIX ends strings). Configuration files ------------------- When fish is started, it reads and runs its configuration files. Where these are depends on build configuration and environment variables. The main file is `~/.config/fish/config.fish` (or more precisely `$XDG_CONFIG_HOME/fish/config.fish`). Configuration files are run in the following order: * Configuration snippets (named `*.fish`) in the directories: + `$__fish_config_dir/conf.d` (by default, `~/.config/fish/conf.d/`) + `$__fish_sysconf_dir/conf.d` (by default, `/etc/fish/conf.d/`) + Directories for others to ship configuration snippets for their software. Fish searches the directories under `$__fish_user_data_dir` (usually `~/.local/share/fish`, controlled by the `XDG_DATA_HOME` environment variable) and in the `XDG_DATA_DIRS` environment variable for a `fish/vendor_conf.d` directory; if not defined, the default value of `XDG_DATA_DIRS` is `/usr/share/fish/vendor_conf.d` and `/usr/local/share/fish/vendor_conf.d`, unless your distribution customized this. If there are multiple files with the same name in these directories, only the first will be executed. They are executed in order of their filename, sorted (like globs) in a natural order (i.e. “01” sorts before “2”). * System-wide configuration files, where administrators can include initialization for all users on the system - similar to `/etc/profile` for POSIX-style shells - in `$__fish_sysconf_dir` (usually `/etc/fish/config.fish`). * User configuration, usually in `~/.config/fish/config.fish` (controlled by the `XDG_CONFIG_HOME` environment variable, and accessible as `$__fish_config_dir`). `~/.config/fish/config.fish` is sourced *after* the snippets. This is so you can copy snippets and override some of their behavior. These files are all executed on the startup of every shell.
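For example, a minimal configuration snippet might look like this (the file name and its contents are illustrative, not something shipped with fish):

```
# ~/.config/fish/conf.d/99-local.fish (hypothetical snippet)
# Snippets run in filename order and before ~/.config/fish/config.fish,
# so config.fish can still override anything set here.
set -gx EDITOR vim
abbr -a gco git checkout
```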
If you want to run a command only on starting an interactive shell, use the exit status of the command `status --is-interactive` to determine if the shell is interactive. If you want to run a command only when using a login shell, use `status --is-login` instead. This will speed up the starting of non-interactive or non-login shells. If you are developing another program, you may want to add configuration for all users of fish on a system. This is discouraged; if not carefully written, such configuration snippets may have side-effects or slow the startup of the shell. Additionally, users of other shells won’t benefit from the fish-specific configuration. However, if they are required, you can install them to the “vendor” configuration directory. As this path may vary from system to system, `pkg-config` should be used to discover it: `pkg-config --variable confdir fish`. Future feature flags -------------------- Feature flags are how fish stages changes that might break scripts. Breaking changes are introduced as opt-in, in a few releases they become opt-out, and eventually the old behavior is removed. You can see the current list of features via `status features`: ``` > status features stderr-nocaret on 3.0 ^ no longer redirects stderr qmark-noglob off 3.0 ? no longer globs regex-easyesc on 3.1 string replace -r needs fewer \\'s ampersand-nobg-in-token on 3.4 & only backgrounds if followed by a separating character ``` Here is what they mean: * `stderr-nocaret` was introduced in fish 3.0 (and made the default in 3.3). It makes `^` an ordinary character instead of denoting an stderr redirection, to make dealing with quoting and such easier. Use `2>` instead. This can no longer be turned off since fish 3.5. The flag can still be tested for compatibility, but a `no-stderr-nocaret` value will simply be ignored. * `qmark-noglob` was also introduced in fish 3.0. It makes `?` an ordinary character instead of a single-character glob. Use a `*` instead (which will match multiple characters) or find other ways to match files like `find`. * `regex-easyesc` was introduced in 3.1. It makes it so the replacement expression in `string replace -r` does one fewer round of escaping. Before, to escape a backslash you would have to use `string replace -ra '([ab])' '\\\\\\\\$1'`. After, just `'\\\\$1'` is enough. Check your `string replace` calls if you use this anywhere. * `ampersand-nobg-in-token` was introduced in fish 3.4. It makes it so a `&` is no longer interpreted as the backgrounding operator in the middle of a token, so dealing with URLs becomes easier. Either put spaces or a semicolon after the `&`. This is recommended formatting anyway, and `fish_indent` will have done it for you already. These changes are introduced off by default. They can be enabled on a per-session basis: ``` > fish --features qmark-noglob,regex-easyesc ``` or opted into globally for a user: ``` > set -U fish_features regex-easyesc qmark-noglob ``` Features will only be set on startup, so this variable will only take effect if it is universal or exported. You can also use the version as a group, so `3.0` is equivalent to “stderr-nocaret” and “qmark-noglob”. Instead of a version, the special group `all` enables all features. Prefixing a feature with `no-` turns it off instead. E.g. to reenable the `?` single-character glob: ``` set -Ua fish_features no-qmark-noglob ``` Currently, the following features are enabled by default: * stderr-nocaret - `^` no longer redirects stderr, use `2>`. Enabled by default in fish 3.3.0.
No longer changeable since fish 3.5.0. * regex-easyesc - `string replace -r` requires fewer backslashes in the replacement part. Enabled by default in fish 3.5.0. * ampersand-nobg-in-token - `&` in the middle of a word is a normal character instead of backgrounding. Enabled by default in fish 3.5.0. Event handlers -------------- When defining a new function in fish, it is possible to make it into an event handler, i.e. a function that is automatically run when a specific event takes place. Events that can trigger a handler currently are: * When a signal is delivered * When a job exits * When the value of a variable is updated * When the prompt is about to be shown Example: To specify a signal handler for the WINCH signal, write: ``` function my_signal_handler --on-signal WINCH echo Got WINCH signal! end ``` Fish already has the following named events for the `--on-event` switch: * `fish_prompt` is emitted whenever a new fish prompt is about to be displayed. * `fish_preexec` is emitted right before executing an interactive command. The commandline is passed as the first parameter. Not emitted if command is empty. * `fish_posterror` is emitted right after executing a command with syntax errors. The commandline is passed as the first parameter. * `fish_postexec` is emitted right after executing an interactive command. The commandline is passed as the first parameter. Not emitted if command is empty. * `fish_exit` is emitted right before fish exits. * `fish_cancel` is emitted when a commandline is cleared. Events can be fired with the [emit](cmds/emit) command, and do not have to be defined beforehand. The names just need to match. For example: ``` function handler --on-event imdone echo generator is done $argv end function generator sleep 1 # The "imdone" is the name of the event # the rest is the arguments to pass to the handler emit imdone with $argv end ``` If there are multiple handlers for an event, they will all be run, but the order might change between fish releases, so you should not rely on it. Please note that event handlers only become active when a function is loaded, which means you need to [source](cmds/source) or execute the function explicitly instead of relying on [autoloading](#syntax-function-autoloading). One approach is to put it into your [configuration file](#configuration). For more information on how to define new event handlers, see the documentation for the [function](cmds/function) command. Debugging fish scripts ---------------------- Fish includes basic built-in debugging facilities that allow you to stop execution of a script at an arbitrary point. When this happens you are presented with an interactive prompt where you can execute any fish command to inspect or change state (there are no debug commands as such). For example, you can check or change the value of any variables using [printf](cmds/printf) and [set](cmds/set). As another example, you can run [status print-stack-trace](cmds/status) to see how the current breakpoint was reached. To resume normal execution of the script, simply type [exit](cmds/exit) or `Control`+`D`. To start a debug session simply insert the [builtin command](cmds/breakpoint) `breakpoint` at the point in a function or script where you wish to gain control, then run the function or script. Also, the default action of the `TRAP` signal is to call this builtin, meaning a running script can be actively debugged by sending it the `TRAP` signal (`kill -s TRAP <PID>`).
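For example, a minimal sketch of using `breakpoint` inside a function (the function and variable names are made up):

```
function compute_answer
    set -l answer (math "6 * 7")
    # Execution pauses here and fish presents the debug prompt.
    # Inspect state (e.g. `echo $answer` or `status print-stack-trace`),
    # then type `exit` or press Control+D to resume.
    breakpoint
    echo "The answer is $answer"
end
```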
There is limited support for interactively setting or modifying breakpoints from this debug prompt: it is possible to insert new breakpoints in (or remove old ones from) other functions by using the `funced` function to edit the definition of a function, but it is not possible to add or remove a breakpoint from the function/script currently loaded and being executed. Another way to debug script issues is to set the [`fish_trace`](#envvar-fish_trace) variable, e.g. `fish_trace=1 fish_prompt` to see which commands fish executes when running the [fish\_prompt](cmds/fish_prompt) function. If you specifically want to debug performance issues, **fish** can be run with the `--profile /path/to/profile.log` option to save a profile to the specified path. This profile log includes a breakdown of how long each step in the execution took. See [fish](cmds/fish) for more information.
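As a sketch of how profiling might be used to look at a slow prompt (the output path is arbitrary):

```
> fish --profile /tmp/prompt.prof -c 'fish_prompt'
> cat /tmp/prompt.prof   # inspect the per-step timings
```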
fish Frequently asked questions Frequently asked questions ========================== What is the equivalent to this thing from bash (or other shells)? ----------------------------------------------------------------- See [Fish for bash users](fish_for_bash_users#fish-for-bash-users) How do I set or clear an environment variable? ---------------------------------------------- Use the [set](cmds/set) command: ``` set -x key value # typically set -gx key value set -e key ``` Since fish 3.1 you can set an environment variable for just one command using the `key=value some command` syntax, like in other shells. The two lines below behave identically - unlike other shells, fish will output `value` both times: ``` key=value echo $key begin; set -lx key value; echo $key; end ``` Note that “exported” is not a [scope](language#variables-scope), but an additional bit of state. A variable can be global and exported or local and exported or even universal and exported. Typically it makes sense to make an exported variable global. How do I check whether a variable is defined? --------------------------------------------- Use `set -q var`. For example, `if set -q var; echo variable defined; end`. To check multiple variables you can combine with `and` and `or` like so: ``` if set -q var1; or set -q var2 echo either variable defined end ``` Keep in mind that a defined variable could also be empty, either by having no elements (if set like `set var`) or only empty elements (if set like `set var ""`). Read on for how to deal with those. How do I check whether a variable is not empty? ----------------------------------------------- Use `string length -q -- $var`. For example, `if string length -q -- $var; echo not empty; end`. Note that `string length` will interpret a list of multiple variables as a disjunction (meaning any/or): ``` if string length -q -- $var1 $var2 $var3 echo at least one of these variables is not empty end ``` Alternatively, use `test -n "$var"`, but remember that **the variable must be double-quoted**. For example, `if test -n "$var"; echo not empty; end`. The `test` command provides its own and (-a) and or (-o): ``` if test -n "$var1" -o -n "$var2" -o -n "$var3" echo at least one of these variables is not empty end ``` If you want to know if a variable has *no elements*, use `set -q var[1]`. Why doesn’t `set -Ux` (exported universal variables) seem to work? ------------------------------------------------------------------ A global variable of the same name already exists. Environment variables such as `EDITOR` or `TZ` can be set universally using `set -Ux`. However, if there is an environment variable already set before fish starts (such as by login scripts or system administrators), it is imported into fish as a global variable. The [variable scopes](language#variables-scope) are searched from the “inside out”, which means that local variables are checked first, followed by global variables, and finally universal variables. This means that the global value takes precedence over the universal value. To avoid this problem, consider changing the setting which fish inherits. If this is not possible, add a statement to your [configuration file](language#configuration) (usually `~/.config/fish/config.fish`): ``` set -gx EDITOR vim ``` How do I run a command every login? What’s fish’s equivalent to .bashrc or .profile? 
------------------------------------------------------------------------------------ Edit the file `~/.config/fish/config.fish` [[1]](#id2), creating it if it does not exist (Note the leading period). Unlike .bashrc and .profile, this file is always read, even in non-interactive or login shells. To do something only in interactive shells, check `status is-interactive` like: ``` if status is-interactive # use the coolbeans theme fish_config theme choose coolbeans end ``` How do I set my prompt? ----------------------- The prompt is the output of the `fish_prompt` function. Put it in `~/.config/fish/functions/fish_prompt.fish`. For example, a simple prompt is: ``` function fish_prompt set_color $fish_color_cwd echo -n (prompt_pwd) set_color normal echo -n ' > ' end ``` You can also use the Web configuration tool, [fish\_config](cmds/fish_config), to preview and choose from a gallery of sample prompts. Or you can use fish\_config from the commandline: ``` > fish_config prompt show # displays all the prompts fish ships with > fish_config prompt choose disco # loads the disco prompt in the current shell > fish_config prompt save # makes the change permanent ``` If you want to modify your existing prompt, you can use [funced](cmds/funced) and [funcsave](cmds/funcsave) like: ``` >_ funced fish_prompt # This opens up your editor (set in $EDITOR). # Modify the function, # save the file and repeat to your liking. # Once you are happy with it: >_ funcsave fish_prompt ``` This also applies to [fish\_right\_prompt](cmds/fish_right_prompt) and [fish\_mode\_prompt](cmds/fish_mode_prompt). Why does my prompt show a `[I]`? -------------------------------- That’s the [fish\_mode\_prompt](cmds/fish_mode_prompt). It is displayed by default when you’ve activated vi mode using `fish_vi_key_bindings`. If you haven’t activated vi mode on purpose, you might have installed a third-party theme or plugin that does it. If you want to change or disable this display, modify the `fish_mode_prompt` function, for instance via [funced](cmds/funced). How do I customize my syntax highlighting colors? ------------------------------------------------- Use the web configuration tool, [fish\_config](cmds/fish_config), or alter the [fish\_color family of environment variables](interactive#variables-color). You can also use `fish_config` on the commandline, like: ``` > fish_config theme show # to demonstrate all the colorschemes > fish_config theme choose coolbeans # to load the "coolbeans" theme > fish_config theme save # to make the change permanent ``` How do I change the greeting message? ------------------------------------- Change the value of the variable `fish_greeting` or create a [fish\_greeting](cmds/fish_greeting) function. For example, to remove the greeting use: ``` set -U fish_greeting ``` Or if you prefer not to use a universal variable, use: ``` set -g fish_greeting ``` in [config.fish](language#configuration). How do I run a command from history? ------------------------------------ Type some part of the command, and then hit the `↑` (up) or `↓` (down) arrow keys to navigate through history matches, or press `Control`+`R` to open the history in a searchable pager. In this pager you can press `Control`+`R` or `Control`+`S` to move to older or younger history respectively. Additional default key bindings include `Control`+`P` (up) and `Control`+`N` (down). See [Searchable command history](interactive#history-search) for more information. Why doesn’t history substitution (“!$” etc.) work? 
-------------------------------------------------- Because history substitution is an awkward interface that was invented before interactive line editing was even possible. Instead of adding this pseudo-syntax, fish opts for nice history searching and recall features. Switching requires a small change of habits: if you want to modify an old line/word, first recall it, then edit. As a special case, most of the time history substitution is used as `sudo !!`. In that case just press `Alt`+`S`, and it will recall your last commandline with `sudo` prefixed (or toggle a `sudo` prefix on the current commandline if there is anything). In general, fish’s history recall works like this: * Like other shells, the Up arrow, `↑` recalls whole lines, starting from the last executed line. A single press replaces “!!”, later presses replace “!-3” and the like. * If the line you want is far back in the history, type any part of the line and then press Up one or more times. This will filter the recalled lines to ones that include this text, and you will get to the line you want much faster. This replaces “!vi”, “!?bar.c” and the like. * `Alt`+`↑` recalls individual arguments, starting from the last argument in the last executed line. A single press replaces “!$”, later presses replace “!!:4” and such. As an alternate key binding, `Alt`+`.` can be used. * If the argument you want is far back in history (e.g. 2 lines back - that’s a lot of words!), type any part of it and then press `Alt`+`↑`. This will show only arguments containing that part and you will get what you want much faster. Try it out, this is very convenient! * If you want to reuse several arguments from the same line (“!!:3\*” and the like), consider recalling the whole line and removing what you don’t need (`Alt`+`D` and `Alt`+`Backspace` are your friends). See [documentation](interactive#editor) for more details about line editing in fish. How do I run a subcommand? The backtick doesn’t work! ----------------------------------------------------- `fish` uses parentheses for subcommands. For example: ``` for i in (ls) echo $i end ``` It also supports the familiar `$()` syntax, even in quotes. Backticks are not supported because they are discouraged even in POSIX shells. They nest poorly and are hard to tell from single quotes (`''`). My command (pkg-config) gives its output as a single long string? ----------------------------------------------------------------- Unlike other shells, fish splits command substitutions only on newlines, not spaces or tabs or the characters in $IFS. That means if you run ``` count (printf '%s ' a b c) ``` It will print `1`, because the “a b c “ is used in one piece. But if you do ``` count (printf '%s\n' a b c) ``` it will print `3`, because it gave `count` the arguments “a”, “b” and “c” separately. In the overwhelming majority of cases, splitting on spaces is unwanted, so this is an improvement. This is why you hear about problems with filenames with spaces, after all. However sometimes, especially with `pkg-config` and related tools, splitting on spaces is needed. In these cases use `string split -n " "` like: ``` g++ example_01.cpp (pkg-config --cflags --libs gtk+-2.0 | string split -n " ") ``` The `-n` is so empty elements are removed like POSIX shells would do. How do I get the exit status of a command? ------------------------------------------ Use the `$status` variable. This replaces the `$?` variable used in other shells. ``` somecommand if test $status -eq 7 echo "That's my lucky number!" 
end ``` If you are just interested in success or failure, you can run the command directly as the if-condition: ``` if somecommand echo "Command succeeded" else echo "Command failed" end ``` Or if you just want to do one command in case the first succeeded or failed, use `and` or `or`: ``` somecommand or someothercommand ``` See the [Conditions](language#syntax-conditional) and the documentation for [test](cmds/test) and [if](cmds/if) for more information. My command prints “No matches for wildcard” but works in bash ------------------------------------------------------------- In short: [quote](language#quotes) or [escape](language#escapes) the wildcard: ``` scp user@ip:/dir/"string-*" ``` When fish sees an unquoted `*`, it performs [wildcard expansion](language#expand-wildcard). That means it tries to match filenames to the given string. If the wildcard doesn’t match any files, fish prints an error instead of running the command: ``` > echo *this*does*not*exist fish: No matches for wildcard '*this*does*not*exist'. See `help expand`. echo *this*does*not*exist ^ ``` Now, bash also tries to match files in this case, but when it doesn’t find a match, it passes along the literal wildcard string instead. That means that commands like the above ``` scp user@ip:/dir/string-* ``` or ``` apt install postgres-* ``` appear to work, because most of the time the string doesn’t match and so it passes along the `string-*`, which is then interpreted by the receiving program. But it also means that these commands can stop working at any moment once a matching file is encountered (because it has been created or the command is executed in a different working directory), and to deal with that bash needs workarounds like ``` for f in ./*.mpg; do # We need to test if the file really exists because # the wildcard might have failed to match. test -f "$f" || continue mympgviewer "$f" done ``` (from <http://mywiki.wooledge.org/BashFAQ/004>) For these reasons, fish does not do this, and instead expects asterisks to be quoted or escaped if they aren’t supposed to be expanded. This is similar to bash’s “failglob” option. I accidentally entered a directory path and fish changed directory. What happened? ---------------------------------------------------------------------------------- If fish is unable to locate a command with a given name, and it starts with `.`, `/` or `~`, fish will test if a directory of that name exists. If it does, it assumes that you want to change your directory. For example, the fastest way to switch to your home directory is to simply press `~` and enter. The open command doesn’t work. ------------------------------ The `open` command uses the MIME type database and the `.desktop` files used by Gnome and KDE to identify filetypes and default actions. If at least one of these environments is installed, but the open command is not working, this probably means that the relevant files are installed in a non-standard location. Consider [asking for more help](index#more-help). Why won’t SSH/SCP/rsync connect properly when fish is my login shell? --------------------------------------------------------------------- This problem may show up as messages like “`Received message too long`”, “`open terminal failed: not a terminal`”, “`Bad packet length`”, or “`Connection refused`” with strange output in `ssh_exchange_identification` messages in the debug log. 
This usually happens because fish reads the [user configuration file](language#configuration) (`~/.config/fish/config.fish`) *always*, whether it’s in an interactive or login or non-interactive or non-login shell. This simplifies matters, but it also means when config.fish generates output, it will do that even in non-interactive shells like the one ssh/scp/rsync start when they connect. Anything in config.fish that produces output should be guarded with `status is-interactive` (or `status is-login` if you prefer): ``` if status is-interactive ... end ``` The same applies for example when you start `tmux` in config.fish without guards, which will cause a message like `sessions should be nested with care, unset $TMUX to force`. I’m getting weird graphical glitches (a staircase effect, ghost characters, cursor in the wrong position,…)? ------------------------------------------------------------------------------------------------------------ In a terminal, the application running inside it and the terminal itself need to agree on the width of characters in order to handle cursor movement. This is more important to fish than other shells because features like syntax highlighting and autosuggestions are implemented by moving the cursor. Sometimes, there is disagreement on the width. There are numerous causes and fixes for this: * It is possible the character is simply too new for your system to know - in this case you need to refrain from using it. * Fish or your terminal might not know about the character or handle it wrong - in this case fish or your terminal needs to be fixed, or you need to update to a fixed version. * The character has an “ambiguous” width and fish thinks that means a width of X while your terminal thinks it’s Y. In this case you either need to change your terminal’s configuration or set $fish\_ambiguous\_width to the correct value. * The character is an emoji and the host system only supports Unicode 8, while you are running the terminal on a system that uses Unicode >= 9. In this case set $fish\_emoji\_width to 2. This also means that a few things are unsupportable: * Non-monospace fonts - there is *no way* for fish to figure out what width a specific character has as it has no influence on the terminal’s font rendering. * Different widths for multiple ambiguous width characters - there is no way for fish to know which width you assign to each character. Uninstalling fish ----------------- If you want to uninstall fish, first make sure fish is not set as your shell. Run `chsh -s /bin/bash` if you are not sure. If you installed it with a package manager, just use that package manager’s uninstall function. If you built fish yourself, assuming you installed it to /usr/local, do this: ``` rm -Rf /usr/local/etc/fish /usr/local/share/fish ~/.config/fish rm /usr/local/share/man/man1/fish*.1 cd /usr/local/bin rm -f fish fish_indent ``` Where can I find extra tools for fish? -------------------------------------- The fish user community extends fish in unique and useful ways via scripts that aren’t always appropriate for bundling with the fish package. Typically because they solve a niche problem unlikely to appeal to a broad audience. You can find those extensions, including prompts, themes and useful functions, in various third-party repositories. 
These include: * [Fisher](https://github.com/jorgebucaran/fisher) * [Fundle](https://github.com/tuvistavie/fundle) * [Oh My Fish](https://github.com/oh-my-fish/oh-my-fish) * [Tacklebox](https://github.com/justinmayer/tacklebox) This is not an exhaustive list and the fish project has no opinion regarding the merits of the repositories listed above or the scripts found therein. fish read - read line of input into variables read - read line of input into variables ======================================== Synopsis -------- ``` read [OPTIONS] [VARIABLE ...] ``` Description ----------- `read` reads from standard input and either writes the result back to standard output (for use in command substitution), or stores the result in one or more shell variables. By default, `read` reads a single line and splits it into variables on spaces or tabs. Alternatively, a null character or a maximum number of characters can be used to terminate the input, and other delimiters can be given. Unlike other shells, there is no default variable (such as `REPLY`) for storing the result - instead, it is printed on standard output. The following options are available: **-c** *CMD* or **--command** *CMD* Sets the initial string in the interactive mode command buffer to *CMD*. **-d** or **--delimiter** *DELIMITER* Splits on *DELIMITER*. *DELIMITER* will be used as an entire string to split on, not a set of characters. **-g** or **--global** Makes the variables global. **-s** or **--silent** Masks characters written to the terminal, replacing them with asterisks. This is useful for reading things like passwords or other sensitive information. **-f** or **--function** Scopes the variable to the currently executing function. It is erased when the function ends. **-l** or **--local** Scopes the variable to the currently executing block. It is erased when the block ends. Outside of a block, this is the same as **--function**. **-n** or **--nchars** *NCHARS* Makes `read` return after reading *NCHARS* characters or the end of the line, whichever comes first. **-p** or **--prompt** *PROMPT\_CMD* Uses the output of the shell command *PROMPT\_CMD* as the prompt for the interactive mode. The default prompt command is `set_color green; echo read; set_color normal; echo "> "` **-P** or **--prompt-str** *PROMPT\_STR* Uses the *PROMPT\_STR* as the prompt for the interactive mode. It is equivalent to `echo $PROMPT_STR` and is provided solely to avoid the need to frame the prompt as a command. All special characters in the string are automatically escaped before being passed to the `echo` command. **-R** or **--right-prompt** *RIGHT\_PROMPT\_CMD* Uses the output of the shell command *RIGHT\_PROMPT\_CMD* as the right prompt for the interactive mode. There is no default right prompt command. **-S** or **--shell** Enables syntax highlighting, tab completions and command termination suitable for entering shellscript code in the interactive mode. NOTE: Prior to fish 3.0, the short opt for **--shell** was **-s**, but it has been changed for compatibility with bash’s **-s** short opt for **--silent**. **-t** or **--tokenize** Causes read to split the input into variables by the shell’s tokenization rules. This means it will honor quotes and escaping. This option is of course incompatible with other options to control splitting like **--delimiter** and does not honor [`IFS`](../language#envvar-IFS) (like fish’s tokenizer). It saves the tokens in the manner they’d be passed to commands on the commandline, so e.g. `a\ b` is stored as `a b`.
Note that currently it leaves command substitutions intact along with the parentheses. **-u** or **--unexport** Prevents the variables from being exported to child processes (default behaviour). **-U** or **--universal** Causes the specified shell variable to be made universal. **-x** or **--export** Exports the variables to child processes. **-a** or **--list** Stores the result as a list in a single variable. This option is also available as **--array** for backwards compatibility. **-z** or **--null** Marks the end of the line with the NUL character, instead of newline. This also disables interactive mode. **-L** or **--line** Reads each line into successive variables, and stops after each variable has been filled. This cannot be combined with the `--delimiter` option. Without the `--line` option, `read` reads a single line of input from standard input, breaks it into tokens, and then assigns one token to each variable specified in *VARIABLES*. If there are more tokens than variables, the complete remainder is assigned to the last variable. If no option to determine how to split like `--delimiter`, `--line` or `--tokenize` is given, the variable `IFS` is used as a list of characters to split on. Relying on the use of `IFS` is deprecated and this behaviour will be removed in future versions. The default value of `IFS` contains space, tab and newline characters. As a special case, if `IFS` is set to the empty string, each character of the input is considered a separate token. With the `--line` option, `read` reads a line of input from standard input into each provided variable, stopping when each variable has been filled. The line is not tokenized. If no variable names are provided, `read` enters a special case that simply provides redirection from standard input to standard output, useful for command substitution. For instance, the fish shell command below can be used to read data that should be provided via a command line argument from the console instead of hardcoding it in the command itself, allowing the command to both be reused as-is in various contexts with different input values and preventing possibly sensitive text from being included in the shell history: ``` mysql -uuser -p(read) ``` When running in this mode, `read` does not split the input in any way and text is redirected to standard output without any further processing or manipulation. If `-a` or `--array` is provided, only one variable name is allowed and the tokens are stored as a list in this variable. See the documentation for `set` for more details on the scoping rules for variables. When `read` reaches the end-of-file (EOF) instead of the terminator, the exit status is set to 1. Otherwise, it is set to 0. In order to protect the shell from consuming too many system resources, `read` will only consume a maximum of 100 MiB (104857600 bytes); if the terminator is not reached before this limit then *VARIABLE* is set to empty and the exit status is set to 122. This limit can be altered with the [`fish_read_limit`](../language#envvar-fish_read_limit) variable. If set to 0 (zero), the limit is removed. Example ------- `read` has a few separate uses. The following code stores the value ‘hello’ in the shell variable `foo`. 
``` echo hello|read foo ``` While this is a neat way to handle command output line-by-line: ``` printf '%s\n' line1 line2 line3 line4 | while read -l foo echo "This is another line: $foo" end ``` Delimiters given via “-d” are taken as one string: ``` echo a==b==c | read -d == -l a b c echo $a # a echo $b # b echo $c # c ``` `--tokenize` honors quotes and escaping like the shell’s argument passing: ``` echo 'a\ b' | read -t first second echo $first # outputs "a b", $second is empty echo 'a"foo bar"b (command echo wurst)*" "{a,b}' | read -lt -l a b c echo $a # outputs 'afoo bar' (without the quotes) echo $b # outputs '(command echo wurst)* {a,b}' (without the quotes) echo $c # nothing ``` For an example on interactive use, see [Querying for user input](../language#user-input).
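Another small sketch, reading a value without echoing it to the terminal (the variable name is arbitrary):

```
read --silent --prompt-str "Password: " password
echo "Read "(string length $password)" characters"
```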
fish fish_greeting - display a welcome message in interactive shells fish\_greeting - display a welcome message in interactive shells ================================================================ Synopsis -------- ``` fish_greeting ``` ``` function fish_greeting ... end ``` Description ----------- When an interactive fish starts, it executes fish\_greeting and displays its output. The default fish\_greeting is a function that prints a variable of the same name (`$fish_greeting`), so you can also just change that if you want to change the text. While you could also just put `echo` calls into config.fish, fish\_greeting takes care of only being used in interactive shells, so it won’t be used e.g. with `scp` (which executes a shell), which prevents some errors. Example ------- To just empty the text, with the default greeting function: ``` set -U fish_greeting ``` or `set -g fish_greeting` in [config.fish](../language#configuration). A simple greeting: ``` function fish_greeting echo Hello friend! echo The time is (set_color yellow; date +%T; set_color normal) and this machine is called $hostname end ``` fish string-replace - replace substrings string-replace - replace substrings =================================== Synopsis -------- ``` string replace [-a | --all] [-f | --filter] [-i | --ignore-case] [-r | --regex] [-q | --quiet] PATTERN REPLACEMENT [STRING ...] ``` Description ----------- `string replace` is similar to `string match` but replaces non-overlapping matching substrings with a replacement string and prints the result. By default, *PATTERN* is treated as a literal substring to be matched. If **-r** or **--regex** is given, *PATTERN* is interpreted as a Perl-compatible regular expression, and *REPLACEMENT* can contain C-style escape sequences like **\t** as well as references to capturing groups by number or name as *$n* or *${n}*. If you specify the **-f** or **--filter** flag then each input string is printed only if a replacement was done. This is useful where you would otherwise use this idiom: `a_cmd | string match pattern | string replace pattern new_pattern`. You can instead just write `a_cmd | string replace --filter pattern new_pattern`. Exit status: 0 if at least one replacement was performed, or 1 otherwise. Examples -------- ### Replace Literal Examples ``` >_ string replace is was 'blue is my favorite' blue was my favorite >_ string replace 3rd last 1st 2nd 3rd 1st 2nd last >_ string replace -a ' ' _ 'spaces to underscores' spaces_to_underscores ``` ### Replace Regex Examples ``` >_ string replace -r -a '[^\d.]+' ' ' '0 one two 3.14 four 5x' 0 3.14 5 >_ string replace -r '(\w+)\s+(\w+)' '$2 $1 $$' 'left right' right left $ >_ string replace -r '\s*newline\s*' '\n' 'put a newline here' put a here ``` fish case - conditionally execute a block of commands case - conditionally execute a block of commands ================================================ Synopsis -------- ``` switch VALUE [case [GLOB ...] [COMMAND ...]] end ``` Description ----------- `switch` executes one of several blocks of commands, depending on whether a specified value matches one of several values. `case` is used together with the `switch` statement in order to determine which block should be executed. Each `case` command is given one or more parameters. The first `case` command with a parameter that matches the string specified in the switch command will be evaluated. `case` parameters may contain wildcards.
These need to be escaped or quoted in order to avoid regular wildcard expansion using filenames. Note that fish does not fall through on case statements. Only the first matching case is executed. Note that command substitutions in a case statement will be evaluated even if its body is not taken. All substitutions, including command substitutions, must be performed before the value can be compared against the parameter. Example ------- Say $animal contains the name of an animal. Then this code would classify it: ``` switch $animal case cat echo evil case wolf dog human moose dolphin whale echo mammal case duck goose albatross echo bird case shark trout stingray echo fish # Note that the next case has a wildcard which is quoted case '*' echo I have no idea what a $animal is end ``` If the above code was run with `$animal` set to `whale`, the output would be `mammal`. If `$animal` was set to “banana”, it would print “I have no idea what a banana is”. fish jobs - print currently running jobs jobs - print currently running jobs =================================== Synopsis -------- ``` jobs [OPTIONS] [PID | %JOBID] ``` Description ----------- `jobs` prints a list of the currently running [jobs](../language#syntax-job-control) and their status. `jobs` accepts the following options: **-c** or **--command** Prints the command name for each process in jobs. **-g** or **--group** Only prints the group ID of each job. **-l** or **--last** Prints only the last job to be started. **-p** or **--pid** Prints the process ID for each process in all jobs. **-q** or **--query** Prints no output for evaluation of jobs by exit status only. For compatibility with old fish versions this is also **--quiet** (but this is deprecated). **-h** or **--help** Displays help about using this command. On systems that support this feature, jobs will print the CPU usage of each job since the last command was executed. The CPU usage is expressed as a percentage of full CPU activity. Note that on multiprocessor systems, the total activity may be more than 100%. Arguments of the form *PID* or *%JOBID* restrict the output to jobs with the selected process identifiers or job numbers respectively. If the output of `jobs` is redirected or if it is part of a command substitution, the column header that is usually printed is omitted, making it easier to parse. The exit status of `jobs` is `0` if there are running background jobs and `1` otherwise. Example ------- `jobs` outputs a summary of the current jobs, such as two long-running tasks in this example: ``` Job Group State Command 2 26012 running nc -l 55232 < /dev/random & 1 26011 running python tests/test_11.py & ``` fish contains - test if a word is present in a list contains - test if a word is present in a list ============================================== Synopsis -------- ``` contains [OPTIONS] KEY [VALUES ...] ``` Description ----------- `contains` tests whether the set *VALUES* contains the string *KEY*. If so, `contains` exits with code 0; if not, it exits with code 1. The following options are available: **-i** or **--index** Print the index (number of the element in the set) of the first matching element. **-h** or **--help** Displays help about using this command. Note that `contains` interprets all arguments starting with a **-** as an option to `contains`, until an **--** argument is reached. See the examples below. 
Example ------- If *animals* is a list of animals, the following will test if *animals* contains “cat”: ``` if contains cat $animals echo Your animal list is evil! end ``` This code will add some directories to [`PATH`](../language#envvar-PATH) if they aren’t yet included: ``` for i in ~/bin /usr/local/bin if not contains $i $PATH set PATH $PATH $i end end ``` While this will check if the function `hasargs` is being run with the **-q** option: ``` function hasargs if contains -- -q $argv echo '$argv contains a -q option' end end ``` The **--** here stops `contains` from treating **-q** as an option to itself. Instead it treats it as a normal string to check. fish string-join0 - join strings with zero bytes string-join0 - join strings with zero bytes =========================================== Synopsis -------- ``` string join [-q | --quiet] SEP [STRING ...] string join0 [-q | --quiet] [STRING ...] ``` Description ----------- `string join` joins its *STRING* arguments into a single string separated by *SEP*, which can be an empty string. Exit status: 0 if at least one join was performed, or 1 otherwise. If `-n` or `--no-empty` is specified, empty strings are excluded from consideration (e.g. `string join -n + a b "" c` would expand to `a+b+c` not `a+b++c`). `string join0` joins its *STRING* arguments into a single string separated by the zero byte (NUL), and adds a trailing NUL. This is most useful in conjunction with tools that accept NUL-delimited input, such as `sort -z`. Exit status: 0 if at least one join was performed, or 1 otherwise. Because Unix uses NUL as the string terminator, passing the output of `string join0` as an *argument* to a command (via a [command substitution](../language#expand-command-substitution)) won’t actually work. Fish will pass the correct bytes along, but the command won’t be able to tell where the argument ends. This is a limitation of Unix’ argument passing. Examples -------- ``` >_ seq 3 | string join ... 1...2...3 # Give a list of NUL-separated filenames to du (this is a GNU extension) >_ string join0 file1 file2 file\nwith\nmultiple\nlines | du --files0-from=- # Just put the strings together without a separator >_ string join '' a b c abc ``` fish set_color - set the terminal color set\_color - set the terminal color =================================== Synopsis -------- ``` set_color [OPTIONS] VALUE ``` Description ----------- `set_color` is used to control the color and styling of text in the terminal. *VALUE* describes that styling. *VALUE* can be a reserved color name like **red** or an RGB color value given as 3 or 6 hexadecimal digits (“F27” or “FF2277”). A special keyword **normal** resets text formatting to terminal defaults. Valid colors include: * **black**, **red**, **green**, **yellow**, **blue**, **magenta**, **cyan**, **white** * **brblack**, **brred**, **brgreen**, **bryellow**, **brblue**, **brmagenta**, **brcyan**, **brwhite** The *br*- (as in ‘bright’) forms are full-brightness variants of the 8 standard-brightness colors on many terminals. **brblack** has higher brightness than **black** - towards gray. An RGB value with three or six hex digits, such as A0FF33 or f2f can be used. Fish will choose the closest supported color. A three digit value is equivalent to specifying each digit twice; e.g., `set_color 2BC` is the same as `set_color 22BBCC`. Hexadecimal RGB values can be in lower or uppercase.
Depending on the capabilities of your terminal (and the level of support `set_color` has for it) the actual color may be approximated by a nearby matching reserved color name or `set_color` may not have an effect on color. A second color may be given as a desired fallback color. e.g. `set_color 124212 brblue` will instruct set\_color to use *brblue* if a terminal is not capable of the exact shade of grey desired. This is very useful when an 8 or 16 color terminal might otherwise not use a color. The following options are available: **-b** or **--background** *COLOR* Sets the background color. **-c** or **--print-colors** Prints the given colors or a colored list of the 16 named colors. **-o** or **--bold** Sets bold mode. **-d** or **--dim** Sets dim mode. **-i** or **--italics** Sets italics mode. **-r** or **--reverse** Sets reverse mode. **-u** or **--underline** Sets underlined mode. **-h** or **--help** Displays help about using this command. Using the **normal** keyword will reset foreground, background, and all formatting back to default. Notes ----- 1. Using the **normal** keyword will reset both background and foreground colors to whatever is the default for the terminal. 2. Setting the background color only affects subsequently written characters. Fish provides no way to set the background color for the entire terminal window. Configuring the window background color (and other attributes such as its opacity) has to be done using whatever mechanisms the terminal provides. Look for a config option. 3. Some terminals use the `--bold` escape sequence to switch to a brighter color set rather than increasing the weight of text. 4. `set_color` works by printing sequences of characters to standard output. If used in command substitution or a pipe, these characters will also be captured. This may or may not be desirable. Checking the exit status of `isatty stdout` before using `set_color` can be useful to decide not to colorize output in a script. Examples -------- ``` set_color red; echo "Roses are red" set_color blue; echo "Violets are blue" set_color 62A; echo "Eggplants are dark purple" set_color normal; echo "Normal is nice" # Resets the background too ``` Terminal Capability Detection ----------------------------- Fish uses some heuristics to determine what colors a terminal supports to avoid sending sequences that it won’t understand. In particular it will: * Enable 256 colors if [`TERM`](../language#envvar-TERM) contains “xterm”, except for known exceptions (like MacOS 10.6 Terminal.app) * Enable 24-bit (“true-color”) even if the $TERM entry only reports 256 colors. This includes modern xterm, VTE-based terminals like Gnome Terminal, Konsole and iTerm2. * Detect support for italics, dim, reverse and other modes. If terminfo reports 256 color support for a terminal, 256 color support will always be enabled. To force true-color support on or off, set `fish_term24bit` to “1” for on and 0 for off - `set -g fish_term24bit 1`. To debug color palette problems, `tput colors` may be useful to see the number of colors in terminfo for a terminal. Fish launched as `fish -d term_support` will include diagnostic messages that indicate the color support mode in use. The `set_color` command uses the terminfo database to look up how to change terminal colors on whatever terminal is in use. Some systems have old and incomplete terminfo databases, and lack color information for terminals that support it. 
Fish assumes that all terminals can use the [ANSI X3.64](<https://en.wikipedia.org/wiki/ANSI_escape_code>) escape sequences if the terminfo definition indicates a color below 16 is not supported. fish builtin - run a builtin command builtin - run a builtin command =============================== Synopsis -------- ``` builtin [OPTIONS] BUILTINNAME builtin --query BUILTINNAME ... builtin --names ``` Description ----------- `builtin` forces the shell to use a builtin command named *BUILTIN*, rather than a function or external program. The following options are available: **-n** or **--names** Lists the names of all defined builtins. **-q** or **--query** *BUILTIN* Tests if any of the specified builtins exist. If any exist, it returns 0, 1 otherwise. **-h** or **--help** Displays help about using this command. Example ------- ``` builtin jobs # executes the jobs builtin, even if a function named jobs exists ``` fish continue - skip the remainder of the current iteration of the current inner loop continue - skip the remainder of the current iteration of the current inner loop ================================================================================ Synopsis -------- ``` LOOP_CONSTRUCT; [COMMANDS ...;] continue; [COMMANDS ...;] end ``` Description ----------- `continue` skips the remainder of the current iteration of the current inner loop, such as a <for> loop or a <while> loop. It is usually added inside of a conditional block such as an <if> statement or a <switch> statement. Example ------- The following code removes all tmp files that do not contain the word smurf. ``` for i in *.tmp if grep smurf $i continue end # This "rm" is skipped over if "continue" is executed. rm $i # As is this "echo" echo $i end ``` See Also -------- * the <break> command, to stop the current inner loop fish string-sub - extract substrings string-sub - extract substrings =============================== Synopsis -------- ``` string sub [(-s | --start) START] [(-e | --end) END] [(-l | --length) LENGTH] [-q | --quiet] [STRING ...] ``` Description ----------- `string sub` prints a substring of each string argument. The start/end of the substring can be specified with **-s**/**-e** or **--start**/**--end** followed by a 1-based index value. Positive index values are relative to the start of the string and negative index values are relative to the end of the string. The default start value is 1. The length of the substring can be specified with **-l** or **--length**. If the length or end is not specified, the substring continues to the end of each STRING. Exit status: 0 if at least one substring operation was performed, 1 otherwise. **--length** is mutually exclusive with **--end**. Examples -------- ``` >_ string sub --length 2 abcde ab >_ string sub -s 2 -l 2 abcde bc >_ string sub --start=-2 abcde de >_ string sub --end=3 abcde abc >_ string sub -e -1 abcde abcd >_ string sub -s 2 -e -1 abcde bcd >_ string sub -s -3 -e -2 abcde c ``` fish false - return an unsuccessful result false - return an unsuccessful result ===================================== Synopsis -------- ``` false ``` Description ----------- `false` sets the exit status to 1. See Also -------- * <true> command * [$status](../language#variables-status) variable fish fish_command_not_found - what to do when a command wasn’t found fish\_command\_not\_found - what to do when a command wasn’t found ================================================================== Synopsis -------- ``` function fish_command_not_found ... 
end ``` Description ----------- When fish tries to execute a command and can’t find it, it invokes this function. It can print a message to tell you about it, and it often also checks for a missing package that would include the command. Fish ships multiple handlers for various operating systems and chooses from them when this function is loaded, or you can define your own. It receives the full commandline as one argument per token, so $argv[1] contains the missing command. When you leave `fish_command_not_found` undefined (e.g. by adding an empty function file) or explicitly call `__fish_default_command_not_found_handler`, fish will just print a simple error. Example ------- A simple handler: ``` function fish_command_not_found echo Did not find command $argv[1] end > flounder Did not find command flounder ``` Or the handler for OpenSUSE’s command-not-found: ``` function fish_command_not_found /usr/bin/command-not-found $argv[1] end ``` Or the simple default handler: ``` function fish_command_not_found __fish_default_command_not_found_handler $argv end ``` Backwards compatibility ----------------------- This command was introduced in fish 3.2.0. Previous versions of fish used the “fish\_command\_not\_found” [event](../language#event) instead. To define a handler that works in older versions of fish as well, define it the old way: ``` function __fish_command_not_found_handler --on-event fish_command_not_found echo COMMAND WAS NOT FOUND MY FRIEND $argv[1] end ``` in which case fish will define a `fish_command_not_found` that calls it, or define a wrapper: ``` function fish_command_not_found echo "G'day mate, could not find your command: $argv" end function __fish_command_not_found_handler --on-event fish_command_not_found fish_command_not_found $argv end ``` fish status - query fish runtime information status - query fish runtime information ======================================= Synopsis -------- ``` status status is-login status is-interactive status is-block status is-breakpoint status is-command-substitution status is-no-job-control status is-full-job-control status is-interactive-job-control status current-command status current-commandline status filename status basename status dirname status fish-path status function status line-number status stack-trace status job-control CONTROL_TYPE status features status test-feature FEATURE ``` Description ----------- With no arguments, `status` displays a summary of the current login and job control status of the shell. The following operations (subcommands) are available: **is-command-substitution**, **-c** or **--is-command-substitution** Returns 0 if fish is currently executing a command substitution. **is-block**, **-b** or **--is-block** Returns 0 if fish is currently executing a block of code. **is-breakpoint** Returns 0 if fish is currently showing a prompt in the context of a <breakpoint> command. See also the <fish_breakpoint_prompt> function. **is-interactive**, **-i** or **--is-interactive** Returns 0 if fish is interactive - that is, connected to a keyboard. **is-login**, **-l** or **--is-login** Returns 0 if fish is a login shell - that is, if fish should perform login tasks such as setting up [`PATH`](../language#envvar-PATH). **is-full-job-control** or **--is-full-job-control** Returns 0 if full job control is enabled. **is-interactive-job-control** or **--is-interactive-job-control** Returns 0 if interactive job control is enabled. **is-no-job-control** or **--is-no-job-control** Returns 0 if no job control is enabled. 
**current-command** Prints the name of the currently-running function or command, like the deprecated [`_`](../language#envvar-_) variable. **current-commandline** Prints the entirety of the currently-running commandline, inclusive of all jobs and operators. **filename**, **current-filename**, **-f** or **--current-filename** Prints the filename of the currently-running script. If the current script was called via a symlink, this will return the symlink. If the current script was received by piping into <source>, then this will return `-`. **basename** Prints just the filename of the running script, without any path components before. **dirname** Prints just the path to the running script, without the actual filename itself. This can be relative to [`PWD`](../language#envvar-PWD) (including just “.”), depending on how the script was called. This is the same as passing the filename to `dirname(3)`. It’s useful if you want to use other files in the current script’s directory or similar. **fish-path** Prints the absolute path to the currently executing instance of fish. This is a best-effort attempt and the exact output is down to what the platform gives fish. In some cases you might only get “fish”. **function** or **current-function** Prints the name of the currently called function if able, when missing displays “Not a function” (or equivalent translated string). **line-number**, **current-line-number**, **-n** or **--current-line-number** Prints the line number of the currently running script. **stack-trace**, **print-stack-trace**, **-t** or **--print-stack-trace** Prints a stack trace of all function calls on the call stack. **job-control**, **-j** or **--job-control** *CONTROL\_TYPE* Sets the job control type to *CONTROL\_TYPE*, which can be **none**, **full**, or **interactive**. **features** Lists all available feature flags. **test-feature** *FEATURE* Returns 0 when FEATURE is enabled, 1 if it is disabled, and 2 if it is not recognized. Notes ----- For backwards compatibility most subcommands can also be specified as a long or short option. For example, rather than `status is-login` you can type `status --is-login`. The flag forms are deprecated and may be removed in a future release (but not before fish 4.0). You can only specify one subcommand per invocation even if you use the flag form of the subcommand.
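For instance (a minimal sketch; the abbreviation and helper file below are purely illustrative, not part of `status` itself), `status is-interactive` is commonly used to guard interactive-only setup in config.fish, and the path subcommands let a script refer to files that live next to it:

```
# In config.fish: only run interactive conveniences in interactive shells
if status is-interactive
    abbr -a gco git checkout   # hypothetical abbreviation
end

# In a script: locate files relative to the script itself
set -l script_dir (status dirname)
source $script_dir/helpers.fish   # hypothetical helper file
```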
fish eval - evaluate the specified commands eval - evaluate the specified commands ====================================== Synopsis -------- ``` eval [COMMANDS ...] ``` Description ----------- **eval** evaluates the specified parameters as a command. If more than one parameter is specified, all parameters will be joined using a space character as a separator. If the command does not need access to stdin, consider using <source> instead. If no piping or other compound shell constructs are required, variable-expansion-as-command, as in `set cmd ls -la; $cmd`, is also an option. Example ------- The following code will call the ls command and truncate each filename to the first 12 characters. ``` set cmd ls \| cut -c 1-12 eval $cmd ``` fish help - display fish documentation help - display fish documentation ================================= Synopsis -------- ``` help [SECTION] ``` Description ----------- `help` displays the fish help documentation. If a *SECTION* is specified, the help for that command is shown. The **-h** or **--help** option displays help about using this command. If the [`BROWSER`](../language#envvar-BROWSER) environment variable is set, it will be used to display the documentation. Otherwise, fish will search for a suitable browser. To use a different browser than as described above, one can set the `fish_help_browser` variable. This variable may be set as a list, where the first element is the browser command and the rest are browser options. Example ------- `help fg` shows the documentation for the <fg> builtin. Notes ----- Most builtin commands, including this one, display their help in the terminal when given the **--help** option. fish vared - interactively edit the value of an environment variable vared - interactively edit the value of an environment variable =============================================================== Synopsis -------- ``` vared VARIABLE_NAME ``` Description ----------- `vared` is used to interactively edit the value of an environment variable. Array variables as a whole can not be edited using `vared`, but individual list elements can. The **-h** or **--help** option displays help about using this command. Example ------- `vared PATH[3]` edits the third element of the PATH list fish functions - print or erase functions functions - print or erase functions ==================================== Synopsis -------- ``` functions [-a | --all] [-n | --names] functions [-D | --details] [-v] FUNCTION functions -c OLDNAME NEWNAME functions -d DESCRIPTION FUNCTION functions [-e | -q] FUNCTION ... ``` Description ----------- `functions` prints or erases functions. The following options are available: **-a** or **--all** Lists all functions, even those whose name starts with an underscore. **-c** or **--copy** *OLDNAME* *NEWNAME* Creates a new function named *NEWNAME*, using the definition of the *OLDNAME* function. **-d** or **--description** *DESCRIPTION* Changes the description of this function. **-e** or **--erase** Causes the specified functions to be erased. This also means that it is prevented from autoloading in the current session. Use <funcsave> to remove the saved copy. **-D** or **--details** Reports the path name where the specified function is defined or could be autoloaded, `stdin` if the function was defined interactively or on the command line or by reading standard input, **-** if the function was created via <source>, and `n/a` if the function isn’t available. 
(Functions created via <alias> will return **-**, because `alias` uses `source` internally.) If the **--verbose** option is also specified then five lines are written: * the pathname as already described, * `autoloaded`, `not-autoloaded` or `n/a`, * the line number within the file or zero if not applicable, * `scope-shadowing` if the function shadows the vars in the calling function (the normal case if it wasn’t defined with **--no-scope-shadowing**), else `no-scope-shadowing`, or `n/a` if the function isn’t defined, * the function description minimally escaped so it is a single line, or `n/a` if the function isn’t defined or has no description. You should not assume that only five lines will be written since we may add additional information to the output in the future. **--no-details** Turns off function path reporting, so just the definition will be printed. **-n** or **--names** Lists the names of all defined functions. **-q** or **--query** Tests if the specified functions exist. **-v** or **--verbose** Make some output more verbose. **-H** or **--handlers** Show all event handlers. **-t** or **--handlers-type** *TYPE* Show all event handlers matching the given *TYPE*. **-h** or **--help** Displays help about using this command. The default behavior of `functions`, when called with no arguments, is to print the names of all defined functions. Unless the `-a` option is given, no functions starting with underscores are included in the output. If any non-option parameters are given, the definition of the specified functions are printed. Copying a function using `-c` copies only the body of the function, and does not attach any event notifications from the original function. Only one function’s description can be changed in a single invocation of `functions -d`. The exit status of `functions` is the number of functions specified in the argument list that do not exist, which can be used in concert with the `-q` option. Examples -------- ``` functions -n # Displays a list of currently-defined functions functions -c foo bar # Copies the 'foo' function to a new function called 'bar' functions -e bar # Erases the function ``bar`` ``` See more -------- For more explanation of how functions fit into fish, see [Functions](../language#syntax-function). fish string-escape - escape special characters string-escape - escape special characters ========================================= Synopsis -------- ``` string escape [-n | --no-quoted] [--style=] [STRING ...] string unescape [--style=] [STRING ...] ``` Description ----------- `string escape` escapes each *STRING* in one of three ways. The first is **--style=script**. This is the default. It alters the string such that it can be passed back to `eval` to produce the original argument again. By default, all special characters are escaped, and quotes are used to simplify the output when possible. If **-n** or **--no-quoted** is given, the simplifying quoted format is not used. Exit status: 0 if at least one string was escaped, or 1 otherwise. **--style=var** ensures the string can be used as a variable name by hex encoding any non-alphanumeric characters. The string is first converted to UTF-8 before being encoded. **--style=url** ensures the string can be used as a URL by hex encoding any character which is not legal in a URL. The string is first converted to UTF-8 before being encoded. **--style=regex** escapes an input string for literal matching within a regex expression. The string is first converted to UTF-8 before being encoded. 
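For example (a sketch; `$user_input` and `$haystack` are hypothetical variables), **--style=regex** is useful when text that may contain regex metacharacters has to be matched literally inside a larger pattern:

```
# Escape the text so characters like . * ( ) match literally in the regex
set -l needle (string escape --style=regex -- $user_input)
string match -rq -- "$needle" $haystack
and echo found
```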
`string unescape` performs the inverse of the `string escape` command. If the string to be unescaped is not properly formatted it is ignored. For example, doing `string unescape --style=var (string escape --style=var $str)` will return the original string. There is no support for unescaping **--style=regex**. Examples -------- ``` >_ echo \x07 | string escape \cg >_ string escape --style=var 'a1 b2'\u6161 a1_20_b2_E6_85_A1_ ``` fish type - locate a command and describe its type type - locate a command and describe its type ============================================= Synopsis -------- ``` type [OPTIONS] NAME [...] ``` Description ----------- With no options, **type** indicates how each *NAME* would be interpreted if used as a command name. The following options are available: **-a** or **--all** Prints all of possible definitions of the specified names. **-s** or **--short** Suppresses function expansion when used with no options or with **-a**/**--all**. **-f** or **--no-functions** Suppresses function and builtin lookup. **-t** or **--type** Prints `function`, `builtin`, or `file` if *NAME* is a shell function, builtin, or disk file, respectively. **-p** or **--path** Prints the path to *NAME* if *NAME* resolves to an executable file in [`PATH`](../language#envvar-PATH), the path to the script containing the definition of the function *NAME* if *NAME* resolves to a function loaded from a file on disk (i.e. not interactively defined at the prompt), or nothing otherwise. **-P** or **--force-path** Returns the path to the executable file *NAME*, presuming *NAME* is found in the [`PATH`](../language#envvar-PATH) environment variable, or nothing otherwise. **--force-path** explicitly resolves only the path to executable files in [`PATH`](../language#envvar-PATH), regardless of whether *NAME* is shadowed by a function or builtin with the same name. **-q** or **--query** Suppresses all output; this is useful when testing the exit status. For compatibility with old fish versions this is also **--quiet**. **-h** or **--help** Displays help about using this command. The **-q**, **-p**, **-t** and **-P** flags (and their long flag aliases) are mutually exclusive. Only one can be specified at a time. `type` returns 0 if at least one entry was found, 1 otherwise, and 2 for invalid options or option combinations. Example ------- ``` >_ type fg fg is a builtin ``` fish nextd - move forward through directory history nextd - move forward through directory history ============================================== Synopsis -------- ``` nextd [-l | --list] [POS] ``` Description ----------- `nextd` moves forwards *POS* positions in the [history of visited directories](../interactive#directory-history); if the end of the history has been hit, a warning is printed. If the **-l** or **--list** option is specified, the current directory history is also displayed. The **-h** or **--help** option displays help about using this command. Note that the `cd` command limits directory history to the 25 most recently visited directories. The history is stored in the `dirprev` and `dirnext` variables which this command manipulates. 
Example ------- ``` cd /usr/src # Working directory is now /usr/src cd /usr/src/fish-shell # Working directory is now /usr/src/fish-shell prevd # Working directory is now /usr/src nextd # Working directory is now /usr/src/fish-shell ``` See Also -------- * the <cdh> command to display a prompt to quickly navigate the history * the <dirh> command to print the directory history * the <prevd> command to move backward fish argparse - parse options passed to a fish script or function argparse - parse options passed to a fish script or function ============================================================ Synopsis -------- ``` argparse [OPTIONS] OPTION_SPEC ... -- [ARG ...] ``` Description ----------- This command makes it easy for fish scripts and functions to handle arguments. You pass arguments that define the known options, followed by a literal **--**, then the arguments to be parsed (which might also include a literal **--**). `argparse` then sets variables to indicate the passed options with their values, and sets `$argv` to the remaining arguments. See the [usage](#cmd-argparse-usage) section below. Each option specification (`OPTION_SPEC`) is written in the [domain specific language](#cmd-argparse-option-specification) described below. All OPTION\_SPECs must appear after any argparse flags and before the `--` that separates them from the arguments to be parsed. Each option that is seen in the ARG list will result in variables named `_flag_X`, where **X** is the short flag letter and the long flag name (if they are defined). For example a **--help** option could cause argparse to define one variable called `_flag_h` and another called `_flag_help`. The variables will be set with local scope (i.e., as if the script had done `set -l _flag_X`). If the flag is a boolean (that is, it just is passed or not, it doesn’t have a value) the values are the short and long flags seen. If the option is not a boolean the values will be zero or more values corresponding to the values collected when the ARG list is processed. If the flag was not seen the flag variable will not be set. Options ------- The following `argparse` options are available. They must appear before all *OPTION\_SPEC*s: **-n** or **--name** The command name for use in error messages. By default the current function name will be used, or `argparse` if run outside of a function. **-x** or **--exclusive** *OPTIONS* A comma separated list of options that are mutually exclusive. You can use this more than once to define multiple sets of mutually exclusive options. **-N** or **--min-args** *NUMBER* The minimum number of acceptable non-option arguments. The default is zero. **-X** or **--max-args** *NUMBER* The maximum number of acceptable non-option arguments. The default is infinity. **-i** or **--ignore-unknown** Ignores unknown options, keeping them and their arguments in $argv instead. **-s** or **--stop-nonopt** Causes scanning the arguments to stop as soon as the first non-option argument is seen. Among other things, this is useful to implement subcommands that have their own options. **-h** or **--help** Displays help about using this command. Usage ----- To use this command, pass the option specifications (**OPTION\_SPEC**), a mandatory **--**, and then the arguments to be parsed. A simple example: ``` argparse --name=my_function 'h/help' 'n/name=' -- $argv or return ``` If `$argv` is empty then there is nothing to parse and `argparse` returns zero to indicate success. 
If `$argv` is not empty then it is checked for flags `-h`, `--help`, `-n` and `--name`. If they are found they are removed from the arguments and local variables called `_flag_OPTION` are set so the script can determine which options were seen. If `$argv` doesn’t have any errors, like a missing mandatory value for an option, then `argparse` exits with a status of zero. Otherwise it writes appropriate error messages to stderr and exits with a status of one. The `or return` means that the function returns `argparse`’s status if it failed, so if it goes on `argparse` succeeded. The `--` argument is required. You do not have to include any option specifications or arguments after the `--` but you must include the `--`. For example, this is acceptable: ``` set -l argv foo argparse 'h/help' 'n/name' -- $argv argparse --min-args=1 -- $argv ``` But this is not: ``` set -l argv argparse 'h/help' 'n/name' $argv ``` The first `--` seen is what allows the `argparse` command to reliably separate the option specifications and options to `argparse` itself (like `--ignore-unknown`) from the command arguments, so it is required. Option Specifications --------------------- Each option specification consists of: * An optional alphanumeric short flag character, followed by a `/` if the short flag can be used by someone invoking your command or, for backwards compatibility, a `-` if it should not be exposed as a valid short flag (in which case it will also not be exposed as a flag variable). * An optional long flag name, which if not present the short flag can be used, and if that is also not present, an error is reported * Nothing if the flag is a boolean that takes no argument or is an integer flag, or + **=** if it requires a value and only the last instance of the flag is saved, or + **=?** if it takes an optional value and only the last instance of the flag is saved, or + **=+** if it requires a value and each instance of the flag is saved. * Optionally a `!` followed by fish script to validate the value. Typically this will be a function to run. If the exit status is zero the value for the flag is valid. If non-zero the value is invalid. Any error messages should be written to stdout (not stderr). See the section on [Flag Value Validation](#flag-value-validation) for more information. See the <fish_opt> command for a friendlier but more verbose way to create option specifications. If a flag is not seen when parsing the arguments then the corresponding \_flag\_X var(s) will not be set. Integer flag ------------ Sometimes commands take numbers directly as options, like `foo -55`. To allow this one option spec can have the `#` modifier so that any integer will be understood as this flag, and the last number will be given as its value (as if `=` was used). The `#` must follow the short flag letter (if any), and other modifiers like `=` are not allowed, except for `-` (for backwards compatibility): ``` m#maximum ``` This does not read numbers given as `+NNN`, only those that look like flags - `-NNN`. Note: Optional arguments ------------------------ An option defined with `=?` can take optional arguments. Optional arguments have to be *directly attached* to the option they belong to. That means the argument will only be used for the option if you use it like: ``` cmd --flag=value # or cmd -fvalue ``` but not if used like: ``` cmd --flag value # "value" here will be used as a positional argument # and "--flag" won't have an argument. 
``` If this weren’t the case, using an option without an optional argument would be difficult if you also wanted to use positional arguments. For example: ``` grep --color auto # Here "auto" will be used as the search string, # "color" will not have an argument and will fall back to the default, # which also *happens to be* auto. grep --color always # Here grep will still only use color "auto"matically # and search for the string "always". ``` This isn’t specific to argparse but common to all things using `getopt(3)` (if they have optional arguments at all). That `grep` example is how GNU grep actually behaves. Flag Value Validation --------------------- Sometimes you need to validate the option values. For example, that it is a valid integer within a specific range, or an ip address, or something entirely different. You can always do this after `argparse` returns but you can also request that `argparse` perform the validation by executing arbitrary fish script. To do so simply append an `!` (exclamation-mark) then the fish script to be run. When that code is executed three vars will be defined: * `_argparse_cmd` will be set to the value of the `argparse --name` value. * `_flag_name` will be set to the short or long flag that is being processed. * `_flag_value` will be set to the value associated with the flag being processed. These variables are passed to the function as local exported variables. The script should write any error messages to stdout, not stderr. It should return a status of zero if the flag value is valid otherwise a non-zero status to indicate it is invalid. Fish ships with a `_validate_int` function that accepts a `--min` and `--max` flag. Let’s say your command accepts a `-m` or `--max` flag and the minimum allowable value is zero and the maximum is 5. You would define the option like this: `m/max=!_validate_int --min 0 --max 5`. The default if you just call `_validate_int` without those flags is to simply check that the value is a valid integer with no limits on the min or max value allowed. Example OPTION\_SPECs --------------------- Some *OPTION\_SPEC* examples: * `h/help` means that both `-h` and `--help` are valid. The flag is a boolean and can be used more than once. If either flag is used then `_flag_h` and `_flag_help` will be set to however either flag was seen, as many times as it was seen. So it could be set to `-h`, `-h` and `--help`, and `count $_flag_h` would yield “3”. * `help` means that only `--help` is valid. The flag is a boolean and can be used more than once. If it is used then `_flag_help` will be set as above. Also `h-help` (with an arbitrary short letter) for backwards compatibility. * `longonly=` is a flag `--longonly` that requires a value, there is no short flag or even short flag variable. * `n/name=` means that both `-n` and `--name` are valid. It requires a value and can be used at most once. If the flag is seen then `_flag_n` and `_flag_name` will be set with the single mandatory value associated with the flag. * `n/name=?` means that both `-n` and `--name` are valid. It accepts an optional value and can be used at most once. If the flag is seen then `_flag_n` and `_flag_name` will be set with the value associated with the flag if one was provided else it will be set with no values. * `name=+` means that only `--name` is valid. It requires a value and can be used more than once. If the flag is seen then `_flag_name` will be set with the values associated with each occurrence. * `x` means that only `-x` is valid.
It is a boolean that can be used more than once. If it is seen then `_flag_x` will be set as above. * `x=`, `x=?`, and `x=+` are similar to the n/name examples above but there is no long flag alternative to the short flag `-x`. * `#max` (or `#-max`) means that flags matching the regex “^--?\d+$” are valid. When seen they are assigned to the variable `_flag_max`. This allows any valid positive or negative integer to be specified by prefixing it with a single “-”. Many commands support this idiom. For example `head -3 /a/file` to emit only the first three lines of /a/file. * `n#max` means that flags matching the regex “^--?\d+$” are valid. When seen they are assigned to the variables `_flag_n` and `_flag_max`. This allows any valid positive or negative integer to be specified by prefixing it with a single “-”. Many commands support this idiom. For example `head -3 /a/file` to emit only the first three lines of /a/file. You can also specify the value using either flag: `-n NNN` or `--max NNN` in this example. * `#longonly` causes the last integer option to be stored in `_flag_longonly`. After parsing the arguments the `argv` variable is set with local scope to any values not already consumed during flag processing. If there are no unbound values the variable is set but `count $argv` will be zero. If an error occurs during argparse processing it will exit with a non-zero status and print error messages to stderr. Limitations ----------- One limitation with **--ignore-unknown** is that, if an unknown option is given in a group with known options, the entire group will be kept in $argv. `argparse` will not do any permutations here. For instance: ``` argparse --ignore-unknown h -- -ho echo $_flag_h # is -h, because -h was given echo $argv # is still -ho ``` This limitation may be lifted in future. Additionally, it can only parse known options up to the first unknown option in the group - the unknown option could take options, so it isn’t clear what any character after an unknown option means.
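Putting the pieces above together, a small function using `argparse` might look like the following sketch (the `backup` function, its flags, and the `test -d` validation are illustrative, not part of `argparse` itself):

```
function backup --description 'Copy files into a destination directory'
    # -v/--verbose is a boolean flag; -d/--dest takes a value and is validated
    argparse --name=backup --min-args=1 v/verbose 'd/dest=!test -d "$_flag_value"' -- $argv
    or return

    set -l dest ~/backups              # hypothetical default destination
    set -ql _flag_dest; and set dest $_flag_dest

    for file in $argv
        set -ql _flag_verbose; and echo "copying $file to $dest"
        cp -- $file $dest
    end
end
```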
fish else - execute command if a condition is not met else - execute command if a condition is not met ================================================ Synopsis -------- ``` if CONDITION; COMMANDS_TRUE ...; [else; COMMANDS_FALSE ...;] end ``` Description ----------- <if> will execute the command *CONDITION\**. If the condition’s exit status is 0, the commands *COMMANDS\_TRUE* will execute. If it is not 0 and **else** is given, *COMMANDS\_FALSE* will be executed. Example ------- The following code tests whether a file *foo.txt* exists as a regular file. ``` if test -f foo.txt echo foo.txt exists else echo foo.txt does not exist end ``` fish string-shorten - shorten strings to a width, with an ellipsis string-shorten - shorten strings to a width, with an ellipsis ============================================================= Synopsis -------- ``` string shorten [(-c | --char) CHARS] [(-m | --max) INTEGER] [-N | --no-newline] [-l | --left] [-q | --quiet] [STRING ...] ``` Description ----------- `string shorten` truncates each *STRING* to the given visible width and adds an ellipsis to indicate it. “Visible width” means the width of all visible characters added together, excluding escape sequences and accounting for [`fish_emoji_width`](../language#envvar-fish_emoji_width) and [`fish_ambiguous_width`](../language#envvar-fish_ambiguous_width). It is the amount of columns in a terminal the *STRING* occupies. The escape sequences reflect what fish knows about, and how it computes its output. Your terminal might support more escapes, or not support escape sequences that fish knows about. If **-m** or **--max** is given, truncate at the given width. Otherwise, the lowest non-zero width of all input strings is used. A max of 0 means no shortening takes place, all STRINGs are printed as-is. If **-N** or **--no-newline** is given, only the first line (or last line with **--left**) of each STRING is used, and an ellipsis is added if it was multiline. This only works for STRINGs being given as arguments, multiple lines given on stdin will be interpreted as separate STRINGs instead. If **-c** or **--char** is given, add *CHAR* instead of an ellipsis. This can also be empty or more than one character. If **-l** or **--left** is given, remove text from the left on instead, so this prints the longest *suffix* of the string that fits. With **--no-newline**, this will take from the last line instead of the first. If **-q** or **--quiet** is given, `string shorten` only runs for the return value - if anything would be shortened, it returns 0, else 1. The default ellipsis is `…`. If fish thinks your system is incapable because of your locale, it will use `...` instead. The return value is 0 if any shortening occured, 1 otherwise. Examples -------- ``` >_ string shorten foo foobar # No width was given, we infer, and "foo" is the shortest. foo fo… >_ string shorten --char="..." foo foobar # The target width is 3 because of "foo", # and our ellipsis is 3 too, so we can't really show anything. # This is the default ellipsis if your locale doesn't allow "…". foo ... >_ string shorten --char="" --max 4 abcdef 123456 # Leaving the char empty makes us not add an ellipsis # So this truncates at 4 columns: abcd 1234 >_ touch "a multiline"\n"file" >_ for file in *; string shorten -N -- $file; end # Shorten the multiline file so we only show one line per file: a multiline… >_ ss -p | string shorten -m$COLUMNS -c "" # `ss` from Linux' iproute2 shows socket information, but prints extremely long lines. 
# This shortens input so it fits on the screen without overflowing lines. >_ git branch | string match -rg '^\* (.*)' | string shorten -m20 # Take the current git branch and shorten it at 20 columns. # Here the branch is "builtin-path-with-expand" builtin-path-with-e… >_ git branch | string match -rg '^\* (.*)' | string shorten -m20 --left # Taking 20 columns from the right instead: …in-path-with-expand ``` See Also -------- * [string](string#cmd-string)’s `pad` subcommand does the inverse of this command, adding padding to a specific width instead. * The <printf> command can do simple padding, for example `printf %10s\n` works like `string pad -w10`. * [string length](string-length) with the `--visible` option can be used to show what fish thinks the width is. fish while - perform a set of commands multiple times while - perform a set of commands multiple times ================================================ Synopsis -------- ``` while CONDITION; COMMANDS; end ``` Description ----------- **while** repeatedly executes `CONDITION`, and if the exit status is 0, then executes `COMMANDS`. The exit status of the **while** loop is the exit status of the last iteration of the `COMMANDS` executed, or 0 if none were executed. (This matches other shells and is POSIX-compatible.) You can use <and> or <or> for complex conditions. Even more complex control can be achieved with `while true` containing a <break>. The **-h** or **--help** option displays help about using this command. Example ------- ``` while test -f foo.txt; or test -f bar.txt ; echo file exists; sleep 10; end # outputs 'file exists' at 10 second intervals, # as long as the file foo.txt or bar.txt exists. ``` fish alias - create a function alias - create a function ========================= Synopsis -------- ``` alias alias [--save] NAME DEFINITION alias [--save] NAME=DEFINITION ``` Description ----------- `alias` is a simple wrapper for the `function` builtin, which creates a function wrapping a command. It has similar syntax to POSIX shell `alias`. For other uses, it is recommended to define a <function>. If you want to ease your interactive use, to save typing, consider using an [abbreviation](abbr) instead. `fish` marks functions that have been created by `alias` by including the command used to create them in the function description. You can list `alias`-created functions by running `alias` without arguments. They must be erased using `functions -e`. * `NAME` is the name of the alias * `DEFINITION` is the actual command to execute. `alias` automatically appends `$argv`, so that all parameters used with the alias are passed to the actual command. You cannot create an alias to a function with the same name. Note that spaces need to be escaped in the call to `alias` just like at the command line, *even inside quoted parts*. The following options are available: **-h** or **--help** Displays help about using this command. **-s** or **--save** Saves the function created by the alias into your fish configuration directory using <funcsave>. Example ------- The following code will create `rmi`, which runs `rm` with additional arguments on every invocation. ``` alias rmi="rm -i" # This is equivalent to entering the following function: function rmi --wraps rm --description 'alias rmi=rm -i' rm -i $argv end # This needs to have the spaces escaped or "Chrome.app..." # will be seen as an argument to "/Applications/Google": alias chrome='/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome banana' ``` See more -------- 1. 
The <function> command this builds on. 2. [Functions](../language#syntax-function). 3. [Defining aliases](../language#syntax-aliases). fish trap - perform an action when the shell receives a signal trap - perform an action when the shell receives a signal ========================================================= Synopsis -------- ``` trap [OPTIONS] [[ARG] REASON ... ] ``` Description ----------- `trap` is a wrapper around the fish event delivery framework. It exists for backwards compatibility with POSIX shells. For other uses, it is recommended to define an [event handler](../language#event). The following parameters are available: *ARG* Command to be executed on signal delivery. *REASON* Name of the event to trap. For example, a signal like `INT` or `SIGINT`, or the special symbol `EXIT`. **-l** or **--list-signals** Prints a list of signal names. **-p** or **--print** Prints all defined signal handlers. **-h** or **--help** Displays help about using this command. If *ARG* and *REASON* are both specified, *ARG* is the command to be executed when the event specified by *REASON* occurs (e.g., the signal is delivered). If *ARG* is absent (and there is a single *REASON*) or `-`, each specified signal is reset to its original disposition (the value it had upon entrance to the shell). If *ARG* is the null string the signal specified by each *REASON* is ignored by the shell and by the commands it invokes. If *ARG* is not present and **-p** has been supplied, then the trap commands associated with each *REASON* are displayed. If no arguments are supplied or if only **-p** is given, `trap` prints the list of commands associated with each signal. Signal names are case insensitive and the `SIG` prefix is optional. Trapping a signal will prevent fish from exiting in response to that signal. The exit status is 1 if any *REASON* is invalid; otherwise trap returns 0. Example ------- ``` trap "status --print-stack-trace" SIGUSR1 # Prints a stack trace each time the SIGUSR1 signal is sent to the shell. ``` fish fish_indent - indenter and prettifier fish\_indent - indenter and prettifier ====================================== Synopsis -------- ``` fish_indent [OPTIONS] [FILE ...] ``` Description ----------- **fish\_indent** is used to indent a piece of fish code. **fish\_indent** reads commands from standard input or the given filenames and outputs them to standard output or a specified file (if `-w` is given). The following options are available: **-w** or **--write** Indents a specified file and immediately writes to that file. **-i** or **--no-indent** Do not indent commands; only reformat to one job per line. **-c** or **--check** Do not indent, only return 0 if the code is already indented as fish\_indent would, the number of failed files otherwise. Also print the failed filenames if not reading from standard input. **-v** or **--version** Displays the current **fish** version and then exits. **--ansi** Colorizes the output using ANSI escape sequences, appropriate for the current [`TERM`](../language#envvar-TERM), using the colors defined in the environment (such as [`fish_color_command`](../interactive#envvar-fish_color_command)). **--html** Outputs HTML, which supports syntax highlighting if the appropriate CSS is defined. The CSS class names are the same as the variable names, such as `fish_color_command`. **-d** or **--debug=DEBUG\_CATEGORIES** Enable debug output and specify a pattern for matching debug categories. See [Debugging](fish#debugging-fish) in <fish> (1) for details. 
**-o** or **--debug-output=DEBUG\_FILE** Specify a file path to receive the debug output, including categories and `fish_trace`. The default is standard error. **--dump-parse-tree** Dumps information about the parsed statements to standard error. This is likely to be of interest only to people working on the fish source code. **-h** or **--help** Displays help about using this command. fish fish_right_prompt - define the appearance of the right-side command line prompt fish\_right\_prompt - define the appearance of the right-side command line prompt ================================================================================= Synopsis -------- ``` function fish_right_prompt ... end ``` Description ----------- `fish_right_prompt` is similar to `fish_prompt`, except that it appears on the right side of the terminal window. Multiple lines are not supported in `fish_right_prompt`. Example ------- A simple right prompt: ``` function fish_right_prompt -d "Write out the right prompt" date '+%m/%d/%y' end ``` fish pushd - push directory to directory stack pushd - push directory to directory stack ========================================= Synopsis -------- ``` pushd DIRECTORY ``` Description ----------- The `pushd` function adds *DIRECTORY* to the top of the [directory stack](../interactive#directory-stack) and makes it the current working directory. <popd> will pop it off and return to the original directory. Without arguments, it exchanges the top two directories in the stack. `pushd +NUMBER` rotates the stack counter-clockwise i.e. from bottom to top `pushd -NUMBER` rotates clockwise i.e. top to bottom. The **-h** or **--help** option displays help about using this command. Example ------- ``` cd ~/dir1 pushd ~/dir2 pushd ~/dir3 # Working directory is now ~/dir3 # Directory stack contains ~/dir2 ~/dir1 pushd /tmp # Working directory is now /tmp # Directory stack contains ~/dir3 ~/dir2 ~/dir1 pushd +1 # Working directory is now ~/dir3 # Directory stack contains ~/dir2 ~/dir1 /tmp popd # Working directory is now ~/dir2 # Directory stack contains ~/dir1 /tmp ``` See Also -------- * the <dirs> command to print the directory stack * the <cdh> command which provides a more intuitive way to navigate to recently visited directories. fish fish_clipboard_copy - copy text to the system’s clipboard fish\_clipboard\_copy - copy text to the system’s clipboard =========================================================== Synopsis -------- ``` fish_clipboard_copy foo | fish_clipboard_copy ``` Description ----------- The `fish_clipboard_copy` function copies text to the system clipboard. If stdin is not a terminal (see <isatty>), it will read all input from there and copy it. If it is, it will use the current commandline, or the current selection if there is one. It is bound to `Control`+`X` by default. `fish_clipboard_copy` works by calling a system-specific backend. If it doesn’t appear to work you may need to install yours. Currently supported are: * `pbcopy` * `wl-copy` using wayland * `xsel` and `xclip` for X11 * `clip.exe` on Windows. See also -------- * [fish\_clipboard\_paste - get text from the system’s clipboard](fish_clipboard_paste) which does the inverse. fish funced - edit a function interactively funced - edit a function interactively ====================================== Synopsis -------- ``` funced [OPTIONS] NAME ``` Description ----------- `funced` provides an interface to edit the definition of the function *NAME*. 
If the `$VISUAL` environment variable is set, it will be used as the program to edit the function. If `$VISUAL` is unset but `$EDITOR` is set, that will be used. Otherwise, a built-in editor will be used. Note that to enter a literal newline using the built-in editor you should press `Alt`+`Enter`. Pressing `Enter` signals that you are done editing the function. This does not apply to an external editor like emacs or vim. `funced` will try to edit the original file that a function is defined in, which might include variable definitions or helper functions as well. If changes cannot be saved to the original file, a copy will be created in the user’s function directory. If there is no function called *NAME*, a new function will be created with the specified name. **-e command** or **--editor command** Open the function body inside the text editor given by the command (for example, **-e vi**). The special command `fish` will use the built-in editor (same as specifying **-i**). **-i** or **--interactive** Force opening the function body in the built-in editor even if `$VISUAL` or `$EDITOR` is defined. **-s** or **--save** Automatically save the function after successfully editing it. **-h** or **--help** Displays help about using this command. Example ------- Say you want to modify your prompt. Run: ``` >_ funced fish_prompt ``` This will open up your editor, allowing you to modify the function. When you’re done, save and quit. Fish will reload the function, so you should see the changes right away. When you’re done, use: ``` >_ funcsave fish_prompt ``` For more, see <funcsave>. fish string-match - match substrings string-match - match substrings =============================== Synopsis -------- ``` string match [-a | --all] [-e | --entire] [-i | --ignore-case] [-g | --groups-only] [-r | --regex] [-n | --index] [-q | --quiet] [-v | --invert] PATTERN [STRING ...] ``` Description ----------- `string match` tests each *STRING* against *PATTERN* and prints matching substrings. Only the first match for each *STRING* is reported unless **-a** or **--all** is given, in which case all matches are reported. If you specify the **-e** or **--entire** then each matching string is printed including any prefix or suffix not matched by the pattern (equivalent to `grep` without the **-o** flag). You can, obviously, achieve the same result by prepending and appending **\*** or **.\*** depending on whether or not you have specified the **--regex** flag. The **--entire** flag is simply a way to avoid having to complicate the pattern in that fashion and make the intent of the `string match` clearer. Without **--entire** and **--regex**, a *PATTERN* will need to match the entire *STRING* before it will be reported. Matching can be made case-insensitive with **--ignore-case** or **-i**. If **--groups-only** or **-g** is given, only the capturing groups will be reported - meaning the full match will be skipped. This is incompatible with **--entire** and **--invert**, and requires **--regex**. It is useful as a simple cutting tool instead of `string replace`, so you can simply choose “this part” of a string. If **--index** or **-n** is given, each match is reported as a 1-based start position and a length. By default, PATTERN is interpreted as a glob pattern matched against each entire *STRING* argument. A glob pattern is only considered a valid match if it matches the entire *STRING*. 
If **--regex** or **-r** is given, *PATTERN* is interpreted as a Perl-compatible regular expression, which does not have to match the entire *STRING*. For a regular expression containing capturing groups, multiple items will be reported for each match, one for the entire match and one for each capturing group. With this, only the matching part of the *STRING* will be reported, unless **--entire** is given. When matching via regular expressions, `string match` automatically sets variables for all named capturing groups (`(?<name>expression)`). It will create a variable with the name of the group, in the default scope, for each named capturing group, and set it to the value of the capturing group in the first matched argument. If a named capture group matched an empty string, the variable will be set to the empty string (like `set var ""`). If it did not match, the variable will be set to nothing (like `set var`). When **--regex** is used with **--all**, this behavior changes. Each named variable will contain a list of matches, with the first match contained in the first element, the second match in the second, and so on. If the group was empty or did not match, the corresponding element will be an empty string. If **--invert** or **-v** is used the selected lines will be only those which do not match the given glob pattern or regular expression. Exit status: 0 if at least one match was found, or 1 otherwise. Examples -------- ### Match Glob Examples ``` >_ string match '?' a a >_ string match 'a*b' axxb axxb >_ string match -i 'a??B' Axxb Axxb >_ echo 'ok?' | string match '*\?' ok? # Note that only the second STRING will match here. >_ string match 'foo' 'foo1' 'foo' 'foo2' foo >_ string match -e 'foo' 'foo1' 'foo' 'foo2' foo1 foo foo2 >_ string match 'foo?' 'foo1' 'foo' 'foo2' foo1 foo2 ``` ### Match Regex Examples ``` >_ string match -r 'cat|dog|fish' 'nice dog' dog >_ string match -r -v "c.*[12]" {cat,dog}(seq 1 4) dog1 dog2 cat3 dog3 cat4 dog4 >_ string match -r '(\d\d?):(\d\d):(\d\d)' 2:34:56 2:34:56 2 34 56 >_ string match -r '^(\w{2,4})\1$' papa mud murmur papa pa murmur mur >_ string match -r -a -n at ratatat 2 2 4 2 6 2 >_ string match -r -i '0x[0-9a-f]{1,8}' 'int magic = 0xBadC0de;' 0xBadC0de >_ echo $version 3.1.2-1575-ga2ff32d90 >_ string match -rq '(?<major>\d+).(?<minor>\d+).(?<revision>\d+)' -- $version >_ echo "You are using fish $major!" You are using fish 3! >_ string match -raq ' *(?<sentence>[^.!?]+)(?<punctuation>[.!?])?' "hello, friend. goodbye" >_ printf "%s\n" -- $sentence hello, friend goodbye >_ printf "%s\n" -- $punctuation . >_ string match -rq '(?<word>hello)' 'hi' >_ count $word 0 ```
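One further sketch: the quiet flag makes `string match` convenient as a test in conditionals (the pattern and messages here are just illustrative):

```
if string match -rq '^[0-9]+$' -- $argv[1]
    echo argument is a number
else
    echo expected a number
end
```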
fish fish_opt - create an option specification for the argparse command fish\_opt - create an option specification for the argparse command =================================================================== Synopsis -------- ``` fish_opt [(-slor | --multiple-vals=) OPTNAME] fish_opt --help ``` Description ----------- This command provides a way to produce option specifications suitable for use with the <argparse> command. You can, of course, write the option specifications by hand without using this command. But you might prefer to use this for the clarity it provides. The following options are available: **-s** or **--short** Takes a single letter that is used as the short flag in the option being defined. This option is mandatory. **-l** or **--long** Takes a string that is used as the long flag in the option being defined. This option is optional and has no default. If no long flag is defined then only the short flag will be allowed when parsing arguments using the option specification. **--long-only** The option being defined will only allow the long flag name to be used. The short flag name must still be defined (i.e., **--short** must be specified) but it cannot be used when parsing arguments using this option specification. **-o** or **--optional-val** The option being defined can take a value, but it is optional rather than required. If the option is seen more than once when parsing arguments, only the last value seen is saved. This means the resulting flag variable created by `argparse` will have zero elements if no value was given with the option, else it will have exactly one element. **-r** or **--required-val** The option being defined requires a value. If the option is seen more than once when parsing arguments, only the last value seen is saved. This means the resulting flag variable created by `argparse` will have exactly one element. **--multiple-vals** The option being defined requires a value each time it is seen. Each instance is stored. This means the resulting flag variable created by `argparse` will have one element for each instance of this option in the arguments. **-h** or **--help** Displays help about using this command. Examples -------- Define a single option specification for the boolean help flag: ``` set -l options (fish_opt -s h -l help) argparse $options -- $argv ``` Same as above but with a second flag that requires a value: ``` set -l options (fish_opt -s h -l help) set options $options (fish_opt -s m -l max --required-val) argparse $options -- $argv ``` Same as above but with a third flag that can be given multiple times, saving the value of each instance seen, and where only the long flag name (`--token`) can be used: ``` set -l options (fish_opt --short=h --long=help) set options $options (fish_opt --short=m --long=max --required-val) set options $options (fish_opt --short=t --long=token --multiple-vals --long-only) argparse $options -- $argv ``` fish bind - handle fish key bindings bind - handle fish key bindings =============================== Synopsis -------- ``` bind [(-M | --mode) MODE] [(-m | --sets-mode) NEW_MODE] [--preset | --user] [-s | --silent] [-k | --key] SEQUENCE COMMAND ... bind [(-M | --mode) MODE] [-k | --key] [--preset] [--user] SEQUENCE bind (-K | --key-names) [-a | --all] [--preset] [--user] bind (-f | --function-names) bind (-L | --list-modes) bind (-e | --erase) [(-M | --mode) MODE] [--preset] [--user] [-a | --all] | [-k | --key] SEQUENCE ... ``` Description ----------- `bind` manages bindings.
It can add bindings if given a SEQUENCE of characters to bind to. These should be written as [fish escape sequences](../language#escapes). The most important of these are `\c` for the control key, and `\e` for escape, and because of historical reasons also the Alt key (sometimes also called “Meta”). For example, `Alt`+`W` can be written as `\ew`, and `Control`+`X` (^X) can be written as `\cx`. Note that Alt-based key bindings are case sensitive and Control-based key bindings are not. This is a constraint of text-based terminals, not `fish`. The generic key binding that matches if no other binding does can be set by specifying a `SEQUENCE` of the empty string (that is, `''` ). For most key bindings, it makes sense to bind this to the `self-insert` function (i.e. `bind '' self-insert`). This will insert any keystrokes not specifically bound to into the editor. Non-printable characters are ignored by the editor, so this will not result in control sequences being inserted. If the `-k` switch is used, the name of a key (such as ‘down’, ‘up’ or ‘backspace’) is used instead of a sequence. The names used are the same as the corresponding curses variables, but without the ‘key\_’ prefix. (See `terminfo(5)` for more information, or use `bind --key-names` for a list of all available named keys). Normally this will print an error if the current `$TERM` entry doesn’t have a given key, unless the `-s` switch is given. To find out what sequence a key combination sends, you can use <fish_key_reader>. `COMMAND` can be any fish command, but it can also be one of a set of special input functions. These include functions for moving the cursor, operating on the kill-ring, performing tab completion, etc. Use `bind --function-names` for a complete list of these input functions. When `COMMAND` is a shellscript command, it is a good practice to put the actual code into a [function](../language#syntax-function) and simply bind to the function name. This way it becomes significantly easier to test the function while editing, and the result is usually more readable as well. Note Special input functions cannot be combined with ordinary shell script commands. The commands must be entirely a sequence of special input functions (from `bind -f`) or all shell script commands (i.e., valid fish script). To run special input functions from regular fish script, use `commandline -f` (see also <commandline>). If a script produces output, it should finish by calling `commandline -f repaint` to tell fish that a repaint is in order. If no `SEQUENCE` is provided, all bindings (or just the bindings in the given `MODE`) are printed. If `SEQUENCE` is provided but no `COMMAND`, just the binding matching that sequence is printed. To save custom key bindings, put the `bind` statements into [config.fish](../language#configuration). Alternatively, fish also automatically executes a function called `fish_user_key_bindings` if it exists. Key bindings may use “modes”, which mimics Vi’s modal input behavior. The default mode is “default”. Every key binding applies to a single mode; you can specify which one with `-M MODE`. If the key binding should change the mode, you can specify the new mode with `-m NEW_MODE`. The mode can be viewed and changed via the `$fish_bind_mode` variable. If you want to change the mode from inside a fish function, use `set fish_bind_mode MODE`. 
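For instance, a binding that is only active in insert mode and switches back to default mode when triggered might look like the following sketch (the `jk` sequence is just an illustration, not a fish default):

```
# With vi-style bindings active, typing "jk" in insert mode leaves insert
# mode: backward-char compensates for the cursor movement, and repaint-mode
# redraws the prompt so the mode indicator updates.
bind -M insert -m default jk backward-char repaint-mode
```
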
Options ------- The following options are available: **-k** or **--key** Specify a key name, such as ‘left’ or ‘backspace’ instead of a character sequence **-K** or **--key-names** Display a list of available key names. Specifying **-a** or **--all** includes keys that don’t have a known mapping **-f** or **--function-names** Display a list of available input functions **-L** or **--list-modes** Display a list of defined bind modes **-M MODE** or **--mode** *MODE* Specify a bind mode that the bind is used in. Defaults to “default” **-m NEW\_MODE** or **--sets-mode** *NEW\_MODE* Change the current mode to *NEW\_MODE* after this binding is executed **-e** or **--erase** Erase the binding with the given sequence and mode instead of defining a new one. Multiple sequences can be specified with this flag. Specifying **-a** or **--all** with **-M** or **--mode** erases all binds in the given mode regardless of sequence. Specifying **-a** or **--all** without **-M** or **--mode** erases all binds in all modes regardless of sequence. **-a** or **--all** See **--erase** and **--key-names** **--preset** and **--user** Specify if bind should operate on user or preset bindings. User bindings take precedence over preset bindings when fish looks up mappings. By default, all `bind` invocations work on the “user” level except for listing, which will show both levels. All invocations except for inserting new bindings can operate on both levels at the same time (if both **--preset** and **--user** are given). **--preset** should only be used in full binding sets (like when working on `fish_vi_key_bindings`). **-s** or **--silent** Silences some of the error messages, including for unknown key names and unbound sequences. **-h** or **--help** Displays help about using this command. Special input functions ----------------------- The following special input functions are available: `and` only execute the next function if the previous succeeded (note: only some functions report success) `accept-autosuggestion` accept the current autosuggestion `backward-char` move one character to the left. If the completion pager is active, select the previous completion instead. `backward-bigword` move one whitespace-delimited word to the left `backward-delete-char` deletes one character of input to the left of the cursor `backward-kill-bigword` move the whitespace-delimited word to the left of the cursor to the killring `backward-kill-line` move everything from the beginning of the line to the cursor to the killring `backward-kill-path-component` move one path component to the left of the cursor to the killring. A path component is everything likely to belong to a path component, i.e. not any of the following: `/={,}’":@ |;<>&`, plus newlines and tabs. `backward-kill-word` move the word to the left of the cursor to the killring. The “word” here is everything up to punctuation or whitespace. `backward-word` move one word to the left `beginning-of-buffer` moves to the beginning of the buffer, i.e. 
the start of the first line `beginning-of-history` move to the beginning of the history `beginning-of-line` move to the beginning of the line `begin-selection` start selecting text `cancel` cancel the current commandline and replace it with a new empty one `cancel-commandline` cancel the current commandline and replace it with a new empty one, leaving the old one in place with a marker to show that it was cancelled `capitalize-word` make the current word begin with a capital letter `complete` guess the remainder of the current token `complete-and-search` invoke the searchable pager on completion options (for convenience, this also moves backwards in the completion pager) `delete-char` delete one character to the right of the cursor `delete-or-exit` delete one character to the right of the cursor, or exit the shell if the commandline is empty `down-line` move down one line `downcase-word` make the current word lowercase `end-of-buffer` moves to the end of the buffer, i.e. the end of the last line `end-of-history` move to the end of the history `end-of-line` move to the end of the line `end-selection` end selecting text `expand-abbr` expands any abbreviation currently under the cursor `execute` run the current commandline `exit` exit the shell `forward-bigword` move one whitespace-delimited word to the right `forward-char` move one character to the right; or if at the end of the commandline, accept the current autosuggestion. If the completion pager is active, select the next completion instead. `forward-single-char` move one character to the right; or if at the end of the commandline, accept a single char from the current autosuggestion. `forward-word` move one word to the right; or if at the end of the commandline, accept one word from the current autosuggestion. `history-pager` invoke the searchable pager on history (incremental search); or if the history pager is already active, search further backwards in time. `history-search-backward` search the history for the previous match `history-search-forward` search the history for the next match `history-prefix-search-backward` search the history for the previous prefix match `history-prefix-search-forward` search the history for the next prefix match `history-token-search-backward` search the history for the previous matching argument `history-token-search-forward` search the history for the next matching argument `forward-jump and backward-jump` read another character and jump to its next occurrence after/before the cursor `forward-jump-till and backward-jump-till` jump to right *before* the next occurrence `repeat-jump and repeat-jump-reverse` redo the last jump in the same/opposite direction `kill-bigword` move the next whitespace-delimited word to the killring `kill-line` move everything from the cursor to the end of the line to the killring `kill-selection` move the selected text to the killring `kill-whole-line` move the line (including the following newline) to the killring. If the line is the last line, its preceding newline is also removed `kill-inner-line` move the line (without the following newline) to the killring `kill-word` move the next word to the killring `nextd-or-forward-word` if the commandline is empty, then move forward in the directory history, otherwise move one word to the right; or if at the end of the commandline, accept one word from the current autosuggestion.
`or` only execute the next function if the previous did not succeed (note: only some functions report failure) `pager-toggle-search` toggles the search field if the completions pager is visible; or if used after `history-pager`, search forwards in time. `prevd-or-backward-word` if the commandline is empty, then move backward in the directory history, otherwise move one word to the left `repaint` reexecutes the prompt functions and redraws the prompt (also `force-repaint` for backwards-compatibility) `repaint-mode` reexecutes the <fish_mode_prompt> and redraws the prompt. This is useful for vi-mode. If no `fish_mode_prompt` exists or it prints nothing, it acts like a normal repaint. `self-insert` inserts the matching sequence into the command line `self-insert-notfirst` inserts the matching sequence into the command line, unless the cursor is at the beginning `suppress-autosuggestion` remove the current autosuggestion. Returns true if there was a suggestion to remove. `swap-selection-start-stop` go to the other end of the highlighted text without changing the selection `transpose-chars` transpose two characters to the left of the cursor `transpose-words` transpose two words to the left of the cursor `togglecase-char` toggle the capitalisation (case) of the character under the cursor `togglecase-selection` toggle the capitalisation (case) of the selection `insert-line-under` add a new line under the current line `insert-line-over` add a new line over the current line `up-line` move up one line `undo and redo` revert or redo the most recent edits on the command line `upcase-word` make the current word uppercase `yank` insert the latest entry of the killring into the buffer `yank-pop` rotate to the previous entry of the killring Additional functions -------------------- The following functions are included as normal functions, but are particularly useful for input editing: `up-or-search and down-or-search` move the cursor or search the history depending on the cursor position and current mode `edit_command_buffer` open the visual editor (controlled by the `VISUAL` or `EDITOR` environment variables) with the current command-line contents `fish_clipboard_copy` copy the current selection to the system clipboard `fish_clipboard_paste` paste the current selection from the system clipboard before the cursor `fish_commandline_append` append the argument to the command-line. If the command-line already ends with the argument, this removes the suffix instead. Starts with the last command from history if the command-line is empty. `fish_commandline_prepend` prepend the argument to the command-line. If the command-line already starts with the argument, this removes the prefix instead. Starts with the last command from history if the command-line is empty. Examples -------- Exit the shell when `Control`+`D` is pressed: ``` bind \cd 'exit' ``` Perform a history search when `Page Up` is pressed: ``` bind -k ppage history-search-backward ``` Turn on [Vi key bindings](../interactive#vi-mode) and rebind `Control`+`C` to clear the input line: ``` set -g fish_key_bindings fish_vi_key_bindings bind -M insert \cc kill-whole-line repaint ``` Launch `git diff` and repaint the commandline afterwards when `Control`+`G` is pressed: ``` bind \cg 'git diff; commandline -f repaint' ``` Terminal Limitations -------------------- Unix terminals, like the ones fish operates in, are at heart 70s technology. They have some limitations that applications running inside them can’t workaround. 
For instance, the control key modifies a character by setting the top three bits to 0. This means: * Many characters + control are indistinguishable from other keys. `Control`+`I` *is* tab, `Control`+`J` *is* newline (`\n`). * Control and shift don’t work simultaneously Other keys don’t have a direct encoding, and are sent as escape sequences. For example `→` (Right) often sends `\e\[C`. These can differ from terminal to terminal, and the mapping is typically available in `terminfo(5)`. Sometimes however a terminal identifies as e.g. `xterm-256color` for compatibility, but then implements xterm’s sequences incorrectly. Special Case: The Escape Character ---------------------------------- The escape key can be used standalone, for example, to switch from insertion mode to normal mode when using Vi keybindings. Escape can also be used as a “meta” key, to indicate the start of an escape sequence, like for function or arrow keys. Custom bindings can also be defined that begin with an escape character. Holding alt and something else also typically sends escape, for example holding alt+a will send an escape character and then an “a”. fish waits for a period after receiving the escape character, to determine whether it is standalone or part of an escape sequence. While waiting, additional key presses make the escape key behave as a meta key. If no other key presses come in, it is handled as a standalone escape. The waiting period is set to 30 milliseconds (0.03 seconds). It can be configured by setting the `fish_escape_delay_ms` variable to a value between 10 and 5000 ms. This can be a universal variable that you set once from an interactive session. fish string-collect - join strings into one string-collect - join strings into one ====================================== Synopsis -------- ``` string collect [-a | --allow-empty] [-N | --no-trim-newlines] [STRING ...] ``` Description ----------- `string collect` collects its input into a single output argument, without splitting the output when used in a command substitution. This is useful when trying to collect multiline output from another command into a variable. Exit status: 0 if any output argument is non-empty, or 1 otherwise. A command like `echo (cmd | string collect)` is mostly equivalent to a quoted command substitution (`echo "$(cmd)"`). The main difference is that the former evaluates to zero or one elements whereas the quoted command substitution always evaluates to one element due to string interpolation. If invoked with multiple arguments instead of input, `string collect` preserves each argument separately, where the number of output arguments is equal to the number of arguments given to `string collect`. Any trailing newlines on the input are trimmed, just as with `"$(cmd)"` substitution. Use **--no-trim-newlines** to disable this behavior, which may be useful when running a command such as `set contents (cat filename | string collect -N)`. With **--allow-empty**, `string collect` always prints one (empty) argument. This can be used to prevent an argument from disappearing. Examples -------- ``` >_ echo "zero $(echo one\ntwo\nthree) four" zero one two three four >_ echo \"(echo one\ntwo\nthree | string collect)\" "one two three" >_ echo \"(echo one\ntwo\nthree | string collect -N)\" "one two three " >_ echo foo(true | string collect --allow-empty)bar foobar ```
fish bg - send jobs to background bg - send jobs to background ============================ Synopsis -------- ``` bg [PID ...] ``` Description ----------- `bg` sends [jobs](../language#syntax-job-control) to the background, resuming them if they are stopped. A background job is executed simultaneously with fish, and does not have access to the keyboard. If no job is specified, the last job to be used is put in the background. If `PID` is specified, the jobs containing the specified process IDs are put in the background. For compatibility with other shells, job expansion syntax is supported for `bg`. A PID of the format `%1` will be interpreted as the PID of job 1. Job numbers can be seen in the output of <jobs>. When at least one of the arguments isn’t a valid job specifier, `bg` will print an error without backgrounding anything. When all arguments are valid job specifiers, `bg` will background all matching jobs that exist. The **-h** or **--help** option displays help about using this command. Example ------- `bg 123 456 789` will background the jobs that contain processes 123, 456 and 789. If only 123 and 789 exist, it will still background them and print an error about 456. `bg 123 banana` or `bg banana 123` will complain that “banana” is not a valid job specifier. `bg %1` will background job 1. fish fish_title - define the terminal’s title fish\_title - define the terminal’s title ========================================= Synopsis -------- ``` fish_title ``` ``` function fish_title ... end ``` Description ----------- The `fish_title` function is executed before and after a new command is executed or put into the foreground and the output is used as a titlebar message. The first argument to fish\_title contains the most recently executed foreground command as a string, if any. This requires that your terminal supports programmable titles and the feature is turned on. Example ------- A simple title: ``` function fish_title set -q argv[1]; or set argv fish # Looks like ~/d/fish: git log # or /e/apt: fish echo (fish_prompt_pwd_dir_length=1 prompt_pwd): $argv; end ``` fish dirs - print directory stack dirs - print directory stack ============================ Synopsis -------- ``` dirs [-c] ``` Description ----------- `dirs` prints the current [directory stack](../interactive#directory-stack), as created by <pushd> and modified by <popd>. The following options are available: **-c**: Clear the directory stack instead of printing it. **-h** or **--help** Displays help about using this command. `dirs` does not accept any arguments. See Also -------- * the <cdh> command, which provides a more intuitive way to navigate to recently visited directories. fish fish_git_prompt - output git information for use in a prompt fish\_git\_prompt - output git information for use in a prompt ============================================================== Synopsis -------- ``` fish_git_prompt ``` ``` function fish_prompt printf '%s' $PWD (fish_git_prompt) ' $ ' end ``` Description ----------- The `fish_git_prompt` function displays information about the current git repository, if any. [Git](https://git-scm.com) must be installed. There are numerous customization options, which can be controlled with git options or fish variables. git options, where available, take precedence over the fish variable with the same function. git options can be set on a per-repository or global basis. git options can be set with the `git config` command, while fish variables can be set as usual with the <set> command. 
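For example, either of the following enables the informative display described below (the values shown are illustrative; the two forms have the same effect, with the git option taking precedence if both are set):

```
# Enable informative status for the current repository via the git option...
git config --local bash.showInformativeStatus true
# ...or enable it everywhere via the corresponding fish variable.
set -g __fish_git_prompt_show_informative_status 1
```
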
Boolean options (those which enable or disable something) understand “1”, “yes” or “true” to mean true and every other value to mean false. * `$__fish_git_prompt_show_informative_status` or the git option `bash.showInformativeStatus` can be set to 1, true or yes to enable the “informative” display, which will show a large amount of information - the number of dirty files, unpushed/unpulled commits, and more. In large repositories, this can take a lot of time, so you may wish to disable it in these repositories with `git config --local bash.showInformativeStatus false`. It also changes the characters the prompt uses to less plain ones (`✚` instead of `*` for the dirty state for example) , and if you are only interested in that, set `$__fish_git_prompt_use_informative_chars` instead. Because counting untracked files requires a lot of time, the number of untracked files is only shown if enabled via `$__fish_git_prompt_showuntrackedfiles` or the git option `bash.showUntrackedFiles`. * `$__fish_git_prompt_showdirtystate` or the git option `bash.showDirtyState` can be set to 1, true or yes to show if the repository is “dirty”, i.e. has uncommitted changes. * `$__fish_git_prompt_showuntrackedfiles` or the git option `bash.showUntrackedFiles` can be set to 1, true or yes to show if the repository has untracked files (that aren’t ignored). * `$__fish_git_prompt_showupstream` can be set to a list of values to determine how changes between HEAD and upstream are shown: `auto` summarize the difference between HEAD and its upstream `verbose` show number of commits ahead/behind (+/-) upstream `name` if verbose, then also show the upstream abbrev name `informative` similar to verbose, but shows nothing when equal - this is the default if informative status is enabled. `git` always compare HEAD to @{upstream} `svn` always compare HEAD to your SVN upstream `none` disables (useful with informative status) * `$__fish_git_prompt_showstashstate` can be set to 1, true or yes to display the state of the stash. * `$__fish_git_prompt_shorten_branch_len` can be set to the number of characters that the branch name will be shortened to. * `$__fish_git_prompt_describe_style` can be set to one of the following styles to describe the current HEAD: `contains` relative to newer annotated tag, such as `(v1.6.3.2~35)` `branch` relative to newer tag or branch, such as `(master~4)` `describe` relative to older annotated tag, such as `(v1.6.3.1-13-gdd42c2f)` `default` an exactly matching tag (`(develop)`) If none of these apply, the commit SHA shortened to 8 characters is used. * `$__fish_git_prompt_showcolorhints` can be set to 1, true or yes to enable coloring for the branch name and status symbols. A number of variables set characters and color used as indicators. Many of these have a different default if used with informative status enabled, or `$__fish_git_prompt_use_informative_chars` set. The usual default is given first, then the informative default (if it is different). If no default for the colors is given, they default to `$__fish_git_prompt_color`. 
* `$__fish_git_prompt_char_stateseparator` (’ ‘, `|`) - the character to be used between the state characters * `$__fish_git_prompt_color` (no default) * `$__fish_git_prompt_color_prefix` - the color of the `(` prefix * `$__fish_git_prompt_color_suffix` - the color of the `)` suffix * `$__fish_git_prompt_color_bare` - the color to use for a bare repository - one without a working tree * `$__fish_git_prompt_color_merging` - the color when a merge/rebase/revert/bisect or cherry-pick is in progress * `$__fish_git_prompt_char_cleanstate` (✔ in informative mode) - the character to be used when nothing else applies * `$__fish_git_prompt_color_cleanstate` (no default) Variables used with `showdirtystate`: * `$__fish_git_prompt_char_dirtystate` (`*`, ✚) - the number of “dirty” changes, i.e. unstaged files with changes * `$__fish_git_prompt_char_invalidstate` (#, ✖) - the number of “unmerged” changes, e.g. additional changes to already added files * `$__fish_git_prompt_char_stagedstate` (+, ●) - the number of staged files without additional changes * `$__fish_git_prompt_color_dirtystate` (red with showcolorhints, same as color\_flags otherwise) * `$__fish_git_prompt_color_invalidstate` * `$__fish_git_prompt_color_stagedstate` (green with showcolorhints, color\_flags otherwise) Variables used with `showstashstate`: * `$__fish_git_prompt_char_stashstate` (`$`, ⚑) * `$__fish_git_prompt_color_stashstate` (same as color\_flags) Variables used with `showuntrackedfiles`: * `$__fish_git_prompt_char_untrackedfiles` (%, …) - the symbol for untracked files * `$__fish_git_prompt_color_untrackedfiles` (same as color\_flags) Variables used with `showupstream` (also implied by informative status): * `$__fish_git_prompt_char_upstream_ahead` (>, ↑) - the character for the commits this repository is ahead of upstream * `$__fish_git_prompt_char_upstream_behind` (<, ↓) - the character for the commits this repository is behind upstream * `$__fish_git_prompt_char_upstream_diverged` (<>) - the symbol if this repository is both ahead and behind upstream * `$__fish_git_prompt_char_upstream_equal` (=) - the symbol if this repo is equal to upstream * `$__fish_git_prompt_char_upstream_prefix` (‘’) * `$__fish_git_prompt_color_upstream` Colors used with `showcolorhints`: * `$__fish_git_prompt_color_branch` (green) - the color of the branch if nothing else applies * `$__fish_git_prompt_color_branch_detached` (red) the color of the branch if it’s detached (e.g. a commit is checked out) * `$__fish_git_prompt_color_branch_dirty` (no default) the color of the branch if it’s dirty and not detached * `$__fish_git_prompt_color_branch_staged` (no default) the color of the branch if it just has something staged and is otherwise clean * `$__fish_git_prompt_color_flags` (--bold blue) - the default color for dirty/staged/stashed/untracked state Note that all colors can also have a corresponding `_done` color. For example, the contents of `$__fish_git_prompt_color_upstream_done` is printed right \_after\_ the upstream. See also <fish_vcs_prompt>, which will call all supported version control prompt functions, including git, Mercurial and Subversion. Example ------- A simple prompt that displays git info: ``` function fish_prompt # ... 
set -g __fish_git_prompt_showupstream auto printf '%s %s$' $PWD (fish_git_prompt) end ``` fish string-pad - pad strings to a fixed width string-pad - pad strings to a fixed width ========================================= Synopsis -------- ``` string pad [-r | --right] [(-c | --char) CHAR] [(-w | --width) INTEGER] [STRING ...] ``` Description ----------- `string pad` extends each *STRING* to the given visible width by adding *CHAR* to the left. That means the width of all visible characters added together, excluding escape sequences and accounting for [`fish_emoji_width`](../language#envvar-fish_emoji_width) and [`fish_ambiguous_width`](../language#envvar-fish_ambiguous_width). It is the amount of columns in a terminal the *STRING* occupies. The escape sequences reflect what fish knows about, and how it computes its output. Your terminal might support more escapes, or not support escape sequences that fish knows about. If **-r** or **--right** is given, add the padding after a string. If **-c** or **--char** is given, pad with *CHAR* instead of whitespace. The output is padded to the maximum width of all input strings. If **-w** or **--width** is given, use at least that. Examples -------- ``` >_ string pad -w 10 abc abcdef abc abcdef >_ string pad --right --char=🐟 "fish are pretty" "rich. " fish are pretty rich. 🐟🐟🐟🐟 >_ string pad -w$COLUMNS (date) # Prints the current time on the right edge of the screen. ``` See Also -------- * The <printf> command can do simple padding, for example `printf %10s\n` works like `string pad -w10`. * [string length](string-length) with the `--visible` option can be used to show what fish thinks the width is. fish prompt_pwd - print pwd suitable for prompt prompt\_pwd - print pwd suitable for prompt =========================================== Synopsis -------- ``` prompt_pwd ``` Description ----------- `prompt_pwd` is a function to print the current working directory in a way suitable for prompts. It will replace the home directory with “~” and shorten every path component but the last to a default of one character. To change the number of characters per path component, pass `--dir-length=` or set `fish_prompt_pwd_dir_length` to the number of characters. Setting it to 0 or an invalid value will disable shortening entirely. This defaults to 1. To keep some components unshortened, pass `--full-length-dirs=` or set `fish_prompt_pwd_full_dirs` to the number of components. This defaults to 1, keeping the last component. If any positional arguments are given, `prompt_pwd` shortens them instead of [`PWD`](../language#envvar-PWD). Options ------- **-d** or **--dir-length** *MAX* Causes the components to be shortened to *MAX* characters each. This overrides `fish_prompt_pwd_dir_length`. **-D** or **--full-length-dirs** *NUM* Keeps *NUM* components (counted from the right) as full length without shortening. This overrides `fish_prompt_pwd_full_dirs`. **-h** or **--help** Displays help about using this command. 
Examples -------- ``` >_ cd ~/ >_ echo $PWD /home/alfa >_ prompt_pwd ~ >_ cd /tmp/banana/sausage/with/mustard >_ prompt_pwd /t/b/s/w/mustard >_ set -g fish_prompt_pwd_dir_length 3 >_ prompt_pwd /tmp/ban/sau/wit/mustard >_ prompt_pwd --full-length-dirs=2 --dir-length=1 /t/b/s/with/mustard ``` fish fish_is_root_user - check if the current user is root fish\_is\_root\_user - check if the current user is root ======================================================== Synopsis -------- ``` fish_is_root_user ``` Description ----------- `fish_is_root_user` will check if the current user is root. It can be useful for the prompt to display something different if the user is root, for example. Example ------- A simple example: ``` function example --description 'Just an example' if fish_is_root_user do_something_different end end ``` fish count - count the number of elements of a list count - count the number of elements of a list ============================================== Synopsis -------- ``` count STRING1 STRING2 ... COMMAND | count count [...] < FILE ``` Description ----------- `count` prints the number of arguments that were passed to it, plus the number of newlines passed to it via stdin. This is usually used to find out how many elements an environment variable list contains, or how many lines there are in a text file. `count` does not accept any options, not even `-h` or `--help`. `count` exits with a non-zero exit status if no arguments were passed to it, and with zero if at least one argument was passed. Note that, like `wc -l`, reading from stdin counts newlines, so `echo -n foo | count` will print 0. Example ------- ``` count $PATH # Returns the number of directories in the users PATH variable. count *.txt # Returns the number of files in the current working directory # ending with the suffix '.txt'. git ls-files --others --exclude-standard | count # Returns the number of untracked files in a git repository printf '%s\n' foo bar | count baz # Returns 3 (2 lines from stdin plus 1 argument) count < /etc/hosts # Counts the number of entries in the hosts file ``` fish fish_breakpoint_prompt - define the prompt when stopped at a breakpoint fish\_breakpoint\_prompt - define the prompt when stopped at a breakpoint ========================================================================= Synopsis -------- ``` fish_breakpoint_prompt ``` ``` function fish_breakpoint_prompt ... end ``` Description ----------- `fish_breakpoint_prompt` is the prompt function when asking for input in response to a <breakpoint> command. The exit status of commands within `fish_breakpoint_prompt` will not modify the value of [$status](../language#variables-status) outside of the `fish_breakpoint_prompt` function. `fish` ships with a default version of this function that displays the function name and line number of the current execution context. 
Example ------- A simple prompt that is a simplified version of the default debugging prompt: ``` function fish_breakpoint_prompt -d "Write out the debug prompt" set -l function (status current-function) set -l line (status current-line-number) set -l prompt "$function:$line >" echo -ns (set_color $fish_color_status) "BP $prompt" (set_color normal) ' ' end ``` fish commandline - set or get the current command line buffer commandline - set or get the current command line buffer ======================================================== Synopsis -------- ``` commandline [OPTIONS] [CMD] ``` Description ----------- `commandline` can be used to set or get the current contents of the command line buffer. With no parameters, `commandline` returns the current value of the command line. With **CMD** specified, the command line buffer is erased and replaced with the contents of **CMD**. The following options are available: **-C** or **--cursor** Set or get the current cursor position, not the contents of the buffer. If no argument is given, the current cursor position is printed, otherwise the argument is interpreted as the new cursor position. If one of the options **-j**, **-p** or **-t** is given, the position is relative to the respective substring instead of the entire command line buffer. **-B** or **--selection-start** Get current position of the selection start in the buffer. **-E** or **--selection-end** Get current position of the selection end in the buffer. **-f** or **--function** Causes any additional arguments to be interpreted as input functions, and puts them into the queue, so that they will be read before any additional actual key presses are. This option cannot be combined with any other option. See <bind> for a list of input functions. **-h** or **--help** Displays help about using this command. The following options change the way `commandline` updates the command line buffer: **-a** or **--append** Do not remove the current commandline, append the specified string at the end of it. **-i** or **--insert** Do not remove the current commandline, insert the specified string at the current cursor position **-r** or **--replace** Remove the current commandline and replace it with the specified string (default) The following options change what part of the commandline is printed or updated: **-b** or **--current-buffer** Select the entire commandline, not including any displayed autosuggestion (default). **-j** or **--current-job** Select the current job - a **job** here is one pipeline. Stops at logical operators or terminators (**;**, **&**, and newlines). **-p** or **--current-process** Select the current process - a **process** here is one command. Stops at logical operators, terminators, and pipes. **-s** or **--current-selection** Selects the current selection **-t** or **--current-token** Selects the current token The following options change the way `commandline` prints the current commandline buffer: **-c** or **--cut-at-cursor** Only print selection up until the current cursor position. **-o** or **--tokenize** Tokenize the selection and print one string-type token per line. If `commandline` is called during a call to complete a given string using `complete -C STRING`, `commandline` will consider the specified string to be the current contents of the command line. The following options output metadata about the commandline state: **-L** or **--line** Print the line that the cursor is on, with the topmost line starting at 1. 
**-S** or **--search-mode** Evaluates to true if the commandline is performing a history search. **-P** or **--paging-mode** Evaluates to true if the commandline is showing pager contents, such as tab completions. **--paging-full-mode** Evaluates to true if the commandline is showing pager contents, such as tab completions, and all lines are shown (no “<n> more rows” message). **--is-valid** Returns true when the commandline is syntactically valid and complete. If it is, it would be executed when the `execute` bind function is called. If the commandline is incomplete, it returns 2; if erroneous, it returns 1. Example ------- `commandline -j $history[3]` replaces the job under the cursor with the third item from the command line history. If the commandline contains ``` >_ echo $flounder >&2 | less; and echo $catfish ``` (with the cursor on the “o” of “flounder”) The `echo $flounder >&2` is the first process, `less` the second, and `and echo $catfish` the third. `echo $flounder >&2 | less` is the first job, `and echo $catfish` the second. **$flounder** is the current token. More examples: ``` >_ commandline -t $flounder >_ commandline -ct $fl >_ commandline -b # or just commandline echo $flounder >&2 | less; and echo $catfish >_ commandline -p echo $flounder >&2 >_ commandline -j echo $flounder >&2 | less ```
fish command - run a program command - run a program ======================= Synopsis -------- ``` command [OPTIONS] [COMMANDNAME [ARG ...]] ``` Description ----------- **command** forces the shell to execute the program *COMMANDNAME* and ignore any functions or builtins with the same name. The following options are available: **-a** or **--all** Prints all *COMMAND* found in [`PATH`](../language#envvar-PATH), in the order found. **-q** or **--query** Silence output and print nothing, setting only exit status. Implies **--search**. For compatibility, this is also **--quiet** (deprecated). **-v** (or **-s** or **--search**) Prints the external command that would be executed, or prints nothing if no file with the specified name could be found in [`PATH`](../language#envvar-PATH). **-h** or **--help** Displays help about using this command. With the **-v** option, `command` treats every argument as a separate command to look up and sets the exit status to 0 if any of the specified commands were found, or 127 if no commands could be found. **--quiet** used with **-v** prevents commands being printed, like `type -q`. Examples -------- fish emit - emit a generic event emit - emit a generic event =========================== Synopsis -------- ``` emit EVENT_NAME [ARGUMENTS ...] ``` Description ----------- `emit` emits, or fires, an event. Events are delivered to, or caught by, special functions called [event handlers](../language#event). The arguments are passed to the event handlers as function arguments. The **--help** or **-h** option displays help about using this command. Example ------- The following code first defines an event handler for the generic event named ‘test\_event’, and then emits an event of that type. ``` function event_test --on-event test_event echo event test: $argv end emit test_event something ``` Notes ----- Note that events are only sent to the current fish process as there is no way to send events from one fish process to another. fish complete - edit command specific tab-completions complete - edit command specific tab-completions ================================================ Synopsis -------- ``` complete ((-c | --command) | (-p | --path)) COMMAND [OPTIONS] complete (-C | --do-complete) [--escape] STRING ``` Description ----------- `complete` defines, removes or lists completions for a command. For an introduction to writing your own completions, see [Writing your own completions](../completions#completion-own) in the fish manual. The following options are available: **-c** or **--command** *COMMAND* Specifies that *COMMAND* is the name of the command. If there is no **-c** or **-p**, one non-option argument will be used as the command. **-p** or **--path** *COMMAND* Specifies that *COMMAND* is the absolute path of the command (optionally containing wildcards). **-e** or **--erase** Deletes the specified completion. **-s** or **--short-option** *SHORT\_OPTION* Adds a short option to the completions list. **-l** or **--long-option** *LONG\_OPTION* Adds a GNU-style long option to the completions list. **-o** or **--old-option** *OPTION* Adds an old-style short or long option (see below for details). **-a** or **--arguments** *ARGUMENTS* Adds the specified option arguments to the completions list. **-k** or **--keep-order** Keeps the order of *ARGUMENTS* instead of sorting alphabetically. Multiple `complete` calls with **-k** result in arguments of the later ones displayed first. **-f** or **--no-files** This completion may not be followed by a filename. 
**-F** or **--force-files** This completion may be followed by a filename, even if another applicable `complete` specified **--no-files**. **-r** or **--require-parameter** This completion must have an option argument, i.e. may not be followed by another option. **-x** or **--exclusive** Short for **-r** and **-f**. **-w** or **--wraps** *WRAPPED\_COMMAND* Causes the specified command to inherit completions from *WRAPPED\_COMMAND* (see below for details). **-n** or **--condition** *CONDITION* This completion should only be used if the *CONDITION* (a shell command) returns 0. This makes it possible to specify completions that should only be used in some cases. If multiple conditions are specified, fish will try them in the order they are specified until one fails or all succeed. **-C** or **--do-complete** *STRING* Makes `complete` try to find all possible completions for the specified string. If there is no *STRING*, the current commandline is used instead. **--escape** When used with `-C`, escape special characters in completions. **-h** or **--help** Displays help about using this command. Command specific tab-completions in `fish` are based on the notion of options and arguments. An option is a parameter which begins with a hyphen, such as `-h`, `-help` or `--help`. Arguments are parameters that do not begin with a hyphen. Fish recognizes three styles of options, the same styles as the GNU getopt library. These styles are: * Short options, like `-a`. Short options are a single character long, are preceded by a single hyphen and can be grouped together (like `-la`, which is equivalent to `-l -a`). Option arguments may be specified by appending the option with the value (`-w32`), or, if `--require-parameter` is given, in the following parameter (`-w 32`). * Old-style options, long like `-Wall` or `-name` or even short like `-a`. Old-style options can be more than one character long, are preceded by a single hyphen and may not be grouped together. Option arguments are specified by default following a space (`-foo null`) or after `=` (`-foo=null`). * GNU-style long options, like `--colors`. GNU-style long options can be more than one character long, are preceded by two hyphens, and can’t be grouped together. Option arguments may be specified after a `=` (`--quoting-style=shell`), or, if `--require-parameter` is given, in the following parameter (`--quoting-style shell`). Multiple commands and paths can be given in one call to define the same completions for multiple commands. Multiple command switches and wrapped commands can also be given to define multiple completions in one call. Invoking `complete` multiple times for the same command adds the new definitions on top of any existing completions defined for the command. When `-a` or `--arguments` is specified in conjunction with long, short, or old-style options, the specified arguments are only completed as arguments for any of the specified options. If `-a` or `--arguments` is specified without any long, short, or old-style options, the specified arguments are used when completing non-option arguments to the command (except when completing an option argument that was specified with `-r` or `--require-parameter`). Command substitutions found in `ARGUMENTS` should return a newline-separated list of arguments, and each argument may optionally have a tab character followed by the argument description. Descriptions given this way override a description given with `-d` or `--description`.
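As a small sketch of that convention (the command `mycmd` and its argument values are made up for illustration):

```
# Each line produced by the command substitution is one candidate; the text
# after the tab character becomes that candidate's description in the pager.
complete -c mycmd -f -a "(printf '%s\tSpeed preset\n' slow fast)"
```
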
Descriptions given with `--description` are also used to group options given with `-s`, `-o` or `-l`. Options with the same (non-empty) description will be listed as one candidate, and one of them will be picked. If the description is empty or no description was given this is skipped. The `-w` or `--wraps` options causes the specified command to inherit completions from another command, “wrapping” the other command. The wrapping command can also have additional completions. A command can wrap multiple commands, and wrapping is transitive: if A wraps B, and B wraps C, then A automatically inherits all of C’s completions. Wrapping can be removed using the `-e` or `--erase` options. Wrapping only works for completions specified with `-c` or `--command` and are ignored when specifying completions with `-p` or `--path`. When erasing completions, it is possible to either erase all completions for a specific command by specifying `complete -c COMMAND -e`, or by specifying a specific completion option to delete. When `complete` is called without anything that would define or erase completions (options, arguments, wrapping, …), it shows matching completions instead. So `complete` without any arguments shows all loaded completions, `complete -c foo` shows all loaded completions for `foo`. Since completions are [autoloaded](../language#syntax-function-autoloading), you will have to trigger them first. Examples -------- The short-style option `-o` for the `gcc` command needs a file argument: ``` complete -c gcc -s o -r ``` The short-style option `-d` for the `grep` command requires one of `read`, `skip` or `recurse`: ``` complete -c grep -s d -x -a "read skip recurse" ``` The `su` command takes any username as an argument. Usernames are given as the first colon-separated field in the file /etc/passwd. This can be specified as: ``` complete -x -c su -d "Username" -a "(cat /etc/passwd | cut -d : -f 1)" ``` The `rpm` command has several different modes. If the `-e` or `--erase` flag has been specified, `rpm` should delete one or more packages, in which case several switches related to deleting packages are valid, like the `nodeps` switch. This can be written as: ``` complete -c rpm -n "__fish_contains_opt -s e erase" -l nodeps -d "Don't check dependencies" ``` where `__fish_contains_opt` is a function that checks the command line buffer for the presence of a specified set of options. To implement an alias, use the `-w` or `--wraps` option: ``` complete -c hub -w git ``` Now hub inherits all of the completions from git. Note this can also be specified in a function declaration (`function thing -w otherthing`). ``` complete -c git ``` Shows all completions for `git`. Any command `foo` that doesn’t support grouping multiple short options in one string (not supporting `-xf` as short for `-x -f`) or a short option and its value in one string (not supporting `-d9` instead of `-d 9`) should be specified as a single-character old-style option instead of as a short-style option; for example, `complete -c foo -o s; complete -c foo -o v` would never suggest `foo -ov` but rather `foo -o -v`. fish fish_mode_prompt - define the appearance of the mode indicator fish\_mode\_prompt - define the appearance of the mode indicator ================================================================ Synopsis -------- ``` fish_mode_prompt ``` ``` function fish_mode_prompt echo -n "$fish_bind_mode " end ``` Description ----------- The `fish_mode_prompt` function outputs the mode indicator for use in vi-mode. 
The default `fish_mode_prompt` function will output indicators about the current Vi editor mode displayed to the left of the regular prompt. Define your own function to customize the appearance of the mode indicator. The `$fish_bind_mode variable` can be used to determine the current mode. It will be one of `default`, `insert`, `replace_one`, or `visual`. You can also define an empty `fish_mode_prompt` function to remove the Vi mode indicators: ``` function fish_mode_prompt; end funcsave fish_mode_prompt ``` `fish_mode_prompt` will be executed when the vi mode changes. If it produces any output, it is displayed and used. If it does not, the other prompt functions (<fish_prompt> and <fish_right_prompt>) will be executed as well in case they contain a mode display. Example ------- ``` function fish_mode_prompt switch $fish_bind_mode case default set_color --bold red echo 'N' case insert set_color --bold green echo 'I' case replace_one set_color --bold green echo 'R' case visual set_color --bold brmagenta echo 'V' case '*' set_color --bold red echo '?' end set_color normal end ``` Outputting multiple lines is not supported in `fish_mode_prompt`. fish string-split0 - split on zero bytes string-split0 - split on zero bytes =================================== Synopsis -------- ``` string split [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] SEP [STRING ...] string split0 [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] [STRING ...] ``` Description ----------- `string split` splits each *STRING* on the separator *SEP*, which can be an empty string. If **-m** or **--max** is specified, at most MAX splits are done on each *STRING*. If **-r** or **--right** is given, splitting is performed right-to-left. This is useful in combination with **-m** or **--max**. With **-n** or **--no-empty**, empty results are excluded from consideration (e.g. `hello\n\nworld` would expand to two strings and not three). Exit status: 0 if at least one split was performed, or 1 otherwise. Use **-f** or **--fields** to print out specific fields. FIELDS is a comma-separated string of field numbers and/or spans. Each field is one-indexed, and will be printed on separate lines. If a given field does not exist, then the command exits with status 1 and does not print anything, unless **--allow-empty** is used. See also the **--delimiter** option of the <read> command. `string split0` splits each *STRING* on the zero byte (NUL). Options are the same as `string split` except that no separator is given. `split0` has the important property that its output is not further split when used in a command substitution, allowing for the command substitution to produce elements containing newlines. This is most useful when used with Unix tools that produce zero bytes, such as `find -print0` or `sort -z`. See split0 examples below. Examples -------- ``` >_ string split . example.com example com >_ string split -r -m1 / /usr/local/bin/fish /usr/local/bin fish >_ string split '' abc a b c >_ string split --allow-empty -f1,3-4,5 '' abcd a c d ``` ### NUL Delimited Examples ``` >_ # Count files in a directory, without being confused by newlines. >_ count (find . 
-print0 | string split0) 42 >_ # Sort a list of elements which may contain newlines >_ set foo beta alpha\ngamma >_ set foo (string join0 $foo | sort -z | string split0) >_ string escape $foo[1] alpha\ngamma ``` fish fg - bring job to foreground fg - bring job to foreground ============================ Synopsis -------- ``` fg [PID] ``` Description ----------- The **fg** builtin brings the specified [job](../language#syntax-job-control) to the foreground, resuming it if it is stopped. While a foreground job is executed, fish is suspended. If no job is specified, the last job to be used is put in the foreground. If `PID` is specified, the job containing a process with the specified process ID is put in the foreground. For compatibility with other shells, job expansion syntax is supported for `fg`. A *PID* of the format **%1** will foreground job 1. Job numbers can be seen in the output of <jobs>. The **--help** or **-h** option displays help about using this command. Example ------- `fg` will put the last job in the foreground. `fg %3` will put job 3 into the foreground. fish realpath - convert a path to an absolute path without symlinks realpath - convert a path to an absolute path without symlinks ============================================================== Synopsis -------- ``` realpath [OPTIONS] PATH ``` Description ----------- **realpath** follows all symbolic links encountered for the provided [`PATH`](../language#envvar-PATH), printing the absolute path resolved. <fish> provides a **realpath**-alike builtin intended to enrich systems where no such command is installed by default. If a **realpath** command exists, that will be preferred. `builtin realpath` will explicitly use the fish implementation of **realpath**. The following options are available: **-s** or **--no-symlinks** Don’t resolve symlinks, only make paths absolute, squash multiple slashes and remove trailing slashes. **-h** or **--help** Displays help about using this command. fish cdh - change to a recently visited directory cdh - change to a recently visited directory ============================================ Synopsis -------- ``` cdh [DIRECTORY] ``` Description ----------- `cdh` with no arguments presents a list of [recently visited directories](../interactive#directory-history). You can then select one of the entries by letter or number. You can also press `Tab` to use the completion pager to select an item from the list. If you give it a single argument it is equivalent to `cd DIRECTORY`. Note that the `cd` command limits directory history to the 25 most recently visited directories. The history is stored in the `dirprev` and `dirnext` variables, which this command manipulates. If you make those universal variables, your `cd` history is shared among all fish instances. See Also -------- * the <dirh> command to print the directory history * the <prevd> command to move backward * the <nextd> command to move forward fish _ - call fish’s translations \_ - call fish’s translations ============================= Synopsis -------- ``` _ STRING ``` Description ----------- `_` translates its arguments into the current language, if possible. It is equivalent to `gettext fish STRING`, meaning it can only be used to look up fish’s own translations. It requires fish to be built with gettext support. If that support is disabled, or there is no translation it will simply echo the argument back. 
The language depends on the current locale, set with [`LANG`](../language#envvar-LANG) and [`LC_MESSAGES`](../language#envvar-LC_MESSAGES). Options ------- `_` takes no options. Examples -------- ``` > _ File Datei ``` fish time - measure how long a command or block takes time - measure how long a command or block takes ================================================ Synopsis -------- ``` time COMMAND ``` Description ----------- `time` causes fish to measure how long a command takes and print the results afterwards. The command can be a simple fish command or a block. The results can not currently be redirected. For checking timing after a command has completed, check [$CMD\_DURATION](../language#variables-special). Your system most likely also has a `time` command. To use that use something like `command time`, as in `command time sleep 10`. Because it’s not inside fish, it won’t have access to fish functions and won’t be able to time blocks and such. How to interpret the output --------------------------- Time outputs a few different values. Let’s look at an example: ``` > time string repeat -n 10000000 y\n | command grep y >/dev/null ________________________________________________________ Executed in 805.98 millis fish external usr time 798.88 millis 763.88 millis 34.99 millis sys time 141.22 millis 40.20 millis 101.02 millis ``` The time after “Executed in” is what is known as the “wall-clock time”. It is simply a measure of how long it took from the start of the command until it finished. Typically it is reasonably close to [`CMD_DURATION`](../language#envvar-CMD_DURATION), except for a slight skew because the two are taken at slightly different times. The other times are all measures of CPU time. That means they measure how long the CPU was used in this part, and they count multiple cores separately. So a program with four threads using all CPU for a second will have a time of 4 seconds. The “usr” time is how much CPU time was spent inside the program itself, the “sys” time is how long was spent in the kernel on behalf of that program. The “fish” time is how much CPU was spent in fish, the “external” time how much was spent in external commands. So in this example, since `string` is a builtin, everything that `string repeat` did is accounted to fish. Any time it spends doing syscalls like `write()` is accounted for in the fish/sys time. And `grep` here is explicitly invoked as an external command, so its times will be counted in the “external” column. Note that, as in this example, the CPU times can add up to more than the execution time. This is because things can be done in parallel - `grep` can match while `string repeat` writes. Example ------- (for obvious reasons exact results will vary on your system) ``` >_ time sleep 1s ________________________________________________________ Executed in 1,01 secs fish external usr time 2,32 millis 0,00 micros 2,32 millis sys time 0,88 millis 877,00 micros 0,00 millis >_ time for i in 1 2 3; sleep 1s; end ________________________________________________________ Executed in 3,01 secs fish external usr time 9,16 millis 2,94 millis 6,23 millis sys time 0,23 millis 0,00 millis 0,23 millis ``` Inline variable assignments need to follow the `time` keyword: ``` >_ time a_moment=1.5m sleep $a_moment ________________________________________________________ Executed in 90.00 secs fish external usr time 4.62 millis 4.62 millis 0.00 millis sys time 2.35 millis 0.41 millis 1.95 millis ```
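A `begin`/`end` block can be timed as a single unit in the same way. A minimal sketch (output omitted here, since the exact figures depend entirely on your system):

```
>_ time begin; sleep 1; sleep 1; end
```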
fish pwd - output the current working directory pwd - output the current working directory ========================================== Synopsis -------- ``` pwd [-P | --physical] pwd [-L | --logical] ``` Description ----------- `pwd` outputs (prints) the current working directory. The following options are available: **-L** or **--logical** Output the logical working directory, without resolving symlinks (default behavior). **-P** or **--physical** Output the physical working directory, with symlinks resolved. **-h** or **--help** Displays help about using this command. See Also -------- Navigate directories using the [directory history](../interactive#directory-history) or the [directory stack](../interactive#directory-stack) fish string-upper - convert strings to uppercase string-upper - convert strings to uppercase =========================================== Synopsis -------- ``` string upper [-q | --quiet] [STRING ...] ``` Description ----------- `string upper` converts each string argument to uppercase. Exit status: 0 if at least one string was converted to uppercase, else 1. This means that in conjunction with the **-q** flag you can readily test whether a string is already uppercase. fish suspend - suspend the current shell suspend - suspend the current shell =================================== Synopsis -------- ``` suspend [--force] ``` Description ----------- `suspend` suspends execution of the current shell by sending it a SIGTSTP signal, returning to the controlling process. It can be resumed later by sending it a SIGCONT. In order to prevent suspending a shell that doesn’t have a controlling process, it will not suspend the shell if it is a login shell. This requirement is bypassed if the **--force** option is given or the shell is not interactive. fish funcsave - save the definition of a function to the user’s autoload directory funcsave - save the definition of a function to the user’s autoload directory ============================================================================= Synopsis -------- ``` funcsave FUNCTION_NAME funcsave [-q | --quiet] [(-d | --directory) DIR] FUNCTION_NAME ``` Description ----------- `funcsave` saves a function to a file in the fish configuration directory. This function will be [automatically loaded](../language#syntax-function-autoloading) by current and future fish sessions. This can be useful to commit functions created interactively for permanent use. If you have erased a function using <functions>’s `--erase` option, `funcsave` will remove the saved function definition. Because fish loads functions on-demand, saved functions cannot serve as [event handlers](../language#event) until they are run or otherwise sourced. To activate an event handler for every new shell, add the function to the [configuration file](../language#configuration) instead of using `funcsave`. This is often used after <funced>, which opens the function in `$EDITOR` or `$VISUAL` and loads it into the current session afterwards. fish fish_hg_prompt - output Mercurial information for use in a prompt fish\_hg\_prompt - output Mercurial information for use in a prompt =================================================================== Synopsis -------- ``` fish_hg_prompt ``` ``` function fish_prompt printf '%s' $PWD (fish_hg_prompt) ' $ ' end ``` Description ----------- The fish\_hg\_prompt function displays information about the current Mercurial repository, if any. [Mercurial](https://www.mercurial-scm.org/) (`hg`) must be installed. 
By default, only the current branch is shown because `hg status` can be slow on a large repository. You can enable a more informative prompt by setting the variable `$fish_prompt_hg_show_informative_status`, for example: ``` set --universal fish_prompt_hg_show_informative_status ``` If you enabled the informative status, there are numerous customization options, which can be controlled with fish variables. * `$fish_color_hg_clean`, `$fish_color_hg_modified` and `$fish_color_hg_dirty` are colors used when the repository has the respective status. Some colors for status symbols: * `$fish_color_hg_added` * `$fish_color_hg_renamed` * `$fish_color_hg_copied` * `$fish_color_hg_deleted` * `$fish_color_hg_untracked` * `$fish_color_hg_unmerged` The status symbols themselves: * `$fish_prompt_hg_status_added`, default ‘✚’ * `$fish_prompt_hg_status_modified`, default ‘\*’ * `$fish_prompt_hg_status_copied`, default ‘⇒’ * `$fish_prompt_hg_status_deleted`, default ‘✖’ * `$fish_prompt_hg_status_untracked`, default ‘?’ * `$fish_prompt_hg_status_unmerged`, default ‘!’ Finally, `$fish_prompt_hg_status_order`, which can be used to change the order the status symbols appear in. It defaults to `added modified copied deleted untracked unmerged`. See also <fish_vcs_prompt>, which will call all supported version control prompt functions, including git, Mercurial and Subversion. Example ------- A simple prompt that displays hg info: ``` function fish_prompt ... set -g fish_prompt_hg_show_informative_status printf '%s %s$' $PWD (fish_hg_prompt) end ``` fish string - manipulate strings string - manipulate strings =========================== Synopsis -------- ``` string collect [-a | --allow-empty] [-N | --no-trim-newlines] [STRING ...] string escape [-n | --no-quoted] [--style=] [STRING ...] string join [-q | --quiet] [-n | --no-empty] SEP [STRING ...] string join0 [-q | --quiet] [STRING ...] string length [-q | --quiet] [STRING ...] string lower [-q | --quiet] [STRING ...] string match [-a | --all] [-e | --entire] [-i | --ignore-case] [-g | --groups-only] [-r | --regex] [-n | --index] [-q | --quiet] [-v | --invert] PATTERN [STRING ...] string pad [-r | --right] [(-c | --char) CHAR] [(-w | --width) INTEGER] [STRING ...] string repeat [(-n | --count) COUNT] [(-m | --max) MAX] [-N | --no-newline] [-q | --quiet] [STRING ...] string replace [-a | --all] [-f | --filter] [-i | --ignore-case] [-r | --regex] [-q | --quiet] PATTERN REPLACE [STRING ...] string shorten [(-c | --char) CHARS] [(-m | --max) INTEGER] [-N | --no-newline] [-l | --left] [-q | --quiet] [STRING ...] string split [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] SEP [STRING ...] string split0 [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] [STRING ...] string sub [(-s | --start) START] [(-e | --end) END] [(-l | --length) LENGTH] [-q | --quiet] [STRING ...] string trim [-l | --left] [-r | --right] [(-c | --chars) CHARS] [-q | --quiet] [STRING ...] string unescape [--style=] [STRING ...] string upper [-q | --quiet] [STRING ...] ``` Description ----------- `string` performs operations on strings. *STRING* arguments are taken from the command line unless standard input is connected to a pipe or a file, in which case they are read from standard input, one *STRING* per line. It is an error to supply *STRING* arguments on the command line and on standard input. 
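For example, the following two invocations give the same result; the first passes the *STRING* arguments on the command line, the second supplies them on standard input, one per line:

```
>_ string length foo barbaz
3
6
>_ printf '%s\n' foo barbaz | string length
3
6
```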
Arguments beginning with `-` are normally interpreted as switches; `--` causes the following arguments not to be treated as switches even if they begin with `-`. Switches and required arguments are recognized only on the command line. Most subcommands accept a **-q** or **--quiet** switch, which suppresses the usual output but exits with the documented status. In this case these commands will quit early, without reading all of the available input. The following subcommands are available. “collect” subcommand -------------------- ``` string collect [-a | --allow-empty] [-N | --no-trim-newlines] [STRING ...] ``` `string collect` collects its input into a single output argument, without splitting the output when used in a command substitution. This is useful when trying to collect multiline output from another command into a variable. Exit status: 0 if any output argument is non-empty, or 1 otherwise. A command like `echo (cmd | string collect)` is mostly equivalent to a quoted command substitution (`echo "$(cmd)"`). The main difference is that the former evaluates to zero or one elements whereas the quoted command substitution always evaluates to one element due to string interpolation. If invoked with multiple arguments instead of input, `string collect` preserves each argument separately, where the number of output arguments is equal to the number of arguments given to `string collect`. Any trailing newlines on the input are trimmed, just as with `"$(cmd)"` substitution. Use **--no-trim-newlines** to disable this behavior, which may be useful when running a command such as `set contents (cat filename | string collect -N)`. With **--allow-empty**, `string collect` always prints one (empty) argument. This can be used to prevent an argument from disappearing. ### Examples ``` >_ echo "zero $(echo one\ntwo\nthree) four" zero one two three four >_ echo \"(echo one\ntwo\nthree | string collect)\" "one two three" >_ echo \"(echo one\ntwo\nthree | string collect -N)\" "one two three " >_ echo foo(true | string collect --allow-empty)bar foobar ``` “escape” and “unescape” subcommands ----------------------------------- ``` string escape [-n | --no-quoted] [--style=] [STRING ...] string unescape [--style=] [STRING ...] ``` `string escape` escapes each *STRING* in one of three ways. The first is **--style=script**. This is the default. It alters the string such that it can be passed back to `eval` to produce the original argument again. By default, all special characters are escaped, and quotes are used to simplify the output when possible. If **-n** or **--no-quoted** is given, the simplifying quoted format is not used. Exit status: 0 if at least one string was escaped, or 1 otherwise. **--style=var** ensures the string can be used as a variable name by hex encoding any non-alphanumeric characters. The string is first converted to UTF-8 before being encoded. **--style=url** ensures the string can be used as a URL by hex encoding any character which is not legal in a URL. The string is first converted to UTF-8 before being encoded. **--style=regex** escapes an input string for literal matching within a regex expression. The string is first converted to UTF-8 before being encoded. `string unescape` performs the inverse of the `string escape` command. If the string to be unescaped is not properly formatted it is ignored. For example, doing `string unescape --style=var (string escape --style=var $str)` will return the original string. There is no support for unescaping **--style=regex**. 
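As a concrete illustration of the round trip described above, escaping a string with `--style=var` and unescaping the result yields the original string again:

```
>_ set str 'a b/c'
>_ string unescape --style=var (string escape --style=var $str)
a b/c
```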
### Examples ``` >_ echo \x07 | string escape \cg >_ string escape --style=var 'a1 b2'\u6161 a1_20_b2_E6_85_A1_ ``` “join” and “join0” subcommands ------------------------------ ``` string join [-q | --quiet] SEP [STRING ...] string join0 [-q | --quiet] [STRING ...] ``` `string join` joins its *STRING* arguments into a single string separated by *SEP*, which can be an empty string. Exit status: 0 if at least one join was performed, or 1 otherwise. If `-n` or `--no-empty` is specified, empty strings are excluded from consideration (e.g. `string join -n + a b "" c` would expand to `a+b+c` not `a+b++c`). `string join0` joins its *STRING* arguments into a single string separated by the zero byte (NUL), and adds a trailing NUL. This is most useful in conjunction with tools that accept NUL-delimited input, such as `sort -z`. Exit status: 0 if at least one join was performed, or 1 otherwise. Because Unix uses NUL as the string terminator, passing the output of `string join0` as an *argument* to a command (via a [command substitution](../language#expand-command-substitution)) won’t actually work. Fish will pass the correct bytes along, but the command won’t be able to tell where the argument ends. This is a limitation of Unix’ argument passing. ### Examples ``` >_ seq 3 | string join ... 1...2...3 # Give a list of NUL-separated filenames to du (this is a GNU extension) >_ string join0 file1 file2 file\nwith\nmultiple\nlines | du --files0-from=- # Just put the strings together without a separator >_ string join '' a b c abc ``` “length” subcommand ------------------- ``` string length [-q | --quiet] [-V | --visible] [STRING ...] ``` `string length` reports the length of each string argument in characters. Exit status: 0 if at least one non-empty *STRING* was given, or 1 otherwise. With **-V** or **--visible**, it uses the visible width of the arguments. That means it will discount escape sequences fish knows about, account for $fish\_emoji\_width and $fish\_ambiguous\_width. It will also count each line (separated by `\n`) on its own, and with a carriage return (`\r`) count only the widest stretch on a line. The intent is to measure the number of columns the *STRING* would occupy in the current terminal. ### Examples ``` >_ string length 'hello, world' 12 >_ set str foo >_ string length -q $str; echo $status 0 # Equivalent to test -n "$str" >_ string length --visible (set_color red)foobar # the set_color is discounted, so this is the width of "foobar" 6 >_ string length --visible 🐟🐟🐟🐟 # depending on $fish_emoji_width, this is either 4 or 8 # in new terminals it should be 8 >_ string length --visible abcdef\r123 # this displays as "123def", so the width is 6 6 >_ string length --visible a\nbc # counts "a" and "bc" as separate lines, so it prints width for each 1 2 ``` “lower” subcommand ------------------ ``` string lower [-q | --quiet] [STRING ...] ``` `string lower` converts each string argument to lowercase. Exit status: 0 if at least one string was converted to lowercase, else 1. This means that in conjunction with the **-q** flag you can readily test whether a string is already lowercase. “match” subcommand ------------------ ``` string match [-a | --all] [-e | --entire] [-i | --ignore-case] [-g | --groups-only] [-r | --regex] [-n | --index] [-q | --quiet] [-v | --invert] PATTERN [STRING ...] ``` `string match` tests each *STRING* against *PATTERN* and prints matching substrings. 
Only the first match for each *STRING* is reported unless **-a** or **--all** is given, in which case all matches are reported. If you specify the **-e** or **--entire** then each matching string is printed including any prefix or suffix not matched by the pattern (equivalent to `grep` without the **-o** flag). You can, obviously, achieve the same result by prepending and appending **\*** or **.\*** depending on whether or not you have specified the **--regex** flag. The **--entire** flag is simply a way to avoid having to complicate the pattern in that fashion and make the intent of the `string match` clearer. Without **--entire** and **--regex**, a *PATTERN* will need to match the entire *STRING* before it will be reported. Matching can be made case-insensitive with **--ignore-case** or **-i**. If **--groups-only** or **-g** is given, only the capturing groups will be reported - meaning the full match will be skipped. This is incompatible with **--entire** and **--invert**, and requires **--regex**. It is useful as a simple cutting tool instead of `string replace`, so you can simply choose “this part” of a string. If **--index** or **-n** is given, each match is reported as a 1-based start position and a length. By default, PATTERN is interpreted as a glob pattern matched against each entire *STRING* argument. A glob pattern is only considered a valid match if it matches the entire *STRING*. If **--regex** or **-r** is given, *PATTERN* is interpreted as a Perl-compatible regular expression, which does not have to match the entire *STRING*. For a regular expression containing capturing groups, multiple items will be reported for each match, one for the entire match and one for each capturing group. With this, only the matching part of the *STRING* will be reported, unless **--entire** is given. When matching via regular expressions, `string match` automatically sets variables for all named capturing groups (`(?<name>expression)`). It will create a variable with the name of the group, in the default scope, for each named capturing group, and set it to the value of the capturing group in the first matched argument. If a named capture group matched an empty string, the variable will be set to the empty string (like `set var ""`). If it did not match, the variable will be set to nothing (like `set var`). When **--regex** is used with **--all**, this behavior changes. Each named variable will contain a list of matches, with the first match contained in the first element, the second match in the second, and so on. If the group was empty or did not match, the corresponding element will be an empty string. If **--invert** or **-v** is used the selected lines will be only those which do not match the given glob pattern or regular expression. Exit status: 0 if at least one match was found, or 1 otherwise. ### Match Glob Examples ``` >_ string match '?' a a >_ string match 'a*b' axxb axxb >_ string match -i 'a??B' Axxb Axxb >_ echo 'ok?' | string match '*\?' ok? # Note that only the second STRING will match here. >_ string match 'foo' 'foo1' 'foo' 'foo2' foo >_ string match -e 'foo' 'foo1' 'foo' 'foo2' foo1 foo foo2 >_ string match 'foo?' 
'foo1' 'foo' 'foo2' foo1 foo2 ``` ### Match Regex Examples ``` >_ string match -r 'cat|dog|fish' 'nice dog' dog >_ string match -r -v "c.*[12]" {cat,dog}(seq 1 4) dog1 dog2 cat3 dog3 cat4 dog4 >_ string match -r '(\d\d?):(\d\d):(\d\d)' 2:34:56 2:34:56 2 34 56 >_ string match -r '^(\w{2,4})\1$' papa mud murmur papa pa murmur mur >_ string match -r -a -n at ratatat 2 2 4 2 6 2 >_ string match -r -i '0x[0-9a-f]{1,8}' 'int magic = 0xBadC0de;' 0xBadC0de >_ echo $version 3.1.2-1575-ga2ff32d90 >_ string match -rq '(?<major>\d+).(?<minor>\d+).(?<revision>\d+)' -- $version >_ echo "You are using fish $major!" You are using fish 3! >_ string match -raq ' *(?<sentence>[^.!?]+)(?<punctuation>[.!?])?' "hello, friend. goodbye" >_ printf "%s\n" -- $sentence hello, friend goodbye >_ printf "%s\n" -- $punctuation . >_ string match -rq '(?<word>hello)' 'hi' >_ count $word 0 ``` “pad” and “shorten” subcommands ------------------------------- ``` string pad [-r | --right] [(-c | --char) CHAR] [(-w | --width) INTEGER] [STRING ...] ``` `string pad` extends each *STRING* to the given visible width by adding *CHAR* to the left. That means the width of all visible characters added together, excluding escape sequences and accounting for [`fish_emoji_width`](../language#envvar-fish_emoji_width) and [`fish_ambiguous_width`](../language#envvar-fish_ambiguous_width). It is the amount of columns in a terminal the *STRING* occupies. The escape sequences reflect what fish knows about, and how it computes its output. Your terminal might support more escapes, or not support escape sequences that fish knows about. If **-r** or **--right** is given, add the padding after a string. If **-c** or **--char** is given, pad with *CHAR* instead of whitespace. The output is padded to the maximum width of all input strings. If **-w** or **--width** is given, use at least that. ``` >_ string pad -w 10 abc abcdef abc abcdef >_ string pad --right --char=🐟 "fish are pretty" "rich. " fish are pretty rich. 🐟🐟🐟🐟 >_ string pad -w$COLUMNS (date) # Prints the current time on the right edge of the screen. ``` See Also -------- * The <printf> command can do simple padding, for example `printf %10s\n` works like `string pad -w10`. * [string length](string-length) with the `--visible` option can be used to show what fish thinks the width is. ``` string shorten [(-c | --char) CHARS] [(-m | --max) INTEGER] [-N | --no-newline] [-l | --left] [-q | --quiet] [STRING ...] ``` `string shorten` truncates each *STRING* to the given visible width and adds an ellipsis to indicate it. “Visible width” means the width of all visible characters added together, excluding escape sequences and accounting for [`fish_emoji_width`](../language#envvar-fish_emoji_width) and [`fish_ambiguous_width`](../language#envvar-fish_ambiguous_width). It is the amount of columns in a terminal the *STRING* occupies. The escape sequences reflect what fish knows about, and how it computes its output. Your terminal might support more escapes, or not support escape sequences that fish knows about. If **-m** or **--max** is given, truncate at the given width. Otherwise, the lowest non-zero width of all input strings is used. A max of 0 means no shortening takes place, all STRINGs are printed as-is. If **-N** or **--no-newline** is given, only the first line (or last line with **--left**) of each STRING is used, and an ellipsis is added if it was multiline. This only works for STRINGs being given as arguments, multiple lines given on stdin will be interpreted as separate STRINGs instead. 
If **-c** or **--char** is given, add *CHAR* instead of an ellipsis. This can also be empty or more than one character. If **-l** or **--left** is given, remove text from the left instead, so this prints the longest *suffix* of the string that fits. With **--no-newline**, this will take from the last line instead of the first. If **-q** or **--quiet** is given, `string shorten` only runs for the return value - if anything would be shortened, it returns 0, else 1. The default ellipsis is `…`. If fish thinks your system is incapable because of your locale, it will use `...` instead. The return value is 0 if any shortening occurred, 1 otherwise. ``` >_ string shorten foo foobar # No width was given, we infer, and "foo" is the shortest. foo fo… >_ string shorten --char="..." foo foobar # The target width is 3 because of "foo", # and our ellipsis is 3 too, so we can't really show anything. # This is the default ellipsis if your locale doesn't allow "…". foo ... >_ string shorten --char="" --max 4 abcdef 123456 # Leaving the char empty makes us not add an ellipsis # So this truncates at 4 columns: abcd 1234 >_ touch "a multiline"\n"file" >_ for file in *; string shorten -N -- $file; end # Shorten the multiline file so we only show one line per file: a multiline… >_ ss -p | string shorten -m$COLUMNS -c "" # `ss` from Linux' iproute2 shows socket information, but prints extremely long lines. # This shortens input so it fits on the screen without overflowing lines. >_ git branch | string match -rg '^\* (.*)' | string shorten -m20 # Take the current git branch and shorten it at 20 columns. # Here the branch is "builtin-path-with-expand" builtin-path-with-e… >_ git branch | string match -rg '^\* (.*)' | string shorten -m20 --left # Taking 20 columns from the right instead: …in-path-with-expand ``` See Also -------- * [string](#cmd-string)’s `pad` subcommand does the inverse of this command, adding padding to a specific width instead. * The <printf> command can do simple padding, for example `printf %10s\n` works like `string pad -w10`. * [string length](string-length) with the `--visible` option can be used to show what fish thinks the width is. “repeat” subcommand ------------------- ``` string repeat [(-n | --count) COUNT] [(-m | --max) MAX] [-N | --no-newline] [-q | --quiet] [STRING ...] ``` `string repeat` repeats the *STRING* **-n** or **--count** times. The **-m** or **--max** option will limit the number of outputted characters (excluding the newline). This option can be used by itself or in conjunction with **--count**. If both **--count** and **--max** are present, at most *MAX* characters are output; if *COUNT* repetitions add up to fewer than *MAX* characters, the string is simply repeated *COUNT* times. Both **--count** and **--max** accept a number greater than or equal to zero; if zero, nothing is output. If **-N** or **--no-newline** is given, the output won’t contain a newline character at the end. Exit status: 0 if the yielded string is not empty, 1 otherwise. ### Examples ### Repeat Examples ``` >_ string repeat -n 2 'foo ' foo foo >_ echo foo | string repeat -n 2 foofoo >_ string repeat -n 2 -m 5 'foo' foofo >_ string repeat -m 5 'foo' foofo ``` “replace” subcommand -------------------- ``` string replace [-a | --all] [-f | --filter] [-i | --ignore-case] [-r | --regex] [-q | --quiet] PATTERN REPLACEMENT [STRING ...]
``` `string replace` is similar to `string match` but replaces non-overlapping matching substrings with a replacement string and prints the result. By default, *PATTERN* is treated as a literal substring to be matched. If **-r** or **--regex** is given, *PATTERN* is interpreted as a Perl-compatible regular expression, and *REPLACEMENT* can contain C-style escape sequences like **\t** as well as references to capturing groups by number or name as *$n* or *${n}*. If you specify the **-f** or **--filter** flag, then each input string is printed only if a replacement was done. This is useful where you would otherwise use this idiom: `a_cmd | string match pattern | string replace pattern new_pattern`. You can instead just write `a_cmd | string replace --filter pattern new_pattern`. Exit status: 0 if at least one replacement was performed, or 1 otherwise. ### Replace Literal Examples ``` >_ string replace is was 'blue is my favorite' blue was my favorite >_ string replace 3rd last 1st 2nd 3rd 1st 2nd last >_ string replace -a ' ' _ 'spaces to underscores' spaces_to_underscores ``` ### Replace Regex Examples ``` >_ string replace -r -a '[^\d.]+' ' ' '0 one two 3.14 four 5x' 0 3.14 5 >_ string replace -r '(\w+)\s+(\w+)' '$2 $1 $$' 'left right' right left $ >_ string replace -r '\s*newline\s*' '\n' 'put a newline here' put a here ``` “split” and “split0” subcommands -------------------------------- ``` string split [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] SEP [STRING ...] string split0 [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] [STRING ...] ``` `string split` splits each *STRING* on the separator *SEP*, which can be an empty string. If **-m** or **--max** is specified, at most MAX splits are done on each *STRING*. If **-r** or **--right** is given, splitting is performed right-to-left. This is useful in combination with **-m** or **--max**. With **-n** or **--no-empty**, empty results are excluded from consideration (e.g. `hello\n\nworld` would expand to two strings and not three). Exit status: 0 if at least one split was performed, or 1 otherwise. Use **-f** or **--fields** to print out specific fields. FIELDS is a comma-separated string of field numbers and/or spans. Each field is one-indexed, and will be printed on separate lines. If a given field does not exist, then the command exits with status 1 and does not print anything, unless **--allow-empty** is used. See also the **--delimiter** option of the <read> command. `string split0` splits each *STRING* on the zero byte (NUL). Options are the same as `string split` except that no separator is given. `split0` has the important property that its output is not further split when used in a command substitution, allowing for the command substitution to produce elements containing newlines. This is most useful when used with Unix tools that produce zero bytes, such as `find -print0` or `sort -z`. See split0 examples below. ### Examples ``` >_ string split . example.com example com >_ string split -r -m1 / /usr/local/bin/fish /usr/local/bin fish >_ string split '' abc a b c >_ string split --allow-empty -f1,3-4,5 '' abcd a c d ``` ### NUL Delimited Examples ``` >_ # Count files in a directory, without being confused by newlines. >_ count (find .
-print0 | string split0) 42 >_ # Sort a list of elements which may contain newlines >_ set foo beta alpha\ngamma >_ set foo (string join0 $foo | sort -z | string split0) >_ string escape $foo[1] alpha\ngamma ``` “sub” subcommand ---------------- ``` string sub [(-s | --start) START] [(-e | --end) END] [(-l | --length) LENGTH] [-q | --quiet] [STRING ...] ``` `string sub` prints a substring of each string argument. The start/end of the substring can be specified with **-s**/**-e** or **--start**/**--end** followed by a 1-based index value. Positive index values are relative to the start of the string and negative index values are relative to the end of the string. The default start value is 1. The length of the substring can be specified with **-l** or **--length**. If the length or end is not specified, the substring continues to the end of each STRING. Exit status: 0 if at least one substring operation was performed, 1 otherwise. **--length** is mutually exclusive with **--end**. ### Examples ``` >_ string sub --length 2 abcde ab >_ string sub -s 2 -l 2 abcde bc >_ string sub --start=-2 abcde de >_ string sub --end=3 abcde abc >_ string sub -e -1 abcde abcd >_ string sub -s 2 -e -1 abcde bcd >_ string sub -s -3 -e -2 abcde c ``` “trim” subcommand ----------------- ``` string trim [-l | --left] [-r | --right] [(-c | --chars) CHARS] [-q | --quiet] [STRING ...] ``` `string trim` removes leading and trailing whitespace from each *STRING*. If **-l** or **--left** is given, only leading whitespace is removed. If **-r** or **--right** is given, only trailing whitespace is trimmed. The **-c** or **--chars** switch causes the characters in *CHARS* to be removed instead of whitespace. Exit status: 0 if at least one character was trimmed, or 1 otherwise. ### Examples ``` >_ string trim ' abc ' abc >_ string trim --right --chars=yz xyzzy zany x zan ``` “upper” subcommand ------------------ ``` string upper [-q | --quiet] [STRING ...] ``` `string upper` converts each string argument to uppercase. Exit status: 0 if at least one string was converted to uppercase, else 1. This means that in conjunction with the **-q** flag you can readily test whether a string is already uppercase. Regular Expressions ------------------- Both the `match` and `replace` subcommand support regular expressions when used with the **-r** or **--regex** option. The dialect is that of PCRE2. In general, special characters are special by default, so `a+` matches one or more “a”s, while `a\+` matches an “a” and then a “+”. `(a+)` matches one or more “a”s in a capturing group (`(?:XXXX)` denotes a non-capturing group). For the replacement parameter of `replace`, `$n` refers to the n-th group of the match. In the match parameter, `\n` (e.g. `\1`) refers back to groups. Some features include repetitions: * `*` refers to 0 or more repetitions of the previous expression * `+` 1 or more * `?` 0 or 1. * `{n}` to exactly n (where n is a number) * `{n,m}` at least n, no more than m. 
* `{n,}` n or more Character classes, some of the more important: * `.` any character except newline * `\d` a decimal digit and `\D`, not a decimal digit * `\s` whitespace and `\S`, not whitespace * `\w` a “word” character and `\W`, a “non-word” character * `[...]` (where “…” is some characters) is a character set * `[^...]` is the inverse of the given character set * `[x-y]` is the range of characters from x-y * `[[:xxx:]]` is a named character set * `[[:^xxx:]]` is the inverse of a named character set * `[[:alnum:]]` : “alphanumeric” * `[[:alpha:]]` : “alphabetic” * `[[:ascii:]]` : “0-127” * `[[:blank:]]` : “space or tab” * `[[:cntrl:]]` : “control character” * `[[:digit:]]` : “decimal digit” * `[[:graph:]]` : “printing, excluding space” * `[[:lower:]]` : “lower case letter” * `[[:print:]]` : “printing, including space” * `[[:punct:]]` : “printing, excluding alphanumeric” * `[[:space:]]` : “white space” * `[[:upper:]]` : “upper case letter” * `[[:word:]]` : “same as w” * `[[:xdigit:]]` : “hexadecimal digit” Groups: * `(...)` is a capturing group * `(?:...)` is a non-capturing group * `\n` is a backreference (where n is the number of the group, starting with 1) * `$n` is a reference from the replacement expression to a group in the match expression. And some other things: * `\b` denotes a word boundary, `\B` is not a word boundary. * `^` is the start of the string or line, `$` the end. * `|` is “alternation”, i.e. the “or”. Comparison to other tools ------------------------- Most operations `string` supports can also be done by external tools. Some of these include `grep`, `sed` and `cut`. If you are familiar with these, it is useful to know how `string` differs from them. In contrast to these classics, `string` reads input either from stdin or as arguments. `string` also does not deal with files, so it requires redirections to be used with them. In contrast to `grep`, `string`’s `match` defaults to glob-mode, while `replace` defaults to literal matching. If set to regex-mode, they use PCRE regular expressions, which is comparable to `grep`’s `-P` option. `match` defaults to printing just the match, which is like `grep` with `-o` (use **--entire** to enable grep-like behavior). Like `sed`’s `s/old/new/` command, `string replace` still prints strings that don’t match. `sed`’s `-n` in combination with a `/p` modifier or command is like `string replace -f`. `string split somedelimiter` is a replacement for `tr somedelimiter \n`.
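For example, splitting on a delimiter or printing only the lines where a replacement happened can be written with `string` instead of `tr` or `sed` (a small sketch of the equivalences described above):

```
# Splitting on a delimiter, instead of `tr ' ' \n`:
>_ echo 'foo bar baz' | string split ' '
foo
bar
baz

# Printing only the lines where a replacement happened, instead of sed's -n/p:
>_ printf '%s\n' hello.txt readme.md | string replace -f .md .markdown
readme.markdown
```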
fish dirh - print directory history dirh - print directory history ============================== Synopsis -------- ``` dirh ``` Description ----------- `dirh` prints the current [directory history](../interactive#directory-history). The current position in the history is highlighted using the color defined in the `fish_color_history_current` environment variable. `dirh` does not accept any parameters. Note that the <cd> command limits directory history to the 25 most recently visited directories. The history is stored in the `$dirprev` and `$dirnext` variables. See Also -------- * the <cdh> command to display a prompt to quickly navigate the history * the <prevd> command to move backward * the <nextd> command to move forward fish fish_svn_prompt - output Subversion information for use in a prompt fish\_svn\_prompt - output Subversion information for use in a prompt ===================================================================== Synopsis -------- ``` fish_svn_prompt ``` ``` function fish_prompt printf '%s' $PWD (fish_svn_prompt) ' $ ' end ``` Description ----------- The fish\_svn\_prompt function displays information about the current Subversion repository, if any. [Subversion](https://subversion.apache.org/) (`svn`) must be installed. There are numerous customization options, which can be controlled with fish variables. * `__fish_svn_prompt_color_revision` the colour of the revision number to display in the prompt * `__fish_svn_prompt_char_separator` the separator between status characters A number of variables control the symbol (“display”) and color (“color”) for the different status indicators: * `__fish_svn_prompt_char_added_display` * `__fish_svn_prompt_char_added_color` * `__fish_svn_prompt_char_conflicted_display` * `__fish_svn_prompt_char_conflicted_color` * `__fish_svn_prompt_char_deleted_display` * `__fish_svn_prompt_char_deleted_color` * `__fish_svn_prompt_char_ignored_display` * `__fish_svn_prompt_char_ignored_color` * `__fish_svn_prompt_char_modified_display` * `__fish_svn_prompt_char_modified_color` * `__fish_svn_prompt_char_replaced_display` * `__fish_svn_prompt_char_replaced_color` * `__fish_svn_prompt_char_unversioned_external_display` * `__fish_svn_prompt_char_unversioned_external_color` * `__fish_svn_prompt_char_unversioned_display` * `__fish_svn_prompt_char_unversioned_color` * `__fish_svn_prompt_char_missing_display` * `__fish_svn_prompt_char_missing_color` * `__fish_svn_prompt_char_versioned_obstructed_display` * `__fish_svn_prompt_char_versioned_obstructed_color` * `__fish_svn_prompt_char_locked_display` * `__fish_svn_prompt_char_locked_color` * `__fish_svn_prompt_char_scheduled_display` * `__fish_svn_prompt_char_scheduled_color` * `__fish_svn_prompt_char_switched_display` * `__fish_svn_prompt_char_switched_color` * `__fish_svn_prompt_char_token_present_display` * `__fish_svn_prompt_char_token_present_color` * `__fish_svn_prompt_char_token_other_display` * `__fish_svn_prompt_char_token_other_color` * `__fish_svn_prompt_char_token_stolen_display` * `__fish_svn_prompt_char_token_stolen_color` * `__fish_svn_prompt_char_token_broken_display` * `__fish_svn_prompt_char_token_broken_color` See also <fish_vcs_prompt>, which will call all supported version control prompt functions, including git, Mercurial and Subversion. Example ------- A simple prompt that displays svn info: ``` function fish_prompt ... 
printf '%s %s$' $PWD (fish_svn_prompt) end ``` fish breakpoint - launch debug mode breakpoint - launch debug mode ============================== Synopsis -------- ``` breakpoint ``` Description ----------- `breakpoint` is used to halt a running script and launch an interactive debugging prompt. For more details, see [Debugging fish scripts](../language#debugging) in the `fish` manual. There are no parameters for `breakpoint`. fish true - return a successful result true - return a successful result ================================= Synopsis -------- ``` true ``` Description ----------- `true` sets the exit status to 0. **:** (a single colon) is an alias for the `true` command. See Also -------- * <false> command * [$status](../language#variables-status) variable fish prevd - move backward through directory history prevd - move backward through directory history =============================================== Synopsis -------- ``` prevd [-l | --list] [POS] ``` Description ----------- `prevd` moves backwards *POS* positions in the [history of visited directories](../interactive#directory-history); if the beginning of the history has been hit, a warning is printed. If the **-l** or **--list** flag is specified, the current history is also displayed. Note that the `cd` command limits directory history to the 25 most recently visited directories. The history is stored in the `dirprev` and `dirnext` variables which this command manipulates. The **-h** or **--help** option displays help about using this command. Example ------- ``` cd /usr/src # Working directory is now /usr/src cd /usr/src/fish-shell # Working directory is now /usr/src/fish-shell prevd # Working directory is now /usr/src nextd # Working directory is now /usr/src/fish-shell ``` See Also -------- * the <cdh> command to display a prompt to quickly navigate the history * the <dirh> command to print the directory history * the <nextd> command to move forward fish random - generate random number random - generate random number =============================== Synopsis -------- ``` random random SEED random START END random START STEP END random choice [ITEMS ...] ``` Description ----------- `random` generates a pseudo-random integer from a uniform distribution. The range (inclusive) depends on the arguments. No arguments indicate a range of 0 to 32767 (inclusive). If one argument is specified, the internal engine will be seeded with the argument for future invocations of `random` and no output will be produced. Two arguments indicate a range from *START* to *END* (both *START* and *END* included). Three arguments indicate a range from *START* to *END* with a spacing of *STEP* between possible outputs. `random choice` will select one random item from the succeeding arguments. The **-h** or **--help** option displays help about using this command. Note that seeding the engine will NOT give the same result across different systems. You should not consider `random` cryptographically secure, or even statistically accurate. 
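Seeding, for example, produces no output by itself but determines what later calls in the same session will generate (a minimal sketch; the actual values are unspecified and differ between systems):

```
>_ random 1234   # seed the engine; prints nothing
>_ random 1 6    # a value between 1 and 6, determined by the seed
```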
Example ------- The following code will count down from a random even number between 10 and 20 to 1: ``` for i in (seq (random 10 2 20) -1 1) echo $i end ``` And this will open a random picture from any of the subdirectories: ``` open (random choice **.jpg) ``` Or, to only get even numbers from 2 to 20: ``` random 2 2 20 ``` Or odd numbers from 1 to 3: ``` random 1 2 3 # or 1 2 4 ``` fish and - conditionally execute a command and - conditionally execute a command ===================================== Synopsis -------- ``` PREVIOUS; and COMMAND ``` Description ----------- `and` is used to execute a command if the previous command was successful (returned a status of 0). `and` statements may be used as part of the condition in an <while> or <if> block. `and` does not change the current exit status itself, but the command it runs most likely will. The exit status of the last foreground command to exit can always be accessed using the [$status](../language#variables-status) variable. The **-h** or **--help** option displays help about using this command. Example ------- The following code runs the `make` command to build a program. If the build succeeds, `make`’s exit status is 0, and the program is installed. If either step fails, the exit status is 1, and `make clean` is run, which removes the files created by the build process. ``` make; and make install; or make clean ``` See Also -------- * <or> command * <not> command fish history - show and manipulate command history history - show and manipulate command history ============================================= Synopsis -------- ``` history [search] [--show-time] [--case-sensitive] [--exact | --prefix | --contains] [--max N] [--null] [--reverse] [SEARCH_STRING ...] history delete [--case-sensitive] [--exact | --prefix | --contains] SEARCH_STRING ... history merge history save history clear history clear-session ``` Description ----------- `history` is used to search, delete, and otherwise manipulate the [history of interactive commands](../interactive#history-search). The following operations (sub-commands) are available: **search** Returns history items matching the search string. If no search string is provided it returns all history items. This is the default operation if no other operation is specified. You only have to explicitly say `history search` if you wish to search for one of the subcommands. The `--contains` search option will be used if you don’t specify a different search option. Entries are ordered newest to oldest unless you use the `--reverse` flag. If stdout is attached to a tty the output will be piped through your pager by the history function. The history builtin simply writes the results to stdout. **delete** Deletes history items. The `--contains` search option will be used if you don’t specify a different search option. If you don’t specify `--exact` a prompt will be displayed before any items are deleted asking you which entries are to be deleted. You can enter the word “all” to delete all matching entries. You can enter a single ID (the number in square brackets) to delete just that single entry. You can enter more than one ID separated by a space to delete multiple entries. Just press [enter] to not delete anything. Note that the interactive delete behavior is a feature of the history function. The history builtin only supports `--exact --case-sensitive` deletion. **merge** Immediately incorporates history changes from other sessions. 
Ordinarily `fish` ignores history changes from sessions started after the current one. This command applies those changes immediately. **save** Immediately writes all changes to the history file. The shell automatically saves the history file; this option is provided for internal use and should not normally need to be used by the user. **clear** Clears the history file. A prompt is displayed before the history is erased asking you to confirm you really want to clear all history unless `builtin history` is used. **clear-session** Clears the history file from all activity of the current session. Note: If `history merge` or `builtin history merge` is run in a session, only the history after this will be erased. The following options are available: These flags can appear before or immediately after one of the sub-commands listed above. **-C** or **--case-sensitive** Does a case-sensitive search. The default is case-insensitive. Note that prior to fish 2.4.0 the default was case-sensitive. **-c** or **--contains** Searches items in the history that contain the specified text string. This is the default for the **--search** flag. This is not currently supported by the **delete** subcommand. **-e** or **--exact** Searches or deletes items in the history that exactly match the specified text string. This is the default for the **delete** subcommand. Note that the match is case-insensitive by default. If you really want an exact match, including letter case, you must use the **-C** or **--case-sensitive** flag. **-p** or **--prefix** Searches items in the history that begin with the specified text string. This is not currently supported by the **delete** subcommand. **-t** or **--show-time** Prepends each history entry with the date and time the entry was recorded. By default it uses the strftime format `# %c%n`. You can specify another format; e.g., `--show-time="%Y-%m-%d %H:%M:%S "` or `--show-time="%a%I%p"`. The short option, **-t**, doesn’t accept a strftime format string; it only uses the default format. Any strftime format is allowed, including `%s` to get the raw UNIX seconds since the epoch. **-z** or **--null** Causes history entries written by the search operations to be terminated by a NUL character rather than a newline. This allows the output to be processed by `read -z` to correctly handle multiline history entries. **-**\*NUMBER\* **-n** *NUMBER* or **--max** *NUMBER* Limits the matched history items to the first *NUMBER* matching entries. This is only valid for `history search`. **-R** or **--reverse** Causes the history search results to be ordered oldest to newest. Which is the order used by most shells. The default is newest to oldest. **-h** or **--help** Displays help for this command. Example ------- ``` history clear # Deletes all history items history search --contains "foo" # Outputs a list of all previous commands containing the string "foo". history delete --prefix "foo" # Interactively deletes commands which start with "foo" from the history. # You can select more than one entry by entering their IDs separated by a space. ``` Customizing the name of the history file ---------------------------------------- By default interactive commands are logged to `$XDG_DATA_HOME/fish/fish_history` (typically `~/.local/share/fish/fish_history`). You can set the `fish_history` variable to another name for the current shell session. The default value (when the variable is unset) is `fish` which corresponds to `$XDG_DATA_HOME/fish/fish_history`. If you set it to e.g. 
`fun`, the history would be written to `$XDG_DATA_HOME/fish/fun_history`. An empty string means history will not be stored at all. This is similar to the private session features in web browsers. You can change `fish_history` at any time (by using `set -x fish_history "session_name"`) and it will take effect right away. If you set it to `"default"`, it will use the default session name (which is `"fish"`). Other shells such as bash and zsh use a variable named `HISTFILE` for a similar purpose. Fish uses a different name to avoid conflicts and signal that the behavior is different (session name instead of a file path). Also, if you set the var to anything other than `fish` or `default` it will inhibit importing the bash history. That’s because the most common use case for this feature is to avoid leaking private or sensitive history when giving a presentation. Notes ----- If you specify both **--prefix** and **--contains** the last flag seen is used. Note that for backwards compatibility each subcommand can also be specified as a long option. For example, rather than `history search` you can type `history --search`. Those long options are deprecated and will be removed in a future release. fish test - perform tests on files and text test - perform tests on files and text ====================================== Synopsis -------- ``` test [EXPRESSION] [ [EXPRESSION] ] ``` Description ----------- Tests the expression given and sets the exit status to 0 if true, and 1 if false. An expression is made up of one or more operators and their arguments. The first form (`test`) is preferred. For compatibility with other shells, the second form is available: a matching pair of square brackets (`[ [EXPRESSION] ]`). This test is mostly POSIX-compatible. When using a variable as an argument for a test operator you should almost always enclose it in double-quotes. There are only two situations it is safe to omit the quote marks. The first is when the argument is a literal string with no whitespace or other characters special to the shell (e.g., semicolon). For example, `test -b /my/file`. The second is using a variable that expands to exactly one element including if that element is the empty string (e.g., `set x ''`). If the variable is not set, set but with no value, or set to more than one value you must enclose it in double-quotes. For example, `test "$x" = "$y"`. Since it is always safe to enclose variables in double-quotes when used as `test` arguments that is the recommended practice. Operators for files and directories ----------------------------------- **-b** *FILE* Returns true if *FILE* is a block device. **-c** *FILE* Returns true if *FILE* is a character device. **-d** *FILE* Returns true if *FILE* is a directory. **-e** *FILE* Returns true if *FILE* exists. **-f** *FILE* Returns true if *FILE* is a regular file. **-g** *FILE* Returns true if *FILE* has the set-group-ID bit set. **-G** *FILE* Returns true if *FILE* exists and has the same group ID as the current user. **-k** *FILE* Returns true if *FILE* has the sticky bit set. If the OS does not support the concept it returns false. See <https://en.wikipedia.org/wiki/Sticky_bit>. **-L** *FILE* Returns true if *FILE* is a symbolic link. **-O** *FILE* Returns true if *FILE* exists and is owned by the current user. **-p** *FILE* Returns true if *FILE* is a named pipe. **-r** *FILE* Returns true if *FILE* is marked as readable. **-s** *FILE* Returns true if the size of *FILE* is greater than zero. **-S** *FILE* Returns true if *FILE* is a socket. 
**-t** *FD* Returns true if the file descriptor *FD* is a terminal (TTY). **-u** *FILE* Returns true if *FILE* has the set-user-ID bit set. **-w** *FILE* Returns true if *FILE* is marked as writable; note that this does not check if the filesystem is read-only. **-x** *FILE* Returns true if *FILE* is marked as executable. Operators to compare files and directories ------------------------------------------ *FILE1* **-nt** *FILE2* Returns true if *FILE1* is newer than *FILE2*, or *FILE1* exists and *FILE2* does not. *FILE1* **-ot** *FILE2* Returns true if *FILE1* is older than *FILE2*, or *FILE2* exists and *FILE1* does not. *FILE1* **-ef** *FILE2* Returns true if *FILE1* and *FILE2* refer to the same file. Operators for text strings -------------------------- *STRING1* **=** *STRING2* Returns true if the strings *STRING1* and *STRING2* are identical. *STRING1* **!=** *STRING2* Returns true if the strings *STRING1* and *STRING2* are not identical. **-n** *STRING* Returns true if the length of *STRING* is non-zero. **-z** *STRING* Returns true if the length of *STRING* is zero. Operators to compare and examine numbers ---------------------------------------- *NUM1* **-eq** *NUM2* Returns true if *NUM1* and *NUM2* are numerically equal. *NUM1* **-ne** *NUM2* Returns true if *NUM1* and *NUM2* are not numerically equal. *NUM1* **-gt** *NUM2* Returns true if *NUM1* is greater than *NUM2*. *NUM1* **-ge** *NUM2* Returns true if *NUM1* is greater than or equal to *NUM2*. *NUM1* **-lt** *NUM2* Returns true if *NUM1* is less than *NUM2*. *NUM1* **-le** *NUM2* Returns true if *NUM1* is less than or equal to *NUM2*. Both integers and floating point numbers are supported. Operators to combine expressions -------------------------------- *COND1* **-a** *COND2* Returns true if both *COND1* and *COND2* are true. *COND1* **-o** *COND2* Returns true if either *COND1* or *COND2* are true. Expressions can be inverted using the **!** operator: **!** *EXPRESSION* Returns true if *EXPRESSION* is false, and false if *EXPRESSION* is true. Expressions can be grouped using parentheses. **(** *EXPRESSION* **)** Returns the value of *EXPRESSION*. Note that parentheses will usually require escaping with `\(` to avoid being interpreted as a command substitution. Examples -------- If the `/tmp` directory exists, copy the `/etc/motd` file to it: ``` if test -d /tmp cp /etc/motd /tmp/motd end ``` If the variable `MANPATH` is defined and not empty, print the contents. (If `MANPATH` is not defined, then it will expand to zero arguments, unless quoted.) ``` if test -n "$MANPATH" echo $MANPATH end ``` Parentheses and the `-o` and `-a` operators can be combined to produce more complicated expressions. In this example, success is printed if there is a `/foo` or `/bar` file as well as a `/baz` or `/bat` file. ``` if test \( -f /foo -o -f /bar \) -a \( -f /baz -o -f /bat \) echo Success. end ``` Numerical comparisons will simply fail if one of the operands is not a number: ``` if test 42 -eq "The answer to life, the universe and everything" echo So long and thanks for all the fish # will not be executed end ``` A common comparison is with [`status`](../language#envvar-status): ``` if test $status -eq 0 echo "Previous command succeeded" end ``` The previous test can likewise be inverted: ``` if test !
$status -eq 0 echo "Previous command failed" end ``` which is logically equivalent to the following: ``` if test $status -ne 0 echo "Previous command failed" end ``` Standards --------- `test` implements a subset of the [IEEE Std 1003.1-2008 (POSIX.1) standard](https://www.unix.com/man-page/posix/1p/test/). The following exceptions apply: * The `<` and `>` operators for comparing strings are not implemented. * Because this test is a shell builtin and not a standalone utility, using the -c flag on special file descriptors like standard input and output may not return the same result when invoked from within a pipe as one would expect when invoking the `test` utility in another shell. In cases such as this, one can use `command test` to explicitly use the system’s standalone `test` rather than this `builtin test`.
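For example, to be explicit about which implementation runs (a minimal sketch, assuming an external `test` is installed on your system):

```
# Use the system's standalone test(1) rather than the fish builtin:
if command test -e /etc/motd
    echo "/etc/motd exists"
end

# Explicitly request the builtin instead:
builtin test -e /etc/motd; and echo "the builtin agrees"
```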
fish string-length - print string lengths string-length - print string lengths ==================================== Synopsis -------- ``` string length [-q | --quiet] [-V | --visible] [STRING ...] ``` Description ----------- `string length` reports the length of each string argument in characters. Exit status: 0 if at least one non-empty *STRING* was given, or 1 otherwise. With **-V** or **--visible**, it uses the visible width of the arguments. That means it will discount escape sequences fish knows about, account for $fish\_emoji\_width and $fish\_ambiguous\_width. It will also count each line (separated by `\n`) on its own, and with a carriage return (`\r`) count only the widest stretch on a line. The intent is to measure the number of columns the *STRING* would occupy in the current terminal. Examples -------- ``` >_ string length 'hello, world' 12 >_ set str foo >_ string length -q $str; echo $status 0 # Equivalent to test -n "$str" >_ string length --visible (set_color red)foobar # the set_color is discounted, so this is the width of "foobar" 6 >_ string length --visible 🐟🐟🐟🐟 # depending on $fish_emoji_width, this is either 4 or 8 # in new terminals it should be 8 >_ string length --visible abcdef\r123 # this displays as "123def", so the width is 6 6 >_ string length --visible a\nbc # counts "a" and "bc" as separate lines, so it prints width for each 1 2 ``` fish fish_update_completions - update completions using manual pages fish\_update\_completions - update completions using manual pages ================================================================= Synopsis -------- ``` fish_update_completions ``` Description ----------- `fish_update_completions` parses manual pages installed on the system, and attempts to create completion files in the `fish` configuration directory. This does not overwrite custom completions. There are no parameters for `fish_update_completions`. fish fish_add_path - add to the path fish\_add\_path - add to the path ================================= Synopsis -------- ``` fish_add_path path ... fish_add_path [(-g | --global) | (-U | --universal) | (-P | --path)] [(-m | --move)] [(-a | --append) | (-p | --prepend)] [(-v | --verbose) | (-n | --dry-run)] PATHS ... ``` Description ----------- **fish\_add\_path** is a simple way to add more components to fish’s [`PATH`](../language#envvar-PATH). It does this by adding the components either to $fish\_user\_paths or directly to [`PATH`](../language#envvar-PATH) (if the `--path` switch is given). It is (by default) safe to use **fish\_add\_path** in config.fish, or it can be used once, interactively, and the paths will stay in future because of [universal variables](../language#variables-universal). This is a “do what I mean” style command, if you need more control, consider modifying the variable yourself. Components are normalized by <realpath>. Trailing slashes are ignored and relative paths are made absolute (but symlinks are not resolved). If a component already exists, it is not added again and stays in the same place unless the `--move` switch is given. Components are added in the order they are given, and they are prepended to the path unless `--append` is given (if $fish\_user\_paths is used, that means they are last in $fish\_user\_paths, which is itself prepended to [`PATH`](../language#envvar-PATH), so they still stay ahead of the system paths). 
If no component is new, the variable ([`fish_user_paths`](../language#envvar-fish_user_paths) or [`PATH`](../language#envvar-PATH)) is not set again or otherwise modified, so variable handlers are not triggered. If a component is not an existing directory, `fish_add_path` ignores it. Options ------- **-a** or **--append** Add components to the *end* of the variable. **-p** or **--prepend** Add components to the *front* of the variable (this is the default). **-g** or **--global** Use a global [`fish_user_paths`](../language#envvar-fish_user_paths). **-U** or **--universal** Use a universal [`fish_user_paths`](../language#envvar-fish_user_paths) - this is the default if it doesn’t already exist. **-P** or **--path** Manipulate [`PATH`](../language#envvar-PATH) directly. **-m** or **--move** Move already-existing components to the place they would be added - by default they would be left in place and not added again. **-v** or **--verbose** Print the <set> command used. **-n** or **--dry-run** Print the `set` command that would be used without executing it. **-h** or **--help** Displays help about using this command. If `--move` is used, it may of course lead to the path swapping order, so you should be careful doing that in config.fish. Example ------- ``` # I just installed mycoolthing and need to add it to the path to use it. > fish_add_path /opt/mycoolthing/bin # I want my ~/.local/bin to be checked first. > fish_add_path -m ~/.local/bin # I prefer using a global fish_user_paths > fish_add_path -g ~/.local/bin ~/.otherbin /usr/local/sbin # I want to append to the entire $PATH because this directory contains fallbacks > fish_add_path -aP /opt/fallback/bin # I want to add the bin/ directory of my current $PWD (say /home/nemo/) > fish_add_path -v bin/ set fish_user_paths /home/nemo/bin /usr/bin /home/nemo/.local/bin # I have installed ruby via homebrew > fish_add_path /usr/local/opt/ruby/bin ``` fish fish_delta - compare functions and completions to the default fish\_delta - compare functions and completions to the default ============================================================== Synopsis -------- ``` fish_delta name ... fish_delta [-f | --no-functions] [-c | --no-completions] [-C | --no-config] [-d | --no-diff] [-n | --new] [-V | --vendor=] fish_delta [-h | --help] ``` Description ----------- The `fish_delta` function tells you, at a glance, which of your functions and completions differ from the set that fish ships. It does this by going through the relevant variables ([`fish_function_path`](../language#envvar-fish_function_path) for functions and [`fish_complete_path`](../language#envvar-fish_complete_path) for completions) and comparing the files against fish’s default directories. If any names are given, it will only compare files by those names (plus a “.fish” extension). By default, it will also use `diff` to display the difference between the files. If `diff` is unavailable, it will skip it, but in that case it also cannot figure out if the files really differ. The exit status is 1 if there was a difference and 2 for other errors, otherwise 0. Options ------- The following options are available: **-f** or **--no-functions** Stops checking functions **-c** or **--no-completions** Stops checking completions **-C** or **--no-config** Stops checking configuration files like config.fish or snippets in the conf.d directories. **-d** or **--no-diff** Removes the diff display (this happens automatically if `diff` can’t be found) **-n** or **--new** Also prints new files (i.e. 
those that can’t be found in fish’s default directories). **-Vvalue** or **--vendor=value** Determines how the vendor directories are counted. Valid values are: * “default” - counts vendor files as belonging to the defaults. Any changes in other directories will be counted as changes over them. This is the default. * “user” - counts vendor files as belonging to the user files. Any changes in them will be counted as new or changed files. * “ignore” - ignores vendor directories. Files of the same name will be counted as “new” if no file of the same name in fish’s default directories exists. **-h** or **--help** Prints `fish_delta`’s help (this). Example ------- Running just: ``` fish_delta ``` will give you a list of all your changed functions and completions, including diffs (if you have the `diff` command). It might look like this: ``` > fish_delta New: /home/alfa/.config/fish/functions/battery.fish Changed: /home/alfa/.config/fish/test/completions/cargo.fish --- /home/alfa/.config/fish/test/completions/cargo.fish 2022-09-02 12:57:55.579229959 +0200 +++ /usr/share/fish/completions/cargo.fish 2022-09-25 17:51:53.000000000 +0200 # the output of `diff` follows ``` The options are there to select which parts of the output you want. With `--no-completions` you can compare just functions, and with `--no-diff` you can turn off the `diff` display. To only compare your `fish_git_prompt`, you might use: ``` fish_delta --no-completions fish_git_prompt ``` which will only compare files called “fish\_git\_prompt.fish”. fish fish_clipboard_paste - get text from the system’s clipboard fish\_clipboard\_paste - get text from the system’s clipboard ============================================================= Synopsis -------- ``` fish_clipboard_paste fish_clipboard_paste | foo ``` Description ----------- The `fish_clipboard_paste` function copies text from the system clipboard. If its stdout is not a terminal (see <isatty>), it will output everything there, as-is, without any additional newlines. If it is, it will put the text in the commandline instead. If it outputs to the commandline, it will automatically escape the output if the cursor is currently inside single-quotes so it is suitable for single-quotes (meaning it escapes `'` and `\\`). It is bound to `Control`+`V` by default. `fish_clipboard_paste` works by calling a system-specific backend. If it doesn’t appear to work you may need to install yours. Currently supported are: * `pbpaste` * `wl-paste` using wayland * `xsel` and `xclip` for X11 * `powershell.exe` on Windows (this backend has encoding limitations and uses windows line endings that `fish_clipboard_paste` undoes) See also -------- * [fish\_clipboard\_copy - copy text to the system’s clipboard](fish_clipboard_copy) which does the inverse. fish string-join - join strings with delimiter string-join - join strings with delimiter ========================================= Synopsis -------- ``` string join [-q | --quiet] SEP [STRING ...] string join0 [-q | --quiet] [STRING ...] ``` Description ----------- `string join` joins its *STRING* arguments into a single string separated by *SEP*, which can be an empty string. Exit status: 0 if at least one join was performed, or 1 otherwise. If `-n` or `--no-empty` is specified, empty strings are excluded from consideration (e.g. `string join -n + a b "" c` would expand to `a+b+c` not `a+b++c`). `string join0` joins its *STRING* arguments into a single string separated by the zero byte (NUL), and adds a trailing NUL. 
This is most useful in conjunction with tools that accept NUL-delimited input, such as `sort -z`. Exit status: 0 if at least one join was performed, or 1 otherwise. Because Unix uses NUL as the string terminator, passing the output of `string join0` as an *argument* to a command (via a [command substitution](../language#expand-command-substitution)) won’t actually work. Fish will pass the correct bytes along, but the command won’t be able to tell where the argument ends. This is a limitation of Unix’ argument passing. Examples -------- ``` >_ seq 3 | string join ... 1...2...3 # Give a list of NUL-separated filenames to du (this is a GNU extension) >_ string join0 file1 file2 file\nwith\nmultiple\nlines | du --files0-from=- # Just put the strings together without a separator >_ string join '' a b c abc ``` fish string-lower - convert strings to lowercase string-lower - convert strings to lowercase =========================================== Synopsis -------- ``` string lower [-q | --quiet] [STRING ...] ``` Description ----------- `string lower` converts each string argument to lowercase. Exit status: 0 if at least one string was converted to lowercase, else 1. This means that in conjunction with the **-q** flag you can readily test whether a string is already lowercase. fish disown - remove a process from the list of jobs disown - remove a process from the list of jobs =============================================== Synopsis -------- ``` disown [PID ...] ``` Description ----------- `disown` removes the specified [job](../language#syntax-job-control) from the list of jobs. The job itself continues to exist, but fish does not keep track of it any longer. Jobs in the list of jobs are sent a hang-up signal when fish terminates, which usually causes the job to terminate; `disown` allows these processes to continue regardless. If no process is specified, the most recently-used job is removed (like <bg> and <fg>). If one or more PIDs are specified, jobs with the specified process IDs are removed from the job list. Invalid jobs are ignored and a warning is printed. If a job is stopped, it is sent a signal to continue running, and a warning is printed. It is not possible to use the <bg> builtin to continue a job once it has been disowned. `disown` returns 0 if all specified jobs were disowned successfully, and 1 if any problems were encountered. The **--help** or **-h** option displays help about using this command. Example ------- `firefox &; disown` will start the Firefox web browser in the background and remove it from the job list, meaning it will not be closed when the fish process is closed. `disown (jobs -p)` removes all <jobs> from the job list without terminating them. fish fish - the friendly interactive shell fish - the friendly interactive shell ===================================== Synopsis -------- ``` fish [OPTIONS] [FILE [ARG ...]] fish [OPTIONS] [-c COMMAND [ARG ...]] ``` Description ----------- **fish** is a command-line shell written mainly with interactive use in mind. This page briefly describes the options for invoking **fish**. The [full manual](../index#intro) is available in HTML by using the **help** command from inside fish, and in the `fish-doc(1)` man page. The [tutorial](../tutorial#tutorial) is available as HTML via `help tutorial` or in `man fish-tutorial`. The following options are available: **-c** or **--command=COMMAND** Evaluate the specified commands instead of reading from the commandline, passing additional positional arguments through `$argv`. 
**-C** or **--init-command=COMMANDS** Evaluate specified commands after reading the configuration but before executing command specified by **-c** or reading interactive input. **-d** or **--debug=DEBUG\_CATEGORIES** Enables debug output and specify a pattern for matching debug categories. See [Debugging](#debugging-fish) below for details. **-o** or **--debug-output=DEBUG\_FILE** Specifies a file path to receive the debug output, including categories and [`fish_trace`](../language#envvar-fish_trace). The default is stderr. **-i** or **--interactive** The shell is interactive. **-l** or **--login** Act as if invoked as a login shell. **-N** or **--no-config** Do not read configuration files. **-n** or **--no-execute** Do not execute any commands, only perform syntax checking. **-p** or **--profile=PROFILE\_FILE** when **fish** exits, output timing information on all executed commands to the specified file. This excludes time spent starting up and reading the configuration. **--profile-startup=PROFILE\_FILE** Will write timing for `fish` startup to specified file. **-P** or **--private** Enables [private mode](../interactive#private-mode): **fish** will not access old or store new history. **--print-rusage-self** When **fish** exits, output stats from getrusage. **--print-debug-categories** Print all debug categories, and then exit. **-v** or **--version** Print version and exit. **-f** or **--features=FEATURES** Enables one or more comma-separated [feature flags](../language#featureflags). The `fish` exit status is generally the [exit status of the last foreground command](../language#variables-status). Debugging --------- While fish provides extensive support for [debugging fish scripts](../language#debugging), it is also possible to debug and instrument its internals. Debugging can be enabled by passing the **--debug** option. For example, the following command turns on debugging for background IO thread events, in addition to the default categories, i.e. *debug*, *error*, *warning*, and *warning-path*: ``` > fish --debug=iothread ``` Available categories are listed by `fish --print-debug-categories`. The **--debug** option accepts a comma-separated list of categories, and supports glob syntax. The following command turns on debugging for *complete*, *history*, *history-file*, and *profile-history*, as well as the default categories: ``` > fish --debug='complete,*history*' ``` Debug messages output to stderr by default. Note that if [`fish_trace`](../language#envvar-fish_trace) is set, execution tracing also outputs to stderr by default. You can output to a file using the **--debug-output** option: ``` > fish --debug='complete,*history*' --debug-output=/tmp/fish.log --init-command='set fish_trace on' ``` These options can also be changed via the [`FISH_DEBUG`](../language#envvar-FISH_DEBUG) and [`FISH_DEBUG_OUTPUT`](../language#envvar-FISH_DEBUG_OUTPUT) variables. The categories enabled via **--debug** are *added* to the ones enabled by $FISH\_DEBUG, so they can be disabled by prefixing them with **-** (**reader-\*,-ast\*** enables reader debugging and disables ast debugging). The file given in **--debug-output** takes precedence over the file in [`FISH_DEBUG_OUTPUT`](../language#envvar-FISH_DEBUG_OUTPUT). 
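The same debugging setup can also be expressed through the variables mentioned above. A rough sketch, where `/tmp/fish.log` is only an example path:

```
> FISH_DEBUG='complete,*history*' FISH_DEBUG_OUTPUT=/tmp/fish.log fish
```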
fish psub - perform process substitution

psub - perform process substitution
===================================

Synopsis
--------

```
COMMAND1 ( COMMAND2 | psub [-F | --fifo] [-f | --file] [(-s | --suffix) SUFFIX] )
```

Description
-----------

Some shells (e.g., ksh, bash) feature a syntax that is a mix between command substitution and piping, called process substitution. It is used to send the output of a command into the calling command, much like command substitution, but with the difference that the output is not sent through commandline arguments but through a named pipe, with the filename of the named pipe sent as an argument to the calling program. `psub` combined with a regular command substitution provides the same functionality.

The following options are available:

**-f** or **--file**
Use a regular file instead of a named pipe to communicate with the calling process. This will cause `psub` to be significantly slower when large amounts of data are involved, but has the advantage that the reading process can seek in the stream. This is the default.

**-F** or **--fifo**
Use a named pipe rather than a file. You should only use this if the command produces no more than 8 KiB of output. The limit on the amount of data a FIFO can buffer varies with the OS but is typically 8 KiB, 16 KiB or 64 KiB. If you use this option and the command on the left of the psub pipeline produces more output, a deadlock is likely to occur.

**-s** or **--suffix** *SUFFIX*
Append *SUFFIX* to the filename.

**-h** or **--help**
Displays help about using this command.

Example
-------

```
diff (sort a.txt | psub) (sort b.txt | psub)
# shows the difference between the sorted versions of files ``a.txt`` and ``b.txt``.

source-highlight -f esc (cpp main.c | psub -f -s .c)
# highlights ``main.c`` after preprocessing as a C source.
```

fish string-repeat - multiply a string

string-repeat - multiply a string
=================================

Synopsis
--------

```
string repeat [(-n | --count) COUNT] [(-m | --max) MAX] [-N | --no-newline] [-q | --quiet] [STRING ...]
```

Description
-----------

`string repeat` repeats the *STRING* **-n** or **--count** times. The **-m** or **--max** option limits the number of characters that are output (excluding the newline). This option can be used by itself or in conjunction with **--count**. If both **--count** and **--max** are present, at most **--max** characters are output, unless the fully repeated string is shorter than that, in which case the string is repeated **--count** times. Both **--count** and **--max** accept a number greater than or equal to zero; if zero, nothing is output. If **-N** or **--no-newline** is given, the output won’t contain a newline character at the end.

Exit status: 0 if the resulting string is not empty, 1 otherwise.

Examples
--------

### Repeat Examples

```
>_ string repeat -n 2 'foo '
foo foo

>_ echo foo | string repeat -n 2
foofoo

>_ string repeat -n 2 -m 5 'foo'
foofo

>_ string repeat -m 5 'foo'
foofo
```
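One further illustrative sketch, not part of the examples above: repeating a single character to draw a fixed-width divider line:

```
>_ string repeat -n 40 '='
========================================
```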
fish fish_vcs_prompt - output version control system information for use in a prompt fish\_vcs\_prompt - output version control system information for use in a prompt ================================================================================= Synopsis -------- ``` fish_vcs_prompt ``` ``` function fish_prompt printf '%s' $PWD (fish_vcs_prompt) ' $ ' end ``` Description ----------- The `fish_vcs_prompt` function displays information about the current version control system (VCS) repository, if any. It calls out to VCS-specific functions. The currently supported systems are: * <fish_git_prompt> * <fish_hg_prompt> * <fish_svn_prompt> If a VCS isn’t installed, the respective function does nothing. The Subversion prompt is disabled by default, because it’s slow on large repositories. To enable it, modify `fish_vcs_prompt` to uncomment it. See <funced>. For more information, see the documentation for each of the functions above. Example ------- A simple prompt that displays all known VCS info: ``` function fish_prompt ... set -g __fish_git_prompt_showupstream auto printf '%s %s$' $PWD (fish_vcs_prompt) end ``` fish not - negate the exit status of a job not - negate the exit status of a job ===================================== Synopsis -------- ``` not COMMAND [OPTIONS ...] ``` Description ----------- `not` negates the exit status of another command. If the exit status is zero, `not` returns 1. Otherwise, `not` returns 0. The **-h** or **--help** option displays help about using this command. Example ------- The following code reports an error and exits if no file named spoon can be found. ``` if not test -f spoon echo There is no spoon exit 1 end ``` fish ulimit - set or get resource usage limits ulimit - set or get resource usage limits ========================================= Synopsis -------- ``` ulimit [OPTIONS] [LIMIT] ``` Description ----------- `ulimit` sets or outputs the resource usage limits of the shell and any processes spawned by it. If a new limit value is omitted, the current value of the limit of the resource is printed; otherwise, the specified limit is set to the new value. Use one of the following switches to specify which resource limit to set or report: **-b** or **--socket-buffers** The maximum size of socket buffers. **-c** or **--core-size** The maximum size of core files created. By setting this limit to zero, core dumps can be disabled. **-d** or **--data-size** The maximum size of a process’ data segment. **-e** or **--nice** Controls the maximum nice value; on Linux, this value is subtracted from 20 to give the effective value. **-f** or **--file-size** The maximum size of files created by a process. **-i** or **--pending-signals** The maximum number of signals that may be queued. **-l** or **--lock-size** The maximum size that may be locked into memory. **-m** or **--resident-set-size** The maximum resident set size. **-n** or **--file-descriptor-count** The maximum number of open file descriptors. **-q** or **--queue-size** The maximum size of data in POSIX message queues. **-r** or **--realtime-priority** The maximum realtime scheduling priority. **-s** or **--stack-size** The maximum stack size. **-t** or **--cpu-time** The maximum amount of CPU time in seconds. **-u** or **--process-count** The maximum number of processes available to the current user. **-w** or **--swap-size** The maximum swap space available to the current user. **-v** or **--virtual-memory-size** The maximum amount of virtual memory available to the shell. 
**-y** or **--realtime-maxtime** The maximum contiguous realtime CPU time in microseconds. **-K** or **--kernel-queues** The maximum number of kqueues (kernel queues) for the current user. **-P** or **--ptys** The maximum number of pseudo-terminals for the current user. **-T** or **--threads** The maximum number of simultaneous threads for the current user. Note that not all these limits are available in all operating systems; consult the documentation for `setrlimit` in your operating system. The value of limit can be a number in the unit specified for the resource or one of the special values `hard`, `soft`, or `unlimited`, which stand for the current hard limit, the current soft limit, and no limit, respectively. If limit is given, it is the new value of the specified resource. If no option is given, then **-f** is assumed. Values are in kilobytes, except for **-t**, which is in seconds and **-n** and **-u**, which are unscaled values. The exit status is 0 unless an invalid option or argument is supplied, or an error occurs while setting a new limit. `ulimit` also accepts the following options that determine what type of limit to set: **-H** or **--hard** Sets hard resource limit. **-S** or **--soft** Sets soft resource limit. A hard limit can only be decreased. Once it is set it cannot be increased; a soft limit may be increased up to the value of the hard limit. If neither **-H** nor **-S** is specified, both the soft and hard limits are updated when assigning a new limit value, and the soft limit is used when reporting the current value. The following additional options are also understood by `ulimit`: **-a** or **--all** Prints all current limits. **-h** or **--help** Displays help about using this command. The `fish` implementation of `ulimit` should behave identically to the implementation in bash, except for these differences: * Fish `ulimit` supports GNU-style long options for all switches. * Fish `ulimit` does not support the **-p** option for getting the pipe size. The bash implementation consists of a compile-time check that empirically guesses this number by writing to a pipe and waiting for SIGPIPE. Fish does not do this because this method of determining pipe size is unreliable. Depending on bash version, there may also be further additional limits to set in bash that do not exist in fish. * Fish `ulimit` does not support getting or setting multiple limits in one command, except reporting all values using the **-a** switch. Example ------- `ulimit -Hs 64` sets the hard stack size limit to 64 kB. fish exec - execute command in current process exec - execute command in current process ========================================= Synopsis -------- ``` exec COMMAND ``` Description ----------- `exec` replaces the currently running shell with a new command. On successful completion, `exec` never returns. `exec` cannot be used inside a pipeline. The **--help** or **-h** option displays help about using this command. Example ------- `exec emacs` starts up the emacs text editor, and exits `fish`. When emacs exits, the session will terminate. fish return - stop the current inner function return - stop the current inner function ======================================== Synopsis -------- ``` return [N] ``` Description ----------- **return** halts a currently running function. The exit status is set to *N* if it is given. If **return** is invoked outside of a function or dot script it is equivalent to exit. 
It is often added inside of a conditional block such as an <if> statement or a <switch> statement to conditionally stop the executing function and return to the caller; it can also be used to specify the exit status of a function. If at the top level of a script, it exits with the given status, like <exit>. If at the top level in an interactive session, it will set [`status`](../language#envvar-status), but not exit the shell. The **-h** or **--help** option displays help about using this command. Example ------- An implementation of the false command as a fish function: ``` function false return 1 end ``` fish source - evaluate contents of file source - evaluate contents of file ================================== Synopsis -------- ``` source FILE [ARGUMENTS ...] SOMECOMMAND | source ``` Description ----------- `source` evaluates the commands of the specified *FILE* in the current shell as a new block of code. This is different from starting a new process to perform the commands (i.e. `fish < FILE`) since the commands will be evaluated by the current shell, which means that changes in shell variables will affect the current shell. If additional arguments are specified after the file name, they will be inserted into the [`argv`](../language#envvar-argv) variable. The [`argv`](../language#envvar-argv) variable will not include the name of the sourced file. fish will search the working directory to resolve relative paths but will not search [`PATH`](../language#envvar-PATH) . If no file is specified and stdin is not the terminal, or if the file name `-` is used, stdin will be read. The exit status of `source` is the exit status of the last job to execute. If something goes wrong while opening or reading the file, `source` exits with a non-zero status. **.** (a single period) is an alias for the `source` command. The use of **.** is deprecated in favour of `source`, and **.** will be removed in a future version of fish. `source` creates a new [local scope](../language#variables-scope); `set --local` within a sourced block will not affect variables in the enclosing scope. The **-h** or **--help** option displays help about using this command. Example ------- ``` source ~/.config/fish/config.fish # Causes fish to re-read its initialization file. ``` Caveats ------- In fish versions prior to 2.3.0, the [`argv`](../language#envvar-argv) variable would have a single element (the name of the sourced file) if no arguments are present. Otherwise, it would contain arguments without the name of the sourced file. That behavior was very confusing and unlike other shells such as bash and zsh. fish open - open file in its default application open - open file in its default application =========================================== Synopsis -------- ``` open FILES ... ``` Description ----------- `open` opens a file in its default application, using the appropriate tool for the operating system. On GNU/Linux, this requires the common but optional `xdg-open` utility, from the `xdg-utils` package. Note that this function will not be used if a command by this name exists (which is the case on macOS or Haiku). Example ------- `open *.txt` opens all the text files in the current directory using your system’s default text editor. fish echo - display a line of text echo - display a line of text ============================= Synopsis -------- ``` echo [OPTIONS] [STRING] ``` Description ----------- `echo` displays *STRING* of text. The following options are available: **-n** Do not output a newline. 
**-s** Do not separate arguments with spaces. **-E** Disable interpretation of backslash escapes (default). **-e** Enable interpretation of backslash escapes. Unlike other shells, this echo accepts `--` to signal the end of the options. Escape Sequences ---------------- If `-e` is used, the following sequences are recognized: * `\` backslash * `\a` alert (BEL) * `\b` backspace * `\c` produce no further output * `\e` escape * `\f` form feed * `\n` new line * `\r` carriage return * `\t` horizontal tab * `\v` vertical tab * `\0NNN` byte with octal value NNN (1 to 3 digits) * `\xHH` byte with hexadecimal value HH (1 to 2 digits) Example ------- ``` > echo 'Hello World' Hello World > echo -e 'Top\nBottom' Top Bottom > echo -- -n -n ``` See Also -------- * the <printf> command, for more control over output formatting fish string-trim - remove trailing whitespace string-trim - remove trailing whitespace ======================================== Synopsis -------- ``` string trim [-l | --left] [-r | --right] [(-c | --chars) CHARS] [-q | --quiet] [STRING ...] ``` Description ----------- `string trim` removes leading and trailing whitespace from each *STRING*. If **-l** or **--left** is given, only leading whitespace is removed. If **-r** or **--right** is given, only trailing whitespace is trimmed. The **-c** or **--chars** switch causes the characters in *CHARS* to be removed instead of whitespace. Exit status: 0 if at least one character was trimmed, or 1 otherwise. Examples -------- ``` >_ string trim ' abc ' abc >_ string trim --right --chars=yz xyzzy zany x zan ``` fish abbr - manage fish abbreviations abbr - manage fish abbreviations ================================ Synopsis -------- ``` abbr --add NAME [--position command | anywhere] [-r | --regex PATTERN] [--set-cursor[=MARKER]] ([-f | --function FUNCTION] | EXPANSION) abbr --erase NAME ... abbr --rename OLD_WORD NEW_WORD abbr --show abbr --list abbr --query NAME ... ``` Description ----------- `abbr` manages abbreviations - user-defined words that are replaced with longer phrases when entered. Note Only typed-in commands use abbreviations. Abbreviations are not expanded in scripts. For example, a frequently-run command like `git checkout` can be abbreviated to `gco`. After entering `gco` and pressing `Space` or `Enter`, the full text `git checkout` will appear in the command line. To avoid expanding something that looks like an abbreviation, the default `Control`+`Space` binding inserts a space without expanding. An abbreviation may match a literal word, or it may match a pattern given by a regular expression. When an abbreviation matches a word, that word is replaced by new text, called its *expansion*. This expansion may be a fixed new phrase, or it can be dynamically created via a fish function. This expansion occurs after pressing space or enter. Combining these features, it is possible to create custom syntaxes, where a regular expression recognizes matching tokens, and the expansion function interprets them. See the [Examples](#examples) section. Abbreviations may be added to [config.fish](../language#configuration). “add” subcommand ---------------- ``` abbr [-a | --add] NAME [--position command | anywhere] [-r | --regex PATTERN] [--set-cursor[=MARKER]] ([-f | --function FUNCTION] | EXPANSION) ``` `abbr --add` creates a new abbreviation. With no other options, the string **NAME** is replaced by **EXPANSION**. 
With **--position command**, the abbreviation will only expand when it is positioned as a command, not as an argument to another command. With **--position anywhere** the abbreviation may expand anywhere in the command line. The default is **command**. With **--regex**, the abbreviation matches using the regular expression given by **PATTERN**, instead of the literal **NAME**. The pattern is interpreted using PCRE2 syntax and must match the entire token. If multiple abbreviations match the same token, the last abbreviation added is used. With **--set-cursor=MARKER**, the cursor is moved to the first occurrence of **MARKER** in the expansion. The **MARKER** value is erased. The **MARKER** may be omitted (i.e. simply `--set-cursor`), in which case it defaults to `%`. With **-f FUNCTION** or **--function FUNCTION**, **FUNCTION** is treated as the name of a fish function instead of a literal replacement. When the abbreviation matches, the function will be called with the matching token as an argument. If the function’s exit status is 0 (success), the token will be replaced by the function’s output; otherwise the token will be left unchanged. No **EXPANSION** may be given separately. ### Examples ``` abbr --add gco git checkout ``` Add a new abbreviation where `gco` will be replaced with `git checkout`. ``` abbr -a --position anywhere -- -C --color ``` Add a new abbreviation where `-C` will be replaced with `--color`. The `--` allows `-C` to be treated as the name of the abbreviation, instead of an option. ``` abbr -a L --position anywhere --set-cursor "% | less" ``` Add a new abbreviation where `L` will be replaced with `| less`, placing the cursor before the pipe. ``` function last_history_item echo $history[1] end abbr -a !! --position anywhere --function last_history_item ``` This first creates a function `last_history_item` which outputs the last entered command. It then adds an abbreviation which replaces `!!` with the result of calling this function. Taken together, this is similar to the `!!` history expansion feature of bash. ``` function vim_edit echo vim $argv end abbr -a vim_edit_texts --position command --regex ".+\.txt" --function vim_edit ``` This first creates a function `vim_edit` which prepends `vim` before its argument. It then adds an abbreviation which matches commands ending in `.txt`, and replaces the command with the result of calling this function. This allows text files to be “executed” as a command to open them in vim, similar to the “suffix alias” feature in zsh. ``` abbr 4DIRS --set-cursor=! "$(string join \n -- 'for dir in */' 'cd $dir' '!' 'cd ..' 'end')" ``` This creates an abbreviation “4DIRS” which expands to a multi-line loop “template.” The template enters each directory and then leaves it. The cursor is positioned ready to enter the command to run in each directory, at the location of the `!`, which is itself erased. Other subcommands ----------------- ``` abbr --rename OLD_NAME NEW_NAME ``` Renames an abbreviation, from *OLD\_NAME* to *NEW\_NAME* ``` abbr [-s | --show] ``` Show all abbreviations in a manner suitable for import and export ``` abbr [-l | --list] ``` Prints the names of all abbreviation ``` abbr [-e | --erase] NAME ``` Erases the abbreviation with the given name ``` abbr -q or --query [NAME...] ``` Return 0 (true) if one of the *NAME* is an abbreviation. ``` abbr -h or --help ``` Displays help for the `abbr` command. 
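As a hedged sketch building on the subcommands above, `--query` can guard `--add` in a script so an existing abbreviation is not redefined (reusing the `gco` example from earlier):

```
# Only define the abbreviation if it does not already exist.
if not abbr --query gco
    abbr --add gco git checkout
end
```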
fish prompt_login - describe the login suitable for prompt prompt\_login - describe the login suitable for prompt ====================================================== Synopsis -------- ``` prompt_login ``` Description ----------- `prompt_login` is a function to describe the current login. It will show the user, the host and also whether the shell is running in a chroot (currently Debian’s `debian_chroot` file is supported). Examples -------- ``` function fish_prompt echo -n (prompt_login) (prompt_pwd) '$ ' end ``` ``` >_ prompt_login root@bananablaster ``` fish string-split - split strings by delimiter string-split - split strings by delimiter ========================================= Synopsis -------- ``` string split [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] SEP [STRING ...] string split0 [(-f | --fields) FIELDS] [(-m | --max) MAX] [-n | --no-empty] [-q | --quiet] [-r | --right] [STRING ...] ``` Description ----------- `string split` splits each *STRING* on the separator *SEP*, which can be an empty string. If **-m** or **--max** is specified, at most MAX splits are done on each *STRING*. If **-r** or **--right** is given, splitting is performed right-to-left. This is useful in combination with **-m** or **--max**. With **-n** or **--no-empty**, empty results are excluded from consideration (e.g. `hello\n\nworld` would expand to two strings and not three). Exit status: 0 if at least one split was performed, or 1 otherwise. Use **-f** or **--fields** to print out specific fields. FIELDS is a comma-separated string of field numbers and/or spans. Each field is one-indexed, and will be printed on separate lines. If a given field does not exist, then the command exits with status 1 and does not print anything, unless **--allow-empty** is used. See also the **--delimiter** option of the <read> command. `string split0` splits each *STRING* on the zero byte (NUL). Options are the same as `string split` except that no separator is given. `split0` has the important property that its output is not further split when used in a command substitution, allowing for the command substitution to produce elements containing newlines. This is most useful when used with Unix tools that produce zero bytes, such as `find -print0` or `sort -z`. See split0 examples below. Examples -------- ``` >_ string split . example.com example com >_ string split -r -m1 / /usr/local/bin/fish /usr/local/bin fish >_ string split '' abc a b c >_ string split --allow-empty -f1,3-4,5 '' abcd a c d ``` ### NUL Delimited Examples ``` >_ # Count files in a directory, without being confused by newlines. >_ count (find . -print0 | string split0) 42 >_ # Sort a list of elements which may contain newlines >_ set foo beta alpha\ngamma >_ set foo (string join0 $foo | sort -z | string split0) >_ string escape $foo[1] alpha\ngamma ```
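A further sketch, assumed rather than taken from the examples above: `--fields` can pull a single column out of a delimited record:

```
>_ string split -f 2 : "root:x:0:0:root:/root:/bin/sh"
x
```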
fish math - perform mathematics calculations math - perform mathematics calculations ======================================= Synopsis -------- ``` math [(-s | --scale) N] [(-b | --base) BASE] EXPRESSION ... ``` Description ----------- `math` performs mathematical calculations. It supports simple operations such as addition, subtraction, and so on, as well as functions like `abs()`, `sqrt()` and `ln()`. By default, the output is a floating-point number with trailing zeroes trimmed. To get a fixed representation, the `--scale` option can be used, including `--scale=0` for integer output. Keep in mind that parameter expansion happens before expressions are evaluated. This can be very useful in order to perform calculations involving shell variables or the output of command substitutions, but it also means that parenthesis (`()`) and the asterisk (`*`) glob character have to be escaped or quoted. `x` can also be used to denote multiplication, but it needs to be followed by whitespace to distinguish it from hexadecimal numbers. Parentheses for functions are optional - `math sin pi` prints `0`. However, a comma will bind to the inner function, so `math pow sin 3, 5` is an error because it tries to give `sin` the arguments `3` and `5`. When in doubt, use parentheses. `math` ignores whitespace between arguments and takes its input as multiple arguments (internally joined with a space), so `math 2 +2` and `math "2 +    2"` work the same. `math 2 2` is an error. The following options are available: **-s** *N* or **--scale** *N* Sets the scale of the result. `N` must be an integer or the word “max” for the maximum scale. A scale of zero causes results to be truncated, not rounded. Any non-integer component is thrown away. So `3/2` returns `1` rather than `2` which `1.5` would normally round to. This is for compatibility with `bc` which was the basis for this command prior to fish 3.0.0. Scale values greater than zero causes the result to be rounded using the usual rules to the specified number of decimal places. **-b** *BASE* or **--base** *BASE* Sets the numeric base used for output (`math` always understands hexadecimal numbers as input). It currently understands “hex” or “16” for hexadecimal and “octal” or “8” for octal and implies a scale of 0 (other scales cause an error), so it will truncate the result down to an integer. This might change in the future. Hex numbers will be printed with a `0x` prefix. Octal numbers will have a prefix of `0` but aren’t understood by `math` as input. **-h** or **--help** Displays help about using this command. Return Values ------------- If the expression is successfully evaluated and doesn’t over/underflow or return NaN the return `status` is zero (success) else one. Syntax ------ `math` knows some operators, constants, functions and can (obviously) read numbers. For numbers, `.` is always the radix character regardless of locale - `2.5`, not `2,5`. Scientific notation (`10e5`) and hexadecimal (`0xFF`) are also available. `math` allows you to use underscores as visual separators for digit grouping. For example, you can write `1_000_000`, `0x_89_AB_CD_EF`, and `1.234_567_e89`. Operators --------- `math` knows the following operators: `+` for addition `-` for subtraction `* or x` for multiplication. `*` is the glob character and needs to be quoted or escaped, `x` needs to be followed by whitespace or it looks like `0x` hexadecimal notation. `/` for division `^` for exponentiation `%` for modulo `( or )` for grouping. 
These need to be quoted or escaped because `()` denotes a command substitution. They are all used in an infix manner - `5 + 2`, not `+ 5 2`. Constants --------- `math` knows the following constants: `e` Euler’s number `pi` π, you know this one. Half of Tau `tau` Equivalent to 2π, or the number of radians in a circle Use them without a leading `$` - `pi - 3` should be about 0. Functions --------- `math` supports the following functions: `abs` the absolute value, with positive sign `acos` arc cosine `asin` arc sine `atan` arc tangent `atan2` arc tangent of two variables `bitand, bitor and bitxor` perform bitwise operations. These will throw away any non-integer parts and interpret the rest as an int. Note: `bitnot` and `bitnand` don’t exist. This is because numbers in math don’t really have a *width* in terms of bits, and these operations necessarily care about leading zeroes. If you need to negate a specific number you can do it with an xor with a mask, e.g.: ``` > math --base=hex bitxor 0x0F, 0xFF 0xF0 > math --base=hex bitxor 0x2, 0x3 # Here we mask with 0x3 == 0b111, so our number is 3 bits wide # Only the 1 bit isn't set. 0x1 ``` `ceil` round number up to the nearest integer `cos` the cosine `cosh` hyperbolic cosine `exp` the base-e exponential function `fac` factorial - also known as `x!` (`x * (x - 1) * (x - 2) * ... * 1`) `floor` round number down to the nearest integer `ln` the base-e logarithm `log or log10` the base-10 logarithm `log2` the base-2 logarithm `max` returns the largest of the given numbers - this takes an arbitrary number of arguments (but at least one) `min` returns the smallest of the given numbers - this takes an arbitrary number of arguments (but at least one) `ncr` “from n choose r” combination function - how many subsets of size r can be taken from n (order doesn’t matter) `npr` the number of subsets of size r that can be taken from a set of n elements (including different order) `pow(x,y)` returns x to the y (and can be written as `x ^ y`) `round` rounds to the nearest integer, away from 0 `sin` the sine function `sinh` the hyperbolic sine `sqrt` the square root - (can also be written as `x ^ 0.5`) `tan` the tangent `tanh` the hyperbolic tangent All of the trigonometric functions use radians (the pi-based scale, not 360°). Examples -------- `math 1+1` outputs 2. `math $status - 128` outputs the numerical exit status of the last command minus 128. `math 10 / 6` outputs `1.666667`. `math -s0 10.0 / 6.0` outputs `1`. `math -s3 10 / 6` outputs `1.666`. `math "sin(pi)"` outputs `0`. `math 5 \* 2` or `math "5 * 2"` or `math 5 "*" 2` all output `10`. `math 0xFF` outputs 255, `math 0 x 3` outputs 0 (because it computes 0 multiplied by 3). `math bitand 0xFE, 0x2e` outputs 46. `math "bitor(9,2)"` outputs 11. `math --base=hex 192` prints `0xc0`. `math 'ncr(49,6)'` prints 13983816 - that’s the number of possible picks in 6-from-49 lotto. `math max 5,2,3,1` prints 5. Compatibility notes ------------------- Fish 1.x and 2.x releases relied on the `bc` command for handling `math` expressions. Starting with fish 3.0.0 fish uses the tinyexpr library and evaluates the expression without the involvement of any external commands. You don’t need to use `--` before the expression, even if it begins with a minus sign which might otherwise be interpreted as an invalid option. If you do insert `--` before the expression, it will cause option scanning to stop just like for every other command and it won’t be part of the expression. 
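One more hedged example of the scale and quoting rules described above (the numbers are arbitrary):

```
>_ math -s2 "100 * 37 / 120"
30.83
```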
fish break - stop the current inner loop

break - stop the current inner loop
===================================

Synopsis
--------

```
LOOP_CONSTRUCT
    [COMMANDS ...]
    break
    [COMMANDS ...]
end
```

Description
-----------

`break` halts a currently running loop (*LOOP\_CONSTRUCT*), such as a <for> or <while> loop. It is usually added inside of a conditional block such as an <if> block.

There are no parameters for `break`.

Example
-------

The following code searches all .c files for “smurf”, and halts at the first occurrence.

```
for i in *.c
    if grep smurf $i
        echo Smurfs are present in $i
        break
    end
end
```

See Also
--------

* the <continue> command, to skip the remainder of the current iteration of the current inner loop

fish set - display and change shell variables

set - display and change shell variables
========================================

Synopsis
--------

```
set
set (-f | --function) (-l | --local) (-g | --global) (-U | --universal)
set [-Uflg] NAME [VALUE ...]
set [-Uflg] NAME[[INDEX ...]] [VALUE ...]
set (-a | --append) [-flgU] NAME VALUE ...
set (-q | --query) (-e | --erase) [-flgU] [NAME][[INDEX]] ...
set (-S | --show) [NAME ...]
```

Description
-----------

`set` manipulates [shell variables](../language#variables).

If both *NAME* and *VALUE* are provided, `set` assigns the values to the variable *NAME*. Variables in fish are [lists](../language#variables-lists); multiple values are allowed. One or more variable *INDEX* can be specified, including ranges (not with all options). If no *VALUE* is given, the variable will be set to the empty list i.e. `''`.

If `set` is run without arguments, it prints the names and values of all shell variables in sorted order. Passing [scope](../language#variables-scope) or [export](../language#variables-export) flags allows filtering this to only matching variables, so `set --local` would only show local variables.

With `--erase` and optionally a scope flag, `set` will erase the matching variable (or the variable of that name in the smallest possible scope).

With `--show`, `set` will describe the given variable names, explaining how they have been defined - in which scope, with which values, and with which options.

The following options control variable scope:

**-U** or **--universal**
Sets a universal variable. The variable will be immediately available to all the user’s `fish` instances on the machine, and will persist across restarts of the shell.

**-f** or **--function**
Sets a variable scoped to the executing function. It is erased when the function ends.

**-l** or **--local**
Sets a locally-scoped variable in this block. It is erased when the block ends. Outside of a block, this is the same as **--function**.

**-g** or **--global**
Sets a globally-scoped variable. Global variables don’t disappear and are available to all functions running in the same shell. They can even be modified.

These options modify how variables operate:

**--export** or **-x**
Causes the specified shell variable to be exported to child processes (making it an “environment variable”).

**--unexport** or **-u**
Causes the specified shell variable to NOT be exported to child processes.

**--path**
Treat specified variable as a [path variable](../language#variables-path); variable will be split on colons (`:`) and will be displayed joined by colons when quoted (`echo "$PATH"`) or exported.

**--unpath**
Causes variable to no longer be treated as a [path variable](../language#variables-path). Note: variables ending in “PATH” are automatically path variables.
Further options: **-a** or **--append** *NAME* *VALUE* … Appends *VALUES* to the current set of values for variable **NAME**. Can be used with **--prepend** to both append and prepend at the same time. This cannot be used when assigning to a variable slice. **-p** or **--prepend** *NAME* *VALUE* … Prepends *VALUES* to the current set of values for variable **NAME**. This can be used with **--append** to both append and prepend at the same time. This cannot be used when assigning to a variable slice. **-e** or **--erase** *NAME\*[\*INDEX*] Causes the specified shell variables to be erased. Supports erasing from multiple scopes at once. Individual items in a variable at *INDEX* in brackets can be specified. **-q** or **--query** *NAME\*[\*INDEX*] Test if the specified variable names are defined. If an *INDEX* is provided, check for items at that slot. Does not output anything, but the shell status is set to the number of variables specified that were not defined, up to a maximum of 255. If no variable was given, it also returns 255. **-n** or **--names** List only the names of all defined variables, not their value. The names are guaranteed to be sorted. **-S** or **--show** Shows information about the given variables. If no variable names are given then all variables are shown in sorted order. It shows the scopes the given variables are set in, along with the values in each and whether or not it is exported. No other flags can be used with this option. **-L** or **--long** Do not abbreviate long values when printing set variables. **-h** or **--help** Displays help about using this command. If a variable is set to more than one value, the variable will be a list with the specified elements. If a variable is set to zero elements, it will become a list with zero elements. If the variable name is one or more list elements, such as `PATH[1 3 7]`, only those list elements specified will be changed. If you specify a negative index when expanding or assigning to a list variable, the index will be calculated from the end of the list. For example, the index -1 means the last index of a list. The scoping rules when creating or updating a variable are: * Variables may be explicitly set as universal, global, function, or local. Variables with the same name but in a different scope will not be changed. * If the scope of a variable is not explicitly set *but a variable by that name has been previously defined*, the scope of the existing variable is used. If the variable is already defined in multiple scopes, the variable with the narrowest scope will be updated. * If a variable’s scope is not explicitly set and there is no existing variable by that name, the variable will be local to the currently executing function. Note that this is different from using the `-l` or `--local` flag, in which case the variable will be local to the most-inner currently executing block, while without them the variable will be local to the function as a whole. If no function is executing, the variable will be set in the global scope. The exporting rules when creating or updating a variable are identical to the scoping rules for variables: * Variables may be explicitly set to either exported or not exported. When an exported variable goes out of scope, it is unexported. * If a variable is not explicitly set to be exported or not exported, but has been previously defined, the previous exporting rule for the variable is kept. 
* If a variable is not explicitly set to be either exported or unexported and has never before been defined, the variable will not be exported. In query mode, the scope to be examined can be specified. Whether the variable has to be a path variable or exported can also be specified. In erase mode, if variable indices are specified, only the specified slices of the list variable will be erased. `set` requires all options to come before any other arguments. For example, `set flags -l` will have the effect of setting the value of the variable `flags` to ‘-l’, not making the variable local. Exit status ----------- In assignment mode, `set` does not modify the exit status, but passes along whatever [`status`](../language#envvar-status) was set, including by command substitutions. This allows capturing the output and exit status of a subcommand, like in `if set output (command)`. In query mode, the exit status is the number of variables that were not found. In erase mode, `set` exits with a zero exit status in case of success, with a non-zero exit status if the commandline was invalid, if any of the variables did not exist or was a [special read-only variable](../language#variables-special). Examples -------- Print all global, exported variables: ``` > set -gx ``` Set the value of the variable \_$foo\_ to be ‘hi’.: ``` > set foo hi ``` Append the value “there” to the variable $foo: ``` > set -a foo there ``` Remove \_$smurf\_ from the scope: ``` > set -e smurf ``` Remove \_$smurf\_ from the global and universal scoeps: ``` > set -e -Ug smurf ``` Change the fourth element of the $PATH list to ~/bin: ``` > set PATH[4] ~/bin ``` Outputs the path to Python if `type -p` returns true: ``` if set python_path (type -p python) echo "Python is at $python_path" end ``` Setting a variable doesn’t modify $status; a command substitution still will, though: ``` > echo $status 0 > false > set foo bar > echo $status 1 > true > set foo banana (false) > echo $status 1 ``` `VAR=VALUE command` sets a variable for just one command, like other shells. This runs fish with a temporary home directory: ``` > HOME=(mktemp -d) fish ``` (which is essentially the same as): ``` > begin; set -lx HOME (mktemp -d); fish; end ``` Notes ----- * Fish versions prior to 3.0 supported the syntax `set PATH[1] PATH[4] /bin /sbin`, which worked like `set PATH[1 4] /bin /sbin`. fish prompt_hostname - print the hostname, shortened for use in the prompt prompt\_hostname - print the hostname, shortened for use in the prompt ====================================================================== Synopsis -------- ``` prompt_hostname ``` Description ----------- `prompt_hostname` prints a shortened version the current hostname for use in the prompt. It will print just the first component of the hostname, everything up to the first dot. Examples -------- ``` function fish_prompt echo -n (whoami)@(prompt_hostname) (prompt_pwd) '$ ' end ``` ``` # The machine's full hostname is foo.bar.com >_ prompt_hostname foo ``` fish if - conditionally execute a command if - conditionally execute a command ==================================== Synopsis -------- ``` if CONDITION; COMMANDS_TRUE ...; [else if CONDITION2; COMMANDS_TRUE2 ...;] [else; COMMANDS_FALSE ...;] end ``` Description ----------- `if` will execute the command `CONDITION`. If the condition’s exit status is 0, the commands `COMMANDS_TRUE` will execute. If the exit status is not 0 and <else> is given, `COMMANDS_FALSE` will be executed. You can use <and> or <or> in the condition. 
See the second example below. The exit status of the last foreground command to exit can always be accessed using the [$status](../language#variables-status) variable. The **-h** or **--help** option displays help about using this command. Example ------- The following code will print `foo.txt exists` if the file foo.txt exists and is a regular file, otherwise it will print `bar.txt exists` if the file bar.txt exists and is a regular file, otherwise it will print `foo.txt and bar.txt do not exist`. ``` if test -f foo.txt echo foo.txt exists else if test -f bar.txt echo bar.txt exists else echo foo.txt and bar.txt do not exist end ``` The following code will print “foo.txt exists and is readable” if foo.txt is a regular file and readable ``` if test -f foo.txt and test -r foo.txt echo "foo.txt exists and is readable" end ``` fish fish_prompt - define the appearance of the command line prompt fish\_prompt - define the appearance of the command line prompt =============================================================== Synopsis -------- ``` fish_prompt ``` ``` function fish_prompt ... end ``` Description ----------- The `fish_prompt` function is executed when the prompt is to be shown, and the output is used as a prompt. The exit status of commands within `fish_prompt` will not modify the value of [$status](../language#variables-status) outside of the `fish_prompt` function. `fish` ships with a number of example prompts that can be chosen with the `fish_config` command. Example ------- A simple prompt: ``` function fish_prompt -d "Write out the prompt" # This shows up as USER@HOST /home/user/ >, with the directory colored # $USER and $hostname are set by fish, so you can just use them # instead of using `whoami` and `hostname` printf '%s@%s %s%s%s > ' $USER $hostname \ (set_color $fish_color_cwd) (prompt_pwd) (set_color normal) end ```
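The prompt above does not display the status of the last command. As an additional illustration (not one of the sample prompts shipped with fish), a prompt can save [$status](../language#variables-status) into a local variable before running anything else and then show it; the variable name `last_status` is arbitrary:

```
function fish_prompt
    # Save $status first - any command run inside the prompt would overwrite it
    set -l last_status $status
    printf '%s%s%s [%d] > ' (set_color $fish_color_cwd) (prompt_pwd) (set_color normal) $last_status
end
```

Because the value is captured in a local variable at the top of the function, the displayed status stays accurate even though `set_color` and `prompt_pwd` change `$status` while the prompt is being built.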
fish fish_key_reader - explore what characters keyboard keys send fish\_key\_reader - explore what characters keyboard keys send ============================================================== Synopsis -------- ``` fish_key_reader [OPTIONS] ``` Description ----------- **fish\_key\_reader** is used to explain how you would bind a certain key sequence. By default, it prints the <bind> command for one key sequence read interactively over standard input. If the character sequence matches a special key name (see `bind --key-names`), both `bind CHARS ...` and `bind -k KEYNAME ...` usage will be shown. In verbose mode (enabled by passing `--verbose`), additional details about the characters received, such as the delay between chars, are written to standard error. The following options are available: **-c** or **--continuous** Begins a session where multiple key sequences can be inspected. By default the program exits after capturing a single key sequence. **-V** or **--verbose** Tells fish\_key\_reader to output timing information and explain the sequence in more detail. **-h** or **--help** Displays help about using this command. **-v** or **--version** Displays the current **fish** version and then exits. Usage Notes ----------- In verbose mode, the delay in milliseconds since the previous character was received is included in the diagnostic information written to standard error. This information may be useful to determine the optimal `fish_escape_delay_ms` setting or learn the amount of lag introduced by tools like `ssh`, `mosh` or `tmux`. `fish_key_reader` intentionally disables handling of many signals. To terminate `fish_key_reader` in `--continuous` mode do: * press `Control`+`C` twice, or * press `Control`+`D` twice, or * type `exit`, or * type `quit` Example ------- ``` > fish_key_reader Press a key: # press up-arrow bind \e\[A 'do something' > fish_key_reader --verbose Press a key: # press alt+enter hex: 1B char: \c[ (or \e) ( 0.027 ms) hex: D char: \cM (or \r) bind \e\r 'do something' ``` fish popd - move through directory stack popd - move through directory stack =================================== Synopsis -------- ``` popd ``` Description ----------- `popd` removes the top directory from the [directory stack](../interactive#directory-stack) and changes the working directory to the new top directory. Use <pushd> to add directories to the stack. The **-h** or **--help** option displays help about using this command. Example ------- ``` pushd /usr/src # Working directory is now /usr/src # Directory stack contains /usr/src pushd /usr/src/fish-shell # Working directory is now /usr/src/fish-shell # Directory stack contains /usr/src /usr/src/fish-shell popd # Working directory is now /usr/src # Directory stack contains /usr/src ``` See Also -------- * the <dirs> command to print the directory stack * the <cdh> command which provides a more intuitive way to navigate to recently visited directories. fish block - temporarily block delivery of events block - temporarily block delivery of events ============================================ Synopsis -------- ``` block [(--local | --global)] block --erase ``` Description ----------- `block` prevents events triggered by `fish` or the <emit> command from being delivered and acted upon while the block is in place. In functions, `block` can be useful while performing work that should not be interrupted by the shell. The block can be removed. Any events which triggered while the block was in place will then be delivered. 
Event blocks should not be confused with code blocks, which are created with `begin`, `if`, `while` or `for` Without options, the `block` command acts with function scope. The following options are available: **-l** or **--local** Release the block automatically at the end of the current innermost code block scope. **-g** or **--global** Never automatically release the lock. **-e** or **--erase** Release global block. **-h** or **--help** Displays help about using this command. Example ------- ``` # Create a function that listens for events function --on-event foo foo; echo 'foo fired'; end # Block the delivery of events block -g emit foo # No output will be produced block -e # 'foo fired' will now be printed ``` Notes ----- Events are only received from the current fish process as there is no way to send events from one fish process to another (yet). fish fish_status_to_signal - convert exit codes to human-friendly signals fish\_status\_to\_signal - convert exit codes to human-friendly signals ======================================================================= Synopsis -------- ``` fish_status_to_signal NUM ``` ``` function fish_prompt echo -n (fish_status_to_signal $pipestatus | string join '|') (prompt_pwd) '$ ' end ``` Description ----------- `fish_status_to_signal` converts exit codes to their corresponding human-friendly signals if one exists. This is likely to be useful for prompts in conjunction with the `$status` and `$pipestatus` variables. Example ------- ``` >_ sleep 5 ^C⏎ >_ fish_status_to_signal $status SIGINT ``` fish begin - start a new block of code begin - start a new block of code ================================= Synopsis -------- ``` begin; [COMMANDS ...]; end ``` Description ----------- `begin` is used to create a new block of code. A block allows the introduction of a new [variable scope](../language#variables-scope), redirection of the input or output of a set of commands as a group, or to specify precedence when using the conditional commands like `and`. The block is unconditionally executed. `begin; ...; end` is equivalent to `if true; ...; end`. `begin` does not change the current exit status itself. After the block has completed, `$status` will be set to the status returned by the most recent command. The **-h** or **--help** option displays help about using this command. Example ------- The following code sets a number of variables inside of a block scope. Since the variables are set inside the block and have local scope, they will be automatically deleted when the block ends. ``` begin set -l PIRATE Yarrr ... end echo $PIRATE # This will not output anything, since the PIRATE variable # went out of scope at the end of the block ``` In the following code, all output is redirected to the file out.html. ``` begin echo $xml_header echo $html_header if test -e $file ... end ... end > out.html ``` fish function - create a function function - create a function ============================ Synopsis -------- ``` function NAME [OPTIONS]; BODY; end ``` Description ----------- `function` creates a new function *NAME* with the body *BODY*. A function is a list of commands that will be executed when the name of the function is given as a command. The following options are available: **-a** *NAMES* or **--argument-names** *NAMES* Assigns the value of successive command-line arguments to the names given in *NAMES*. These are the same arguments given in [`argv`](../language#envvar-argv), and are still available there. 
See also [Argument Handling](../language#variables-argv). **-d** *DESCRIPTION* or **--description** *DESCRIPTION* A description of what the function does, suitable as a completion description. **-w** *WRAPPED\_COMMAND* or **--wraps** *WRAPPED\_COMMAND* Inherit completions from the given *WRAPPED\_COMMAND*. See the documentation for <complete> for more information. **-e** *EVENT\_NAME* or **--on-event** *EVENT\_NAME* Run this function when the specified named event is emitted. Fish internally generates named events, for example,when showing the prompt. Custom events can be emitted using the <emit> command. **-v** *VARIABLE\_NAME* or **--on-variable** *VARIABLE\_NAME* Run this function when the variable *VARIABLE\_NAME* changes value. Note that **fish** makes no guarantees on any particular timing or even that the function will be run for every single `set`. Rather it will be run when the variable has been set at least once, possibly skipping some values or being run when the variable has been set to the same value (except for universal variables set in other shells - only changes in the value will be picked up for those). **-j** *PID* or **--on-job-exit** *PID* Run this function when the job containing a child process with the given process identifier *PID* exits. Instead of a PID, the string ‘caller’ can be specified. This is only allowed when in a command substitution, and will result in the handler being triggered by the exit of the job which created this command substitution. **-p** *PID* or **--on-process-exit** *PID* Run this function when the fish child process with process ID PID exits. Instead of a PID, for backward compatibility, “`%self`” can be specified as an alias for `$fish_pid`, and the function will be run when the current fish instance exits. **-s** *SIGSPEC* or **--on-signal** *SIGSPEC* Run this function when the signal `SIGSPEC` is delivered. `SIGSPEC` can be a signal number, or the signal name, such as `SIGHUP` (or just `HUP`). Note that the signal must have been delivered to **fish**; for example, ``Ctrl`-`C`` sends `SIGINT` to the foreground process group, which will not be **fish** if you are running another command at the time. Observing a signal will prevent fish from exiting in response to that signal. **-S** or **--no-scope-shadowing** Allows the function to access the variables of calling functions. Normally, any variables inside the function that have the same name as variables from the calling function are “shadowed”, and their contents are independent of the calling function. It’s important to note that this does not capture referenced variables or the scope at the time of function declaration! At this time, fish does not have any concept of closures, and variable lifetimes are never extended. In other words, by using **--no-scope-shadowing** the scope of the function each time it is run is shared with the scope it was *called* from rather than the scope it was *defined* in. **-V** or **--inherit-variable NAME** Snapshots the value of the variable `NAME` and defines a local variable with that same name and value when the function is defined. This is similar to a closure in other languages like Python but a bit different. Note the word “snapshot” in the first sentence. If you change the value of the variable after defining the function, even if you do so in the same scope (typically another function) the new value will not be used by the function you just created using this option. See the `function notify` example below for how this might be used. 
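For example, a minimal sketch of a handler registered with **--on-variable** (the function name `report_path_change` is illustrative, not a helper shipped with fish):

```
function report_path_change --on-variable PATH
    # Runs after $PATH has been set; intermediate values may be skipped
    echo "PATH is now: $PATH" >&2
end
```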
The event handler switches (`on-event`, `on-variable`, `on-job-exit`, `on-process-exit` and `on-signal`) cause a function to run automatically at specific events. New named events for `--on-event` can be fired using the <emit> builtin. Fish already generates a few events, see [Event handlers](../language#event) for more. Functions may not be named the same as a reserved keyword. These are elements of fish syntax or builtin commands which are essential for the operations of the shell. Current reserved words are `[`, `_`, `and`, `argparse`, `begin`, `break`, `builtin`, `case`, `command`, `continue`, `else`, `end`, `eval`, `exec`, `for`, `function`, `if`, `not`, `or`, `read`, `return`, `set`, `status`, `string`, `switch`, `test`, `time`, and `while`. Example ------- ``` function ll ls -l $argv end ``` will run the `ls` command, using the `-l` option, while passing on any additional files and switches to `ls`. ``` function mkdir -d "Create a directory and set CWD" command mkdir $argv if test $status = 0 switch $argv[(count $argv)] case '-*' case '*' cd $argv[(count $argv)] return end end end ``` This will run the `mkdir` command, and if it is successful, change the current working directory to the one just created. ``` function notify set -l job (jobs -l -g) or begin; echo "There are no jobs" >&2; return 1; end function _notify_job_$job --on-job-exit $job --inherit-variable job echo -n \a # beep functions -e _notify_job_$job end end ``` This will beep when the most recent job completes. Notes ----- Events are only received from the current fish process as there is no way to send events from one fish process to another. See more -------- For more explanation of how functions fit into fish, see [Functions](../language#syntax-function). fish switch - conditionally execute a block of commands switch - conditionally execute a block of commands ================================================== Synopsis -------- ``` switch VALUE; [case [GLOB ...]; [COMMANDS ...]; ...] end ``` Description ----------- `switch` performs one of several blocks of commands, depending on whether a specified value equals one of several globbed values. `case` is used together with the `switch` statement in order to determine which block should be executed. Each `case` command is given one or more parameters. The first `case` command with a parameter that matches the string specified in the switch command will be evaluated. `case` parameters may contain globs. These need to be escaped or quoted in order to avoid regular glob expansion using filenames. Note that fish does not fall through on case statements. Only the first matching case is executed. Note that <break> cannot be used to exit a case/switch block early like in other languages. It can only be used in loops. Note that command substitutions in a case statement will be evaluated even if its body is not taken. All substitutions, including command substitutions, must be performed before the value can be compared against the parameter. Example ------- If the variable `animal` contains the name of an animal, the following code would attempt to classify it: ``` switch $animal case cat echo evil case wolf dog human moose dolphin whale echo mammal case duck goose albatross echo bird case shark trout stingray echo fish case '*' echo I have no idea what a $animal is end ``` If the above code was run with `$animal` set to `whale`, the output would be `mammal`. 
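As a further illustration of quoting `case` globs so that they match against the switch value rather than expanding to filenames (a sketch; the variable `$filename` is assumed to hold a file name):

```
switch $filename
    case '*.txt' '*.md'
        echo plain text
    case '*.jpg' '*.png' '*.gif'
        echo image
    case '*'
        echo something else
end
```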
fish or - conditionally execute a command or - conditionally execute a command ==================================== Synopsis -------- ``` COMMAND1; or COMMAND2 ``` Description ----------- `or` is used to execute a command if the previous command was not successful (returned a status of something other than 0). `or` statements may be used as part of the condition in an <if> or <while> block. `or` does not change the current exit status itself, but the command it runs most likely will. The exit status of the last foreground command to exit can always be accessed using the [$status](../language#variables-status) variable. The **-h** or **--help** option displays help about using this command. Example ------- The following code runs the `make` command to build a program. If the build succeeds, the program is installed. If either step fails, `make clean` is run, which removes the files created by the build process. ``` make; and make install; or make clean ``` See Also -------- * <and> command fish exit - exit the shell exit - exit the shell ===================== Synopsis -------- ``` exit [CODE] ``` Description ----------- **exit** is a special builtin that causes the shell to exit. Either 255 or the *CODE* supplied is used, whichever is lesser. Otherwise, the exit status will be that of the last command executed. If exit is called while sourcing a file (using the <source> builtin) the rest of the file will be skipped, but the shell itself will not exit. The **--help** or **-h** option displays help about using this command. fish cd - change directory cd - change directory ===================== Synopsis -------- ``` cd [DIRECTORY] ``` Description ----------- `cd` changes the current working directory. If *DIRECTORY* is given, it will become the new directory. If no parameter is given, the [`HOME`](../language#envvar-HOME) environment variable will be used. If *DIRECTORY* is a relative path, all the paths in the [`CDPATH`](../language#envvar-CDPATH) will be tried as prefixes for it, in addition to [`PWD`](../language#envvar-PWD). It is recommended to keep **.** as the first element of [`CDPATH`](../language#envvar-CDPATH), or [`PWD`](../language#envvar-PWD) will be tried last. Fish will also try to change directory if given a command that looks like a directory (starting with **.**, **/** or **~**, or ending with **/**), without explicitly requiring **cd**. Fish also ships a wrapper function around the builtin **cd** that understands `cd -` as changing to the previous directory. See also <prevd>. This wrapper function maintains a history of the 25 most recently visited directories in the `$dirprev` and `$dirnext` global variables. If you make those universal variables your **cd** history is shared among all fish instances. As a special case, `cd .` is equivalent to `cd $PWD`, which is useful in cases where a mountpoint has been recycled or a directory has been removed and recreated. The **--help** or **-h** option displays help about using this command, and does not change the directory. Examples -------- ``` cd # changes the working directory to your home directory. 
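cd .. # changes the working directory to the parent directory
cd - # changes the working directory to the previous directory (via the wrapper function described above)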
cd /usr/src/fish-shell # changes the working directory to /usr/src/fish-shell ``` See Also -------- Navigate directories using the [directory history](../interactive#directory-history) or the [directory stack](../interactive#directory-stack) fish printf - display text according to a format string printf - display text according to a format string ================================================== Synopsis -------- ``` printf FORMAT [ARGUMENT ...] ``` Description ----------- `printf` uses the format string *FORMAT* to print the *ARGUMENT* arguments. This means that it takes format specifiers in the format string and replaces each with an argument. The *FORMAT* argument is re-used as many times as necessary to convert all of the given arguments. So `printf %s\n flounder catfish clownfish shark` will print four lines. Unlike <echo>, `printf` does not append a new line unless it is specified as part of the string. It doesn’t support any options, so there is no need for a `--` separator, which makes it easier to use for arbitrary input than `echo`. [[1]](#id2) Format Specifiers ----------------- Valid format specifiers are taken from the C library function `printf(3)`: * `%d` or `%i`: Argument will be used as decimal integer (signed or unsigned) * `%o`: An octal unsigned integer * `%u`: An unsigned decimal integer - this means negative numbers will wrap around * `%x` or `%X`: An unsigned hexadecimal integer * `%f`, `%g` or `%G`: A floating-point number. `%f` defaults to 6 places after the decimal point (which is locale-dependent - e.g. in de\_DE it will be a `,`). `%g` and `%G` will trim trailing zeroes and switch to scientific notation (like `%e`) if the numbers get small or large enough. * `%e` or `%E`: A floating-point number in scientific (XXXeYY) notation * `%s`: A string * `%b`: As a string, interpreting backslash escapes, except that octal escapes are of the form 0 or 0ooo. `%%` signifies a literal “%”. Conversion can fail, e.g. “102.234” can’t losslessly convert to an integer, causing printf to print an error. If you are okay with losing information, silence errors with `2>/dev/null`. A number between the `%` and the format letter specifies the width. The result will be left-padded with spaces. Backslash Escapes ----------------- printf also knows a number of backslash escapes: * `\"` double quote * `\\` backslash * `\a` alert (bell) * `\b` backspace * `\c` produce no further output * `\e` escape * `\f` form feed * `\n` new line * `\r` carriage return * `\t` horizontal tab * `\v` vertical tab * `\ooo` octal number (ooo is 1 to 3 digits) * `\xhh` hexadecimal number (hhh is 1 to 2 digits) * `\uhhhh` 16-bit Unicode character (hhhh is 4 digits) * `\Uhhhhhhhh` 32-bit Unicode character (hhhhhhhh is 8 digits) Errors and Return Status ------------------------ If the given argument doesn’t work for the given format (like when you try to convert a number like 3.141592 to an integer), printf prints an error, to stderr. printf will then also return non-zero, but will still try to print as much as it can. It will also return non-zero if no argument at all was given, in which case it will print nothing. This printf has been imported from the printf in GNU Coreutils version 6.9. If you would like to use a newer version of printf, for example the one shipped with your OS, try `command printf`. Example ------- ``` printf '%s\t%s\n' flounder fish ``` Will print “flounder fish” (separated with a tab character), followed by a newline character. 
This is useful for writing completions, as fish expects completion scripts to output the option followed by the description, separated with a tab character. ``` printf '%s: %d' "Number of bananas in my pocket" 42 ``` Will print “Number of bananas in my pocket: 42”, *without* a newline. See Also -------- * the <echo> command, for simpler output Footnotes ---------
fish string-unescape - expand escape sequences string-unescape - expand escape sequences ========================================= Synopsis -------- ``` string escape [-n | --no-quoted] [--style=] [STRING ...] string unescape [--style=] [STRING ...] ``` Description ----------- `string escape` escapes each *STRING* in one of three ways. The first is **--style=script**. This is the default. It alters the string such that it can be passed back to `eval` to produce the original argument again. By default, all special characters are escaped, and quotes are used to simplify the output when possible. If **-n** or **--no-quoted** is given, the simplifying quoted format is not used. Exit status: 0 if at least one string was escaped, or 1 otherwise. **--style=var** ensures the string can be used as a variable name by hex encoding any non-alphanumeric characters. The string is first converted to UTF-8 before being encoded. **--style=url** ensures the string can be used as a URL by hex encoding any character which is not legal in a URL. The string is first converted to UTF-8 before being encoded. **--style=regex** escapes an input string for literal matching within a regex expression. The string is first converted to UTF-8 before being encoded. `string unescape` performs the inverse of the `string escape` command. If the string to be unescaped is not properly formatted it is ignored. For example, doing `string unescape --style=var (string escape --style=var $str)` will return the original string. There is no support for unescaping **--style=regex**. Examples -------- ``` >_ echo \x07 | string escape \cg >_ string escape --style=var 'a1 b2'\u6161 a1_20_b2_E6_85_A1_ ``` fish isatty - test if a file descriptor is a terminal isatty - test if a file descriptor is a terminal ================================================ Synopsis -------- ``` isatty [FILE_DESCRIPTOR] ``` Description ----------- `isatty` tests if a file descriptor is a terminal (as opposed to a file). The name is derived from the system call of the same name, which for historical reasons refers to a teletypewriter (TTY). `FILE DESCRIPTOR` may be either the number of a file descriptor, or one of the strings `stdin`, `stdout`, or `stderr`. If not specified, zero is assumed. If the specified file descriptor is a terminal device, the exit status of the command is zero. Otherwise, the exit status is non-zero. No messages are printed to standard error. The **-h** or **--help** option displays help about using this command. Examples -------- From an interactive shell, the commands below exit with a return value of zero: ``` isatty isatty stdout isatty 2 echo | isatty 1 ``` And these will exit non-zero: ``` echo | isatty isatty 9 isatty stdout > file isatty 2 2> file ``` fish umask - set or get the file creation mode mask umask - set or get the file creation mode mask ============================================== Synopsis -------- ``` umask [OPTIONS] [MASK] ``` Description ----------- `umask` displays and manipulates the “umask”, or file creation mode mask, which is used to restrict the default access to files. The umask may be expressed either as an octal number, which represents the rights that will be removed by default, or symbolically, which represents the only rights that will be granted by default. Access rights are explained in the manual page for the `chmod(1)` program. With no parameters, the current file creation mode mask is printed as an octal number. **-S** or **--symbolic** Prints the umask in symbolic form instead of octal form. 
**-p** or **--as-command** Outputs the umask in a form that may be reused as input. **-h** or **--help** Displays help about using this command. If a numeric mask is specified as a parameter, the current shell’s umask will be set to that value, and the rights specified by that mask will be removed from new files and directories by default. If a symbolic mask is specified, the desired permission bits, and not the inverse, should be specified. A symbolic mask is a comma separated list of rights. Each right consists of three parts: * The first part specifies to whom this set of right applies, and can be one of `u`, `g`, `o` or `a`, where `u` specifies the user who owns the file, `g` specifies the group owner of the file, `o` specific other users rights and `a` specifies all three should be changed. * The second part of a right specifies the mode, and can be one of `=`, `+` or `-`, where `=` specifies that the rights should be set to the new value, `+` specifies that the specified right should be added to those previously specified and `-` specifies that the specified rights should be removed from those previously specified. * The third part of a right specifies what rights should be changed and can be any combination of `r`, `w` and `x`, representing read, write and execute rights. If the first and second parts are skipped, they are assumed to be `a` and `=`, respectively. As an example, `r,u+w` means all users should have read access and the file owner should also have write access. Note that symbolic masks currently do not work as intended. Example ------- `umask 177` or `umask u=rw` sets the file creation mask to read and write for the owner and no permissions at all for any other users. fish for - perform a set of commands multiple times for - perform a set of commands multiple times ============================================== Synopsis -------- ``` for VARNAME in [VALUES ...]; COMMANDS ...; end ``` Description ----------- **for** is a loop construct. It will perform the commands specified by *COMMANDS* multiple times. On each iteration, the local variable specified by *VARNAME* is assigned a new value from *VALUES*. If *VALUES* is empty, *COMMANDS* will not be executed at all. The *VARNAME* is visible when the loop terminates and will contain the last value assigned to it. If *VARNAME* does not already exist it will be set in the local scope. For our purposes if the **for** block is inside a function there must be a local variable with the same name. If the **for** block is not nested inside a function then global and universal variables of the same name will be used if they exist. Much like <set>, **for** does not modify $status, but the evaluation of its subordinate commands can. The **-h** or **--help** option displays help about using this command. Example ------- ``` for i in foo bar baz; echo $i; end # would output: foo bar baz ``` Notes ----- The `VARNAME` was local to the for block in releases prior to 3.0.0. This means that if you did something like this: ``` for var in a b c if break_from_loop break end end echo $var ``` The last value assigned to `var` when the loop terminated would not be available outside the loop. What `echo $var` would write depended on what it was set to before the loop was run. Likely nothing. fish wait - wait for jobs to complete wait - wait for jobs to complete ================================ Synopsis -------- ``` wait [-n | --any] [PID | PROCESS_NAME] ... ``` Description ----------- `wait` waits for child jobs to complete. 
If a *PID* is specified, the command waits for the job that the process with that process ID belongs to. If a *PROCESS\_NAME* is specified, the command waits for the jobs that the matched processes belong to. If neither a pid nor a process name is specified, the command waits for all background jobs. If the **-n** or **--any** flag is provided, the command returns as soon as the first job completes. If it is not provided, it returns after all jobs complete. The **-h** or **--help** option displays help about using this command. Example ------- ``` sleep 10 & wait $last_pid ``` spawns `sleep` in the background, and then waits until it finishes. ``` for i in (seq 1 5); sleep 10 &; end wait ``` spawns five jobs in the background, and then waits until all of them finishes. ``` for i in (seq 1 5); sleep 10 &; end hoge & wait sleep ``` spawns five jobs and `hoge` in the background, and then waits until all `sleep`s finish, and doesn’t wait for `hoge` finishing. fish fish_config - start the web-based configuration interface fish\_config - start the web-based configuration interface ========================================================== Synopsis -------- ``` fish_config [browse] fish_config prompt (choose | list | save | show) fish_config theme (choose | demo | dump | list | save | show) ``` Description ----------- `fish_config` is used to configure fish. Without arguments or with the `browse` command it starts the web-based configuration interface. The web interface allows you to view your functions, variables and history, and to make changes to your prompt and color configuration. It starts a local web server and opens a browser window. When you are finished, close the browser window and press the Enter key to terminate the configuration session. If the `BROWSER` environment variable is set, it will be used as the name of the web browser to open instead of the system default. With the `prompt` command `fish_config` can be used to view and choose a prompt from fish’s sample prompts inside the terminal directly. Available subcommands for the `prompt` command: * `choose` loads a sample prompt in the current session. * `list` lists the names of the available sample prompts. * `save` saves the current prompt to a file (via <funcsave>). * `show` shows what the given sample prompts (or all) would look like. With the `theme` command `fish_config` can be used to view and choose a theme (meaning a color scheme) inside the terminal. Available subcommands for the `theme` command: * `choose` loads a sample theme in the current session. * `demo` displays some sample text in the current theme. * `dump` prints the current theme in a loadable format. * `list` lists the names of the available sample themes. * `save` saves the given theme to [universal variables](../language#variables-universal). * `show` shows what the given sample theme (or all) would look like. The themes are loaded from the theme directory shipped with fish or a `themes` directory in the fish configuration directory (typically `~/.config/fish/themes`). The **-h** or **--help** option displays help about using this command. Example ------- `fish_config` or `fish_config browse` opens a new web browser window and allows you to configure certain fish settings. `fish_config prompt show` demos the available sample prompts. `fish_config prompt choose disco` makes the disco prompt the prompt for the current session. This can also be used in [config.fish](../language#configuration) to set the prompt. 
`fish_config prompt save` saves the current prompt to an [autoloaded](../language#syntax-function-autoloading) file. `fish_config prompt save default` chooses the default prompt and saves it. fish end - end a block of commands end - end a block of commands ============================= Synopsis -------- ``` begin [COMMANDS ...] end ``` ``` function NAME [OPTIONS]; COMMANDS ...; end if CONDITION; COMMANDS_TRUE ...; [else; COMMANDS_FALSE ...;] end switch VALUE; [case [WILDCARD ...]; [COMMANDS ...]; ...] end while CONDITION; COMMANDS ...; end for VARNAME in [VALUES ...]; COMMANDS ...; end ``` Description ----------- The **end** keyword ends a block of commands started by one of the following commands: * <begin> to start a block of commands * <function> to define a function * <if>, <switch> to conditionally execute commands * <while>, <for> to perform commands multiple times The **end** keyword does not change the current exit status. Instead, the status after it will be the status returned by the most recent command. fish path - manipulate and check paths path - manipulate and check paths ================================= Synopsis -------- ``` path basename GENERAL_OPTIONS [PATH ...] path dirname GENERAL_OPTIONS [PATH ...] path extension GENERAL_OPTIONS [PATH ...] path filter GENERAL_OPTIONS [-v | --invert] [-d] [-f] [-l] [-r] [-w] [-x] [(-t | --type) TYPE] [(-p | --perm) PERMISSION] [PATH ...] path is GENERAL_OPTIONS [(-v | --invert)] [(-t | --type) TYPE] [-d] [-f] [-l] [-r] [-w] [-x] [(-p | --perm) PERMISSION] [PATH ...] path mtime GENERAL_OPTIONS [(-R | --relative)] [PATH ...] path normalize GENERAL_OPTIONS [PATH ...] path resolve GENERAL_OPTIONS [PATH ...] path change-extension GENERAL_OPTIONS EXTENSION [PATH ...] path sort GENERAL_OPTIONS [-r | --reverse] [-u | --unique] [--key=basename|dirname|path] [PATH ...] GENERAL_OPTIONS [-z | --null-in] [-Z | --null-out] [-q | --quiet] ``` Description ----------- `path` performs operations on paths. PATH arguments are taken from the command line unless standard input is connected to a pipe or a file, in which case they are read from standard input, one PATH per line. It is an error to supply PATH arguments on both the command line and on standard input. Arguments starting with `-` are normally interpreted as switches; `--` causes the following arguments not to be treated as switches even if they begin with `-`. Switches and required arguments are recognized only on the command line. When a path starts with `-`, `path filter` and `path normalize` will prepend `./` on output to avoid it being interpreted as an option otherwise, so it’s safe to pass path’s output to other commands that can handle relative paths. All subcommands accept a `-q` or `--quiet` switch, which suppresses the usual output but exits with the documented status. In this case these commands will quit early, without reading all of the available input. All subcommands also accept a `-Z` or `--null-out` switch, which makes them print output separated with NUL instead of newlines. This is for further processing, e.g. passing to another `path`, or `xargs -0`. This is not recommended when the output goes to the terminal or a command substitution. All subcommands also accept a `-z` or `--null-in` switch, which makes them accept arguments from stdin separated with NULL-bytes. Since Unix paths can’t contain NULL, that makes it possible to handle all possible paths and read input from e.g. `find -print0`. If arguments are given on the commandline this has no effect. 
This should mostly be unnecessary since `path` automatically starts splitting on NULL if one appears in the first PATH\_MAX bytes, PATH\_MAX being the operating system’s maximum length for a path plus a NULL byte. Some subcommands operate on the paths as strings and so work on nonexistent paths, while others need to access the paths themselves and so filter out nonexistent paths. The following subcommands are available. “basename” subcommand --------------------- ``` path basename [-z | --null-in] [-Z | --null-out] [-q | --quiet] [PATH ...] ``` `path basename` returns the last path component of the given path, by removing the directory prefix and removing trailing slashes. In other words, it is the part that is not the dirname. For files you might call it the “filename”. It returns 0 if there was a basename, i.e. if the path wasn’t empty or just slashes. ### Examples ``` >_ path basename ./foo.mp4 foo.mp4 >_ path basename ../banana banana >_ path basename /usr/bin/ bin >_ path basename /usr/bin/* # This prints all files in /usr/bin/ # A selection: cp fish grep rm ``` “dirname” subcommand -------------------- ``` path dirname [-z | --null-in] [-Z | --null-out] [-q | --quiet] [PATH ...] ``` `path dirname` returns the dirname for the given path. This is the part before the last “/”, discounting trailing slashes. In other words, it is the part that is not the basename (discounting superfluous slashes). It returns 0 if there was a dirname, i.e. if the path wasn’t empty or just slashes. ### Examples ``` >_ path dirname ./foo.mp4 . >_ path dirname ../banana .. >_ path dirname /usr/bin/ /usr ``` “extension” subcommand ---------------------- ``` path extension [-z | --null-in] [-Z | --null-out] [-q | --quiet] [PATH ...] ``` `path extension` returns the extension of the given path. This is the part after (and including) the last “.”, unless that “.” followed a “/” or the basename is “.” or “..”, in which case there is no extension and an empty line is printed. If the filename ends in a “.”, only a “.” is printed. It returns 0 if there was an extension. ### Examples ``` >_ path extension ./foo.mp4 .mp4 >_ path extension ../banana # an empty line, status 1 >_ path extension ~/.config # an empty line, status 1 >_ path extension ~/.config.d .d >_ path extension ~/.config. . >_ set -l path (path change-extension '' ./foo.mp4) >_ set -l extension (path extension ./foo.mp4) > echo $path$extension # reconstructs the original path again. ./foo.mp4 ``` “filter” subcommand ------------------- ``` path filter [-z | --null-in] [-Z | --null-out] [-q | --quiet] \ [-d] [-f] [-l] [-r] [-w] [-x] \ [-v | --invert] [(-t | --type) TYPE] [(-p | --perm) PERMISSION] [PATH ...] ``` `path filter` returns all of the given paths that match the given checks. In all cases, the paths need to exist, nonexistent paths are always filtered. The available filters are: * `-t` or `--type` with the options: “dir”, “file”, “link”, “block”, “char”, “fifo” and “socket”, in which case the path needs to be a directory, file, link, block device, character device, named pipe or socket, respectively. * `-d`, `-f` and `-l` are short for `--type=dir`, `--type=file` and `--type=link`, respectively. There are no shortcuts for the other types. * `-p` or `--perm` with the options: “read”, “write”, and “exec”, as well as “suid”, “sgid”, “user” (referring to the path owner) and “group” (referring to the path’s group), in which case the path needs to have all of the given permissions for the current user. 
* `-r`, `-w` and `-x` are short for `--perm=read`, `--perm=write` and `--perm=exec`, respectively. There are no shortcuts for the other permissions. Note that the path needs to be *any* of the given types, but have *all* of the given permissions. This is because having a path that is both writable and executable makes sense, but having a path that is both a directory and a file doesn’t. Links will count as the type of the linked-to file, so links to files count as files, links to directories count as directories. The filter options can either be given as multiple options, or comma-separated - `path filter -t dir,file` or `path filter --type dir --type file` are equivalent. With `--invert`, the meaning of the filtering is inverted - any path that wouldn’t pass (including by not existing) passes, and any path that would pass fails. When a path starts with `-`, `path filter` will prepend `./` to avoid it being interpreted as an option otherwise. It returns 0 if at least one path passed the filter. `path is` is shorthand for `path filter -q`, i.e. just checking without producing output, see [The is subcommand](#cmd-path-is). ### Examples ``` >_ path filter /usr/bin /usr/argagagji # The (hopefully) nonexistent argagagji is filtered implicitly: /usr/bin >_ path filter --type file /usr/bin /usr/bin/fish # Only fish is a file /usr/bin/fish >_ path filter --type file,dir --perm exec,write /usr/bin/fish /home/me # fish is a file, which passes, and executable, which passes, # but probably not writable, which fails. # # $HOME is a directory and both writable and executable, typically. # So it passes. /home/me >_ path filter -fdxw /usr/bin/fish /home/me # This is the same as above: "-f" is "--type=file", "-d" is "--type=dir", # "-x" is short for "--perm=exec" and "-w" short for "--perm=write"! /home/me >_ path filter -fx $PATH/* # Prints all possible commands - the first entry of each name is what fish would execute! ``` “is” subcommand --------------- ``` path is [-z | --null-in] [-Z | --null-out] [-q | --quiet] \ [-d] [-f] [-l] [-r] [-w] [-x] \ [-v | --invert] [(-t | --type) TYPE] [(-p | --perm) PERMISSION] [PATH ...] ``` `path is` is short for `path filter -q`. It returns true if any of the given files passes the filter, but does not produce any output. `--quiet` can still be passed for compatibility but is redundant. The options are the same as for `path filter`. ### Examples ``` >_ path is /usr/bin /usr/argagagji # /usr/bin exists, so this returns a status of 0 (true). It prints nothing. >_ path is /usr/argagagji # /usr/argagagji does not, so this returns a status of 1 (false). It also prints nothing. >_ path is -fx /bin/sh # /bin/sh is usually an executable file, so this returns true. ``` “mtime” subcommand ------------------ ``` path mtime [-z | --null-in] [-Z | --null-out] [-q | --quiet] [-R | --relative] [PATH ...] ``` `path mtime` returns the last modification time (“mtime” in unix jargon) of the given paths, in seconds since the unix epoch (the beginning of the 1st of January 1970). With `--relative` (or `-R`), it prints the number of seconds since the modification time. It only reads the current time once at start, so in case multiple paths are given the times are all relative to the *start* of `path mtime -R` running. If you want to know if a file is newer or older than another file, consider using `test -nt` instead. See [the test documentation](test). It returns 0 if reading mtime for any path succeeded. 
### Examples ``` >_ date +%s # This prints the current time as seconds since the epoch 1657217847 >_ path mtime /etc/ 1657213796 >_ path mtime -R /etc/ 4078 # So /etc/ on this system was last modified a little over an hour ago # This is the same as >_ math (date +%s) - (path mtime /etc/) ``` “normalize” subcommand ---------------------- ``` path normalize [-z | --null-in] [-Z | --null-out] [-q | --quiet] [PATH ...] ``` `path normalize` returns the normalized versions of all paths. That means it squashes duplicate “/” (except for two leading “//”), collapses “../” with earlier components and removes “.” components. Unlike `realpath` or `path resolve`, it does not make the paths absolute. It also does not resolve any symlinks. As such it can operate on non-existent paths. Because it operates on paths as strings and doesn’t resolve symlinks, it works sort of like `pwd -L` and `cd`. E.g. `path normalize link/..` will return `.`, just like `cd link; cd ..` would return to the current directory. For a physical view of the filesystem, see `path resolve`. Leading “./” components are usually removed. But when a path starts with `-`, `path normalize` will add it instead to avoid confusion with options. It returns 0 if any normalization was done, i.e. any given path wasn’t in canonical form. ### Examples ``` >_ path normalize /usr/bin//../../etc/fish # The "//" is squashed and the ".." components neutralize the components before /etc/fish >_ path normalize /bin//bash # The "//" is squashed, but /bin isn't resolved even if your system links it to /usr/bin. /bin/bash >_ path normalize ./my/subdirs/../sub2 my/sub2 >_ path normalize -- -/foo ./-/foo ``` “resolve” subcommand -------------------- ``` path resolve [-z | --null-in] [-Z | --null-out] [-q | --quiet] [PATH ...] ``` `path resolve` returns the normalized, physical and absolute versions of all paths. That means it resolves symlinks and does what `path normalize` does: it squashes duplicate “/”, collapses “../” with earlier components and removes “.” components. Then it turns that path into the absolute path starting from the filesystem root “/”. It is similar to `realpath`, as it creates the “real”, canonical version of the path. However, for paths that can’t be resolved, e.g. if they don’t exist or form a symlink loop, it will resolve as far as it can and normalize the rest. Because it resolves symlinks, it works sort of like `pwd -P`. E.g. `path resolve link/..` will return the parent directory of what the link points to, just like `cd link; cd (pwd -P)/..` would go to it. For a logical view of the filesystem, see `path normalize`. It returns 0 if any normalization or resolution was done, i.e. any given path wasn’t in canonical form. ### Examples ``` >_ path resolve /bin//sh # The "//" is squashed, and /bin is resolved if your system links it to /usr/bin. # sh here is bash (this is common on linux systems) /usr/bin/bash >_ path resolve /bin/foo///bar/../baz # Assuming /bin exists and is a symlink to /usr/bin, but /bin/foo doesn't. # This resolves the /bin/ and normalizes the nonexistent rest: /usr/bin/foo/baz ``` “change-extension” subcommand ----------------------------- ``` path change-extension [-z | --null-in] [-Z | --null-out] \ [-q | --quiet] EXTENSION [PATH ...] ``` `path change-extension` returns the given paths, with their extension changed to the given new extension. 
The extension is the part after (and including) the last “.”, unless that “.” followed a “/” or the basename is “.” or “..”, in which case there is no previous extension and the new one is simply added. If the extension is empty, any previous extension is stripped, along with the “.”. This is, of course, the inverse of `path extension`. One leading dot on the extension is ignored, so “.mp3” and “mp3” are treated the same. It returns 0 if it was given any paths. ### Examples ``` >_ path change-extension mp4 ./foo.wmv ./foo.mp4 >_ path change-extension .mp4 ./foo.wmv ./foo.mp4 >_ path change-extension '' ../banana ../banana # but status 1, because there was no extension. >_ path change-extension '' ~/.config /home/alfa/.config # status 1 >_ path change-extension '' ~/.config.d /home/alfa/.config # status 0 >_ path change-extension '' ~/.config. /home/alfa/.config # status 0 ``` “sort” subcommand ----------------- ``` path sort [-z | --null-in] [-Z | --null-out] \ [-q | --quiet] [-r | --reverse] \ [--key=basename|dirname|path] [PATH ...] ``` `path sort` returns the given paths in sorted order. They are sorted in the same order as globs - alphabetically, but with runs of numerical digits compared numerically. With `--reverse` or `-r` the sort is reversed. With `--key=` only the given part of the path is compared, e.g. `--key=dirname` causes only the dirname to be compared, `--key=basename` only the basename and `--key=path` causes the entire path to be compared (this is the default). With `--unique` or `-u` the sort is deduplicated, meaning only the first of a run that have the same key is kept. So if you are sorting by basename, then only the first of each basename is used. The sort used is stable, so sorting first by basename and then by dirname works and causes the files to be grouped according to directory. It currently returns 0 if it was given any paths. ### Examples ``` >_ path sort 10-foo 2-bar 2-bar 10-foo >_ path sort --reverse 10-foo 2-bar 10-foo 2-bar >_ path sort --unique --key=basename $fish_function_path/*.fish # prints a list of all function files fish would use, sorted by name. ``` Combining `path` ---------------- `path` is meant to be easy to combine with itself, other tools and fish. This is why * `path`’s output is automatically split by fish if it goes into a command substitution, so just doing `(path ...)` handles all paths, even those containing newlines, correctly * `path` has `--null-in` to handle null-delimited input (typically automatically detected!), and `--null-out` to pass on null-delimited output Some examples of combining `path`: ``` # Expand all paths in the current directory, leave only executable files, and print their resolved path path filter -zZ -xf -- * | path resolve -z # The same thing, but using find (note -maxdepth needs to come first or find will scream) # (this also depends on your particular version of find) # Note the `-z` is unnecessary for any sensible version of find - if `path` sees a NULL, # it will split on NULL automatically. find . -maxdepth 1 -type f -executable -print0 | path resolve -z set -l paths (path filter -p exec $PATH/fish -Z | path resolve) ```
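One more combined sketch, building only on the subcommands documented above (the `.jpeg` files are hypothetical and assumed not to collide with existing `.jpg` names):

```
# Rename every .jpeg file in the current directory to use the .jpg extension
for f in *.jpeg
    mv -- $f (path change-extension jpg $f)
end
```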
cpp Metaprogramming library (since C++11) Metaprogramming library (since C++11) ===================================== C++ provides metaprogramming facilities, such as type traits, compile-time rational arithmetic, and compile-time integer sequences. ### Type property These type traits define compile-time template-based interfaces to query the properties of types. Attempting to specialize a template defined in the [`<type_traits>`](header/type_traits "cpp/header/type traits") header and described in this section results in undefined behavior. A template defined in the [`<type_traits>`](header/type_traits "cpp/header/type traits") header may be instantiated with an incomplete type unless otherwise specified, notwithstanding the general prohibition against instantiating standard library templates with incomplete types. | Defined in header `[<type\_traits>](header/type_traits "cpp/header/type traits")` | | --- | | Primary type categories | | [is\_void](types/is_void "cpp/types/is void") (C++11) | checks if a type is `void` (class template) | | [is\_null\_pointer](types/is_null_pointer "cpp/types/is null pointer") (C++14) | checks if a type is `[std::nullptr\_t](http://en.cppreference.com/w/cpp/types/nullptr_t)` (class template) | | [is\_integral](types/is_integral "cpp/types/is integral") (C++11) | checks if a type is an integral type (class template) | | [is\_floating\_point](types/is_floating_point "cpp/types/is floating point") (C++11) | checks if a type is a floating-point type (class template) | | [is\_array](types/is_array "cpp/types/is array") (C++11) | checks if a type is an array type (class template) | | [is\_enum](types/is_enum "cpp/types/is enum") (C++11) | checks if a type is an enumeration type (class template) | | [is\_union](types/is_union "cpp/types/is union") (C++11) | checks if a type is an union type (class template) | | [is\_class](types/is_class "cpp/types/is class") (C++11) | checks if a type is a non-union class type (class template) | | [is\_function](types/is_function "cpp/types/is function") (C++11) | checks if a type is a function type (class template) | | [is\_pointer](types/is_pointer "cpp/types/is pointer") (C++11) | checks if a type is a pointer type (class template) | | [is\_lvalue\_reference](types/is_lvalue_reference "cpp/types/is lvalue reference") (C++11) | checks if a type is a *lvalue reference* (class template) | | [is\_rvalue\_reference](types/is_rvalue_reference "cpp/types/is rvalue reference") (C++11) | checks if a type is a *rvalue reference* (class template) | | [is\_member\_object\_pointer](types/is_member_object_pointer "cpp/types/is member object pointer") (C++11) | checks if a type is a pointer to a non-static member object (class template) | | [is\_member\_function\_pointer](types/is_member_function_pointer "cpp/types/is member function pointer") (C++11) | checks if a type is a pointer to a non-static member function (class template) | | Composite type categories | | [is\_fundamental](types/is_fundamental "cpp/types/is fundamental") (C++11) | checks if a type is a fundamental type (class template) | | [is\_arithmetic](types/is_arithmetic "cpp/types/is arithmetic") (C++11) | checks if a type is an arithmetic type (class template) | | [is\_scalar](types/is_scalar "cpp/types/is scalar") (C++11) | checks if a type is a scalar type (class template) | | [is\_object](types/is_object "cpp/types/is object") (C++11) | checks if a type is an object type (class template) | | [is\_compound](types/is_compound "cpp/types/is compound") (C++11) | checks if a 
type is a compound type (class template) | | [is\_reference](types/is_reference "cpp/types/is reference") (C++11) | checks if a type is either a *lvalue reference* or *rvalue reference* (class template) | | [is\_member\_pointer](types/is_member_pointer "cpp/types/is member pointer") (C++11) | checks if a type is a pointer to an non-static member function or object (class template) | | Type properties | | [is\_const](types/is_const "cpp/types/is const") (C++11) | checks if a type is const-qualified (class template) | | [is\_volatile](types/is_volatile "cpp/types/is volatile") (C++11) | checks if a type is volatile-qualified (class template) | | [is\_trivial](types/is_trivial "cpp/types/is trivial") (C++11) | checks if a type is trivial (class template) | | [is\_trivially\_copyable](types/is_trivially_copyable "cpp/types/is trivially copyable") (C++11) | checks if a type is trivially copyable (class template) | | [is\_standard\_layout](types/is_standard_layout "cpp/types/is standard layout") (C++11) | checks if a type is a [standard-layout](language/data_members#Standard_layout "cpp/language/data members") type (class template) | | [is\_pod](types/is_pod "cpp/types/is pod") (C++11)(deprecated in C++20) | checks if a type is a plain-old data (POD) type (class template) | | [is\_literal\_type](types/is_literal_type "cpp/types/is literal type") (C++11)(deprecated in C++17)(removed in C++20) | checks if a type is a literal type (class template) | | [has\_unique\_object\_representations](types/has_unique_object_representations "cpp/types/has unique object representations") (C++17) | checks if every bit in the type's object representation contributes to its value (class template) | | [is\_empty](types/is_empty "cpp/types/is empty") (C++11) | checks if a type is a class (but not union) type and has no non-static data members (class template) | | [is\_polymorphic](types/is_polymorphic "cpp/types/is polymorphic") (C++11) | checks if a type is a polymorphic class type (class template) | | [is\_abstract](types/is_abstract "cpp/types/is abstract") (C++11) | checks if a type is an abstract class type (class template) | | [is\_final](types/is_final "cpp/types/is final") (C++14) | checks if a type is a final class type (class template) | | [is\_aggregate](types/is_aggregate "cpp/types/is aggregate") (C++17) | checks if a type is an aggregate type (class template) | | [is\_signed](types/is_signed "cpp/types/is signed") (C++11) | checks if a type is a signed arithmetic type (class template) | | [is\_unsigned](types/is_unsigned "cpp/types/is unsigned") (C++11) | checks if a type is an unsigned arithmetic type (class template) | | [is\_bounded\_array](types/is_bounded_array "cpp/types/is bounded array") (C++20) | checks if a type is an array type of known bound (class template) | | [is\_unbounded\_array](types/is_unbounded_array "cpp/types/is unbounded array") (C++20) | checks if a type is an array type of unknown bound (class template) | | [is\_scoped\_enum](types/is_scoped_enum "cpp/types/is scoped enum") (C++23) | checks if a type is a scoped enumeration type (class template) | | | | --- | | Supported operations | | [is\_constructibleis\_trivially\_constructibleis\_nothrow\_constructible](types/is_constructible "cpp/types/is constructible") (C++11)(C++11)(C++11) | checks if a type has a constructor for specific arguments (class template) | | [is\_default\_constructibleis\_trivially\_default\_constructibleis\_nothrow\_default\_constructible](types/is_default_constructible "cpp/types/is default 
constructible") (C++11)(C++11)(C++11) | checks if a type has a default constructor (class template) | | [is\_copy\_constructibleis\_trivially\_copy\_constructibleis\_nothrow\_copy\_constructible](types/is_copy_constructible "cpp/types/is copy constructible") (C++11)(C++11)(C++11) | checks if a type has a copy constructor (class template) | | [is\_move\_constructibleis\_trivially\_move\_constructibleis\_nothrow\_move\_constructible](types/is_move_constructible "cpp/types/is move constructible") (C++11)(C++11)(C++11) | checks if a type can be constructed from an rvalue reference (class template) | | [is\_assignableis\_trivially\_assignableis\_nothrow\_assignable](types/is_assignable "cpp/types/is assignable") (C++11)(C++11)(C++11) | checks if a type has a assignment operator for a specific argument (class template) | | [is\_copy\_assignableis\_trivially\_copy\_assignableis\_nothrow\_copy\_assignable](types/is_copy_assignable "cpp/types/is copy assignable") (C++11)(C++11)(C++11) | checks if a type has a copy assignment operator (class template) | | [is\_move\_assignableis\_trivially\_move\_assignableis\_nothrow\_move\_assignable](types/is_move_assignable "cpp/types/is move assignable") (C++11)(C++11)(C++11) | checks if a type has a move assignment operator (class template) | | [is\_destructibleis\_trivially\_destructibleis\_nothrow\_destructible](types/is_destructible "cpp/types/is destructible") (C++11)(C++11)(C++11) | checks if a type has a non-deleted destructor (class template) | | [has\_virtual\_destructor](types/has_virtual_destructor "cpp/types/has virtual destructor") (C++11) | checks if a type has a virtual destructor (class template) | | [is\_swappable\_withis\_swappableis\_nothrow\_swappable\_withis\_nothrow\_swappable](types/is_swappable "cpp/types/is swappable") (C++17)(C++17)(C++17)(C++17) | checks if objects of a type can be swapped with objects of same or different type (class template) | | | | --- | | Property queries | | [alignment\_of](types/alignment_of "cpp/types/alignment of") (C++11) | obtains the type's alignment requirements (class template) | | [rank](types/rank "cpp/types/rank") (C++11) | obtains the number of dimensions of an array type (class template) | | [extent](types/extent "cpp/types/extent") (C++11) | obtains the size of an array type along a specified dimension (class template) | | | | --- | | Type relationships | | [is\_same](types/is_same "cpp/types/is same") (C++11) | checks if two types are the same (class template) | | [is\_base\_of](types/is_base_of "cpp/types/is base of") (C++11) | checks if a type is derived from the other type (class template) | | [is\_convertibleis\_nothrow\_convertible](types/is_convertible "cpp/types/is convertible") (C++11)(C++20) | checks if a type can be converted to the other type (class template) | | [is\_invocableis\_invocable\_ris\_nothrow\_invocableis\_nothrow\_invocable\_r](types/is_invocable "cpp/types/is invocable") (C++17) | checks if a type can be invoked (as if by `[std::invoke](utility/functional/invoke "cpp/utility/functional/invoke")`) with the given argument types (class template) | | [reference\_constructs\_from\_temporary](types/reference_constructs_from_temporary "cpp/types/reference constructs from temporary") (C++23) | checks if a reference is bound to a temporary in direct-initialization (class template) | | [reference\_converts\_from\_temporary](types/reference_converts_from_temporary "cpp/types/reference converts from temporary") (C++23) | checks if a reference is bound to a temporary in 
copy-initialization (class template) | | [is\_layout\_compatible](types/is_layout_compatible "cpp/types/is layout compatible") (C++20) | checks if two types are [*layout-compatible*](language/data_members#Standard_layout "cpp/language/data members") (class template) | | [is\_pointer\_interconvertible\_base\_of](types/is_pointer_interconvertible_base_of "cpp/types/is pointer interconvertible base of") (C++20) | checks if a type is a *pointer-interconvertible* (initial) base of another type (class template) | | [is\_pointer\_interconvertible\_with\_class](types/is_pointer_interconvertible_with_class "cpp/types/is pointer interconvertible with class") (C++20) | checks if objects of a type are pointer-interconvertible with the specified subobject of that type (function template) | | [is\_corresponding\_member](types/is_corresponding_member "cpp/types/is corresponding member") (C++20) | checks if two specified members correspond to each other in the common initial subsequence of two specified types (function template) | #### Operations on traits | Defined in header `[<type\_traits>](header/type_traits "cpp/header/type traits")` | | --- | | [conjunction](types/conjunction "cpp/types/conjunction") (C++17) | variadic logical AND metafunction (class template) | | [disjunction](types/disjunction "cpp/types/disjunction") (C++17) | variadic logical OR metafunction (class template) | | [negation](types/negation "cpp/types/negation") (C++17) | logical NOT metafunction (class template) | #### Base classes | Defined in header `[<type\_traits>](header/type_traits "cpp/header/type traits")` | | --- | | [integral\_constantbool\_constant](types/integral_constant "cpp/types/integral constant") (C++11)(C++17) | compile-time constant of specified type with specified value (class template) | Two specializations of `[std::integral\_constant](types/integral_constant "cpp/types/integral constant")` for the type `bool` are provided: | Defined in header `[<type\_traits>](header/type_traits "cpp/header/type traits")` | | --- | | Specializations | | Type | Definition | | `true_type` | `[std::integral\_constant](http://en.cppreference.com/w/cpp/types/integral_constant)<bool, true>` | | `false_type` | `[std::integral\_constant](http://en.cppreference.com/w/cpp/types/integral_constant)<bool, false>` | ### Type modifications These type traits apply modifications on a template parameter, and declare (sometimes conditionally) the `type` member typedef as the resulting type. Attempting to specialize a template defined in the `<type_traits>` header and described in this section results in undefined behavior, except that `[std::common\_type](types/common_type "cpp/types/common type")` and [`std::basic_common_reference`](types/common_reference "cpp/types/common reference") (since C++20) may be specialized as required in description. A template defined in the `<type_traits>` header may be instantiated with an incomplete type unless otherwise specified, notwithstanding the general prohibition against instantiating standard library templates with incomplete types. 
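The query traits above and the transformation traits listed below are typically combined in `static_assert`s and SFINAE contexts. The following is a minimal, illustrative sketch (the function `twice` is made up for the example), assuming a C++17 compiler:

```
#include <type_traits>

// Query traits (see the tables above) are std::integral_constant
// specializations, so they can be used directly in static_assert.
static_assert(std::is_integral_v<int>);
static_assert(!std::is_pointer_v<double>);
static_assert(std::conjunction_v<std::is_arithmetic<int>,
                                 std::is_signed<int>>);      // variadic logical AND
static_assert(std::is_same_v<std::true_type,
                             std::integral_constant<bool, true>>);

// Transformation traits (listed below) declare a `type` member.
static_assert(std::is_same_v<std::remove_cv_t<const volatile int>, int>);
static_assert(std::is_same_v<std::decay_t<int[3]>, int*>);
static_assert(std::is_same_v<std::common_type_t<int, long>, long>);

// A classic use of std::enable_if: constrain a template to integral types.
template<class T, std::enable_if_t<std::is_integral_v<T>, int> = 0>
constexpr T twice(T x) { return x + x; }

static_assert(twice(21) == 42);

int main() {}
```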
| Defined in header `[<type\_traits>](header/type_traits "cpp/header/type traits")` | | --- | | Const-volatility specifiers | | [remove\_cvremove\_constremove\_volatile](types/remove_cv "cpp/types/remove cv") (C++11)(C++11)(C++11) | removes `const` or/and `volatile` specifiers from the given type (class template) | | [add\_cvadd\_constadd\_volatile](types/add_cv "cpp/types/add cv") (C++11)(C++11)(C++11) | adds `const` or/and `volatile` specifiers to the given type (class template) | | References | | [remove\_reference](types/remove_reference "cpp/types/remove reference") (C++11) | removes a reference from the given type (class template) | | [add\_lvalue\_referenceadd\_rvalue\_reference](types/add_reference "cpp/types/add reference") (C++11)(C++11) | adds a *lvalue* or *rvalue* reference to the given type (class template) | | Pointers | | [remove\_pointer](types/remove_pointer "cpp/types/remove pointer") (C++11) | removes a pointer from the given type (class template) | | [add\_pointer](types/add_pointer "cpp/types/add pointer") (C++11) | adds a pointer to the given type (class template) | | Sign modifiers | | [make\_signed](types/make_signed "cpp/types/make signed") (C++11) | makes the given integral type signed (class template) | | [make\_unsigned](types/make_unsigned "cpp/types/make unsigned") (C++11) | makes the given integral type unsigned (class template) | | Arrays | | [remove\_extent](types/remove_extent "cpp/types/remove extent") (C++11) | removes one extent from the given array type (class template) | | [remove\_all\_extents](types/remove_all_extents "cpp/types/remove all extents") (C++11) | removes all extents from the given array type (class template) | #### Miscellaneous transformations | Defined in header `[<type\_traits>](header/type_traits "cpp/header/type traits")` | | --- | | [aligned\_storage](types/aligned_storage "cpp/types/aligned storage") (C++11)(deprecated in C++23) | defines the type suitable for use as uninitialized storage for types of given size (class template) | | [aligned\_union](types/aligned_union "cpp/types/aligned union") (C++11)(deprecated in C++23) | defines the type suitable for use as uninitialized storage for all given types (class template) | | [decay](types/decay "cpp/types/decay") (C++11) | applies type transformations as when passing a function argument by value (class template) | | [remove\_cvref](types/remove_cvref "cpp/types/remove cvref") (C++20) | combines `[std::remove\_cv](types/remove_cv "cpp/types/remove cv")` and `[std::remove\_reference](types/remove_reference "cpp/types/remove reference")` (class template) | | [enable\_if](types/enable_if "cpp/types/enable if") (C++11) | conditionally [removes](language/sfinae "cpp/language/sfinae") a function overload or template specialization from overload resolution (class template) | | [conditional](types/conditional "cpp/types/conditional") (C++11) | chooses one type or another based on compile-time boolean (class template) | | [common\_type](types/common_type "cpp/types/common type") (C++11) | determines the common type of a group of types (class template) | | [common\_referencebasic\_common\_reference](types/common_reference "cpp/types/common reference") (C++20) | determines the common reference type of a group of types (class template) | | [underlying\_type](types/underlying_type "cpp/types/underlying type") (C++11) | obtains the underlying integer type for a given enumeration type (class template) | | [result\_ofinvoke\_result](types/result_of "cpp/types/result of") (C++11)(removed in 
C++20)(C++17) | deduces the result type of invoking a callable object with a set of arguments (class template) | | [void\_t](types/void_t "cpp/types/void t") (C++17) | void variadic alias template (alias template) | | [type\_identity](types/type_identity "cpp/types/type identity") (C++20) | returns the type argument unchanged (class template) | ### [Compile-time rational arithmetic](numeric/ratio "cpp/numeric/ratio") (since C++11) The header [`<ratio>`](header/ratio "cpp/header/ratio") provides [types and functions for manipulating and storing compile-time ratios](numeric/ratio "cpp/numeric/ratio"). ### Compile-time integer sequences (since C++14) | Defined in header `[<utility>](header/utility "cpp/header/utility")` | | --- | | [integer\_sequence](utility/integer_sequence "cpp/utility/integer sequence") (C++14) | implements compile-time sequence of integers (class template) | cpp C++23 C++23 ===== The next generation of the C++ standard. see: [The current IS schedule for C++23](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p1000r4.pdf). New language features ---------------------- New library features --------------------- ### New headers * [`<expected>`](header/expected "cpp/header/expected") * [`<flat_map>`](header/flat_map "cpp/header/flat map") * [`<flat_set>`](header/flat_set "cpp/header/flat set") * [`<generator>`](header/generator "cpp/header/generator") * [`<mdspan>`](header/mdspan "cpp/header/mdspan") * [`<print>`](header/print "cpp/header/print") * [`<spanstream>`](header/spanstream "cpp/header/spanstream") * [`<stacktrace>`](header/stacktrace "cpp/header/stacktrace") * [`<stdfloat>`](header/stdfloat "cpp/header/stdfloat") Headers that were borrowed from C: * [`<stdatomic.h>`](header/stdatomic.h "cpp/header/stdatomic.h") Defect reports --------------- Compiler support ----------------- Main Article: [C++23 compiler support](compiler_support#C.2B.2B23_features "cpp/compiler support"). 
### C++23 core language features | C++23 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | [Literal suffix](language/integer_literal "cpp/language/integer literal") for (signed) [`size_t`](types/size_t "cpp/types/size t") | [P0330R8](https://wg21.link/P0330R8) | 11 | 13 | | 13.1.6\* | | | | | | | | | | Make `()` more optional for [lambdas](language/lambda "cpp/language/lambda") | [P1102R2](https://wg21.link/P1102R2) | 11 | 13 | | 13.1.6\* | 6.3 | | | | | | | | | [`if consteval`](language/if#Consteval_if "cpp/language/if") | [P1938R3](https://wg21.link/P1938R3) | 12 | 14 | | 14.0.0\* | 6.3 | | | | | | | | | Removing Garbage Collection Support | [P2186R2](https://wg21.link/P2186R2) | 12 | | | | N/A | | | | | | | | | DR: C++ Identifier Syntax using Unicode Standard Annex 31 | [P1949R7](https://wg21.link/P1949R7) | 12 | 14 | | 14.0.0\* | 6.4 | | | | | | | | | DR: Allow Duplicate Attributes | [P2156R1](https://wg21.link/P2156R1) | 11 | 13 | | 13.1.6\* | | | | | | | | | | Narrowing contextual conversions in [`static_assert`](language/static_assert "cpp/language/static assert") and [constexpr if](language/if#Constexpr_if "cpp/language/if") | [P1401R5](https://wg21.link/P1401R5) | 9 | 13 (partial)\*14 | | 14.0.0\* | | | | | | | | | | Trimming whitespaces before line splicing | [P2223R2](https://wg21.link/P2223R2) | Yes | Yes | | Yes | | | | | | | | | | Make declaration order layout mandated | [P1847R4](https://wg21.link/P1847R4) | Yes | Yes | Yes | Yes | | | | | | | | | | Removing mixed wide [string literal concatenation](language/string_literal#Concatenation "cpp/language/string literal") | [P2201R1](https://wg21.link/P2201R1) | Yes | Yes | Yes | Yes | Yes | Yes | | | | | | | | [Deducing this](language/member_functions#Explicit_object_parameter "cpp/language/member functions") | [P0847R7](https://wg21.link/P0847R7) | | | 19.32\*(partial)\* | | 6.3 | | | | | | | | | [`auto(x)` and `auto{x}`](language/explicit_cast "cpp/language/explicit cast") | [P0849R8](https://wg21.link/P0849R8) | 12 | 15 | | | 6.4 | | | | | | | | | Change scope of lambda trailing-return-type | [P2036R3](https://wg21.link/P2036R3) | | | | | | | | | | | | | | [`#elifdef` and `#elifndef`](preprocessor/conditional "cpp/preprocessor/conditional") | [P2334R1](https://wg21.link/P2334R1) | 12 | 13 | | 13.1.6\* | | | | | | | | | | Non-literal variables (and labels and gotos) in [`constexpr`](language/constexpr "cpp/language/constexpr") functions | [P2242R3](https://wg21.link/P2242R3) | 12 | 15 | | | 6.3 | | | | | | | | | Consistent character literal encoding | [P2316R2](https://wg21.link/P2316R2) | Yes | Yes | | Yes | Yes | | | | | | | | | Character sets and encodings | [P2314R4](https://wg21.link/P2314R4) | | Yes | | Yes | | | | | | | | | | Extend init-statement to allow alias-declaration | [P2360R0](https://wg21.link/P2360R0) | 12 | 14 | | 14.0.0\* | | | | | | | | | | Multidimensional subscript operator | [P2128R6](https://wg21.link/P2128R6) | 12 | 15 | | | | | | | | | | | | Attributes on [lambdas](language/lambda "cpp/language/lambda") | [P2173R1](https://wg21.link/P2173R1) | 9 | 13 | | 13.1.6\* | | | | | | | | | | DR: Adjusting the value of feature testing macro `__cpp_concepts` | [P2493R0](https://wg21.link/P2493R0) | 12 | | 19.32\* | | 6.4 | | | | | | | | | [`#warning`](preprocessor/error 
"cpp/preprocessor/error") | [P2437R1](https://wg21.link/P2437R1) | Yes\* | Yes | | Yes | Yes | Yes | | | | | | | | Remove non-encodable wide character literals and multicharacter wide character literals | [P2362R3](https://wg21.link/P2362R3) | 13 | 14 | | | | | | | | | | | | Labels at the end of compound statements | [P2324R2](https://wg21.link/P2324R2) | 13 | | | | | | | | | | | | | Delimited escape sequences | [P2290R3](https://wg21.link/P2290R3) | 13 | 15 | | | | | | | | | | | | Named universal character escapes | [P2071R2](https://wg21.link/P2071R2) | 13 | 15 | | | | | | | | | | | | Relaxing some `constexpr` restrictions | [P2448R2](https://wg21.link/P2448R2) | | | | | | | | | | | | | | Simpler implicit move | [P2266R3](https://wg21.link/P2266R3) | | 13 | | | | | | | | | | | | `static operator()` | [P1169R4](https://wg21.link/P1169R4) | | | | | | | | | | | | | | Requirements for optional extended floating-point types | [P1467R9](https://wg21.link/P1467R9) | | | N/A | | | | | | | | | | | Class template argument deduction from inherited constructors | [P2582R1](https://wg21.link/P2582R1) | | | | | | | | | | | | | | Attribute `[[[assume](https://en.cppreference.com/mwiki/index.php?title=cpp/language/attributes/assume&action=edit&redlink=1 "cpp/language/attributes/assume (page does not exist)")]]` | [P1774R8](https://wg21.link/P1774R8) | | | | | | | | | | | | | | Support for UTF-8 as a portable source file encoding | [P2295R6](https://wg21.link/P2295R6) | | 15\* | | | | | | | | | | | | DR: De-deprecating volatile bitwise compound assignment operations | [P2327R1](https://wg21.link/P2327R1) | 13 | 15 | | | | | | | | | | | | DR: Relax requirements on `wchar_t` to match existing practices | [P2460R2](https://wg21.link/P2460R2) | Yes | Yes | | | | | | | | | | | | DR: Using unknown pointers and references in constant expressions | [P2280R4](https://wg21.link/P2280R4) | | | | | | | | | | | | | | DR: The Equality Operator You Are Looking For | [P2468R2](https://wg21.link/P2468R2) | | | | | | | | | | | | | | DR: `char8_t` Compatibility and Portability Fix | [P2513R3](https://wg21.link/P2513R3) | | | | | | | | | | | | | | C++23 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++(ex Portland Group/PGI) | Nvidia nvcc | ### C++23 library features | C++23 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | [Stacktrace library](error#Stacktrace "cpp/error") | [P0881R7](https://wg21.link/P0881R7)[P2301R1](https://wg21.link/P2301R1) | 12 (partial)\* | | 19.34\* | | | | | | [`<stdatomic.h>`](header/stdatomic.h "cpp/header/stdatomic.h") | [P0943R6](https://wg21.link/P0943R6) | 12 | 15 | 19.31\* | | | | | | [`std::is_scoped_enum`](types/is_scoped_enum "cpp/types/is scoped enum") | [P1048R1](https://wg21.link/P1048R1) | 11 | 12 | 19.30\* | 13.0.0\* | | | | | [`basic_string::contains()`](string/basic_string/contains "cpp/string/basic string/contains"), [`basic_string_view::contains()`](string/basic_string_view/contains "cpp/string/basic string view/contains") | [P1679R3](https://wg21.link/P1679R3) | 11 | 12 | 19.30\* | 13.0.0\* | | | | | [`std::to_underlying`](utility/to_underlying "cpp/utility/to underlying") | [P1682R3](https://wg21.link/P1682R3) | 11 | 13 | 19.30\* | 13.1.6\* | | | | | Relaxing requirements for 
`[time\_point<>::clock](chrono/time_point "cpp/chrono/time point")` | [P2212R2](https://wg21.link/P2212R2) | N/A | | N/A | | | | | | DR: `[std::visit()](utility/variant/visit "cpp/utility/variant/visit")` for classes derived from `[std::variant](utility/variant "cpp/utility/variant")` | [P2162R2](https://wg21.link/P2162R2) | 11.3 | 13 | 19.20\*\*19.30\* | 13.1.6\* | | | | | DR: Conditionally borrowed ranges | [P2017R1](https://wg21.link/P2017R1) | 11 | | 19.30\* | | | | | | DR: Repairing [input range adaptors](ranges#Views "cpp/ranges") and `[std::counted\_iterator](iterator/counted_iterator "cpp/iterator/counted iterator")` | [P2259R1](https://wg21.link/P2259R1) | 12 | | 19.30\*(partial)\*19.31\* | | | | | | Providing size feedback in the Allocator interface | [P0401R6](https://wg21.link/P0401R6) | | 15 | 19.30\* | | | | | | [`<spanstream>`](header/spanstream "cpp/header/spanstream"): string-stream with [`std::span`](container/span "cpp/container/span")-based buffer | [P0448R4](https://wg21.link/P0448R4) | 12 | | 19.31\* | | | | | | [`std::out_ptr()`](memory/out_ptr_t/out_ptr "cpp/memory/out ptr t/out ptr"), [`std::inout_ptr()`](memory/inout_ptr_t/inout_ptr "cpp/memory/inout ptr t/inout ptr") | [P1132R8](https://wg21.link/P1132R8) | | | 19.30\* | | | | | | `constexpr` [`type_info::operator==()`](types/type_info/operator_cmp "cpp/types/type info/operator cmp") | [P1328R1](https://wg21.link/P1328R1) | 12 | | 19.33\* | | | | | | Iterator pair constructors for [`std::stack`](container/stack/stack "cpp/container/stack/stack") and [`std::queue`](container/queue/queue "cpp/container/queue/queue") | [P1425R4](https://wg21.link/P1425R4) | 12 | 14 | 19.31\* | | | | | | Non-deduction context for allocators in container deduction guides | [P1518R2](https://wg21.link/P1518R2) | 12 | 13 | 19.31\* | 13.1.6\* | | | | | [`ranges::starts_with()`](algorithm/ranges/starts_with "cpp/algorithm/ranges/starts with") and [`ranges::ends_with()`](algorithm/ranges/ends_with "cpp/algorithm/ranges/ends with") | [P1659R3](https://wg21.link/P1659R3) | | | 19.31\* | | | | | | Prohibiting `[std::basic\_string](string/basic_string "cpp/string/basic string")` and `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` construction from [`nullptr`](language/nullptr "cpp/language/nullptr") | [P2166R1](https://wg21.link/P2166R1) | 12 | 13 | 19.30\* | 13.1.6\* | | | | | [`std::invoke_r()`](utility/functional/invoke "cpp/utility/functional/invoke") | [P2136R3](https://wg21.link/P2136R3) | 12 | | 19.31\* | | | | | | Range [constructor](string/basic_string_view/basic_string_view "cpp/string/basic string view/basic string view") for `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` | [P1989R2](https://wg21.link/P1989R2) | 11 | 14 | 19.30\* | | | | | | Default template arguments for [`pair`](utility/pair "cpp/utility/pair")'s forwarding [constructor](utility/pair/pair "cpp/utility/pair/pair") | [P1951R1](https://wg21.link/P1951R1) | | 14 | 19.30\* | | | | | | Remove Garbage Collection and Reachability-Based Leak Detection ([library support](memory#Garbage_collector_support "cpp/memory")) | [P2186R2](https://wg21.link/P2186R2) | 12 | 14 | 19.30\* | | | | | | DR: [`views::join`](ranges/join_view "cpp/ranges/join view") should join all views of ranges | [P2328R1](https://wg21.link/P2328R1) | 11.2 | | 19.30\* | | | | | | DR: [`view`](ranges/view "cpp/ranges/view") does not require [`default_initializable`](concepts/default_initializable "cpp/concepts/default 
initializable") | [P2325R3](https://wg21.link/P2325R3) | 11.3 | | 19.30\* | | | | | | DR: Range adaptor objects bind arguments by value | [P2281R1](https://wg21.link/P2281R1) | 11 | | 19.29 (16.10)\*(partial)\*19.31\* | | | | | | DR: [`constexpr`](language/constexpr "cpp/language/constexpr") for `[std::optional](utility/optional "cpp/utility/optional")` and `[std::variant](utility/variant "cpp/utility/variant")` | [P2231R1](https://wg21.link/P2231R1) | 11.3 (partial)\*12 | 13 (partial)\* | 19.31\* | 13.1.6\* (partial). | | | | | DR: [`std::format()`](utility/format/format "cpp/utility/format/format") improvements | [P2216R3](https://wg21.link/P2216R3) | | 14 (partial)\* 15 | 19.32\* | | | | | | DR: [`views::lazy_split`](ranges/lazy_split_view "cpp/ranges/lazy split view") and redesigned [`views::split`](ranges/split_view "cpp/ranges/split view") | [P2210R2](https://wg21.link/P2210R2) | 12 | | 19.31\* | | | | | | zip: `views::zip`, `views::zip_transform`, `views::adjacent`, and `views::adjacent_transform` | [P2321R2](https://wg21.link/P2321R2) | 13 (partial)\* | 15 (partial)\* | 19.33\* (partial)\* | | | | | | Heterogeneous erasure overloads for associative containers | [P2077R3](https://wg21.link/P2077R3) | | | 19.32\* | | | | | | [`std::byteswap()`](numeric/byteswap "cpp/numeric/byteswap") | [P1272R4](https://wg21.link/P1272R4) | 12 | 14 | 19.31\* | | | | | | [Printing](io/basic_ostream/operator_ltlt "cpp/io/basic ostream/operator ltlt") `volatile T*` | [P1147R1](https://wg21.link/P1147R1) | 11.3 | 14 | 19.31\* | | | | | | [`basic_string::resize_and_overwrite()`](string/basic_string/resize_and_overwrite "cpp/string/basic string/resize and overwrite") | [P1072R10](https://wg21.link/P1072R10) | 12 | 14 | 19.31\* | | | | | | Monadic operations for `[std::optional](utility/optional "cpp/utility/optional")` | [P0798R8](https://wg21.link/P0798R8) | 12 | 14 | 19.32\* | | | | | | [`std::move_only_function`](utility/functional/move_only_function "cpp/utility/functional/move only function") | [P0288R9](https://wg21.link/P0288R9) | 12 | | 19.32\* | | | | | | Add a conditional noexcept specification to `[std::exchange](utility/exchange "cpp/utility/exchange")` | [P2401R0](https://wg21.link/P2401R0) | 12 | 14 | 19.25\* | | | | | | Require [`span`](container/span "cpp/container/span") & [`basic_string_view`](string/basic_string_view "cpp/string/basic string view") to be [TriviallyCopyable](named_req/triviallycopyable "cpp/named req/TriviallyCopyable") | [P2251R1](https://wg21.link/P2251R1) | Yes | Yes | Yes | Yes | | | | | Clarifying the status of the “C headers” | [P2340R1](https://wg21.link/P2340R1) | Yes | Yes | Yes | Yes | | | | | DR: Fix `views::istream` | [P2432R1](https://wg21.link/P2432R1) | 12 | | 19.31\* | | | | | | DR: Add support for non-const-formattable types to `[std::format](utility/format/format "cpp/utility/format/format")` | [P2418R2](https://wg21.link/P2418R2) | | 15 | 19.32\* | | | | | | DR: [`view`](ranges/view "cpp/ranges/view") with ownership | [P2415R2](https://wg21.link/P2415R2) | 12 | 14 | 19.31\* | | | | | | DR: Fixing locale handling in chrono formatters | [P2372R3](https://wg21.link/P2372R3) | | | 19.31\* | | | | | | DR: Cleaning up integer-class types | [P2393R1](https://wg21.link/P2393R1) | | | 19.32\* | | | | | | [`<expected>`](header/expected "cpp/header/expected") | [P0323R12](https://wg21.link/P0323R12)[P2549R1](https://wg21.link/P2549R1) | 12 | | 19.33\* | | | | | | constexpr for [`<cmath>`](header/cmath "cpp/header/cmath") and [`<cstdlib>`](header/cstdlib 
"cpp/header/cstdlib") | [P0533R9](https://wg21.link/P0533R9) | 4.6 (partial)\* | | | | | | | | [`std::unreachable()`](utility/unreachable "cpp/utility/unreachable") | [P0627R6](https://wg21.link/P0627R6) | 12 | 15 | 19.32\* | | | | | | Deprecating `[std::aligned\_storage](types/aligned_storage "cpp/types/aligned storage")` and `[std::aligned\_union](types/aligned_union "cpp/types/aligned union")` | [P1413R3](https://wg21.link/P1413R3) | | | 19.33\* | | | | | | [`std::reference_constructs_from_temporary`](types/reference_constructs_from_temporary "cpp/types/reference constructs from temporary") & [`std::reference_converts_from_temporary`](types/reference_converts_from_temporary "cpp/types/reference converts from temporary") | [P2255R2](https://wg21.link/P2255R2) | 13 (partial)\* | | | | | | | | constexpr `[std::unique\_ptr](memory/unique_ptr "cpp/memory/unique ptr")` | [P2273R3](https://wg21.link/P2273R3) | 12 | 16 (partial)\* | 19.33\* | | | | | | [`ranges::to()`](ranges/to "cpp/ranges/to") | [P1206R7](https://wg21.link/P1206R7) | | | | | | | | | Pipe support for user-defined range adaptors | [P2387R3](https://wg21.link/P2387R3) | | | 19.34\* | | | | | | [`ranges::iota()`](algorithm/ranges/iota "cpp/algorithm/ranges/iota"), [`ranges::shift_left()`](algorithm/ranges/shift "cpp/algorithm/ranges/shift"), and [`ranges::shift_right()`](algorithm/ranges/shift "cpp/algorithm/ranges/shift") | [P2440R1](https://wg21.link/P2440R1) | | | 19.34\* | | | | | | `views::join_with` | [P2441R2](https://wg21.link/P2441R2) | | | 19.34\* | | | | | | `views::chunk` and `views::slide` | [P2442R1](https://wg21.link/P2442R1) | | | 19.33\* | | | | | | `views::chunk_by` | [P2443R1](https://wg21.link/P2443R1) | | | 19.33\* | | | | | | `std::mdspan`: a non-owning multidimensional array reference | [P0009R18](https://wg21.link/P0009R18)[P2599R2](https://wg21.link/P2599R2)[P2604R0](https://wg21.link/P2604R0)[P2613R1](https://wg21.link/P2613R1) | | | | | | | | | [`<flat_map>`](header/flat_map "cpp/header/flat map") | [P0429R9](https://wg21.link/P0429R9) | | | | | | | | | [`<flat_set>`](header/flat_set "cpp/header/flat set") | [P1222R4](https://wg21.link/P1222R4) | | | | | | | | | [`ranges::find_last()`](algorithm/ranges/find_last "cpp/algorithm/ranges/find last"), [`ranges::find_last_if()`](algorithm/ranges/find_last "cpp/algorithm/ranges/find last"), and [`ranges::find_last_if_not()`](algorithm/ranges/find_last "cpp/algorithm/ranges/find last") | [P1223R5](https://wg21.link/P1223R5) | | | | | | | | | `views::stride` | [P1899R3](https://wg21.link/P1899R3) | | | 19.34\* | | | | | | Formatted output library | [P2093R14](https://wg21.link/P2093R14) | | | | | | | | | Compatibility between `[std::tuple](utility/tuple "cpp/utility/tuple")` and tuple-like objects | [P2165R4](https://wg21.link/P2165R4) | | 2.9 (partial)\* | | Partial\* | | | | | Rectifying constant iterators, sentinels, and ranges | [P2278R4](https://wg21.link/P2278R4) | | | | | | | | | Formatting Ranges | [P2286R8](https://wg21.link/P2286R8) | | | | | | | | | `constexpr` for integral overloads of [`std::to_chars()`](utility/to_chars "cpp/utility/to chars") and [`std::from_chars()`](utility/from_chars "cpp/utility/from chars"). 
| [P2291R3](https://wg21.link/P2291R3) | | | 19.34\* | | | | | | [`ranges::contains()`](algorithm/ranges/contains "cpp/algorithm/ranges/contains") and [`ranges::contains_subrange()`](algorithm/ranges/contains "cpp/algorithm/ranges/contains") | [P2302R4](https://wg21.link/P2302R4) | | | 19.34\* | | | | | | Ranges fold algorithms | [P2322R6](https://wg21.link/P2322R6) | | | | | | | | | `views::cartesian_product` | [P2374R4](https://wg21.link/P2374R4)[P2540R1](https://wg21.link/P2540R1) | | | | | | | | | Adding move-only types support for comparison concepts | [P2404R3](https://wg21.link/P2404R3) | | | | | | | | | Ranges iterators as inputs to non-ranges algorithms | [P2408R5](https://wg21.link/P2408R5) | | | 19.34\* | | | | | | constexpr `[std::bitset](utility/bitset "cpp/utility/bitset")` | [P2417R2](https://wg21.link/P2417R2) | | 16 | 19.34\* | | | | | | [`basic_string::substr()`](string/basic_string/substr "cpp/string/basic string/substr") `&&` | [P2438R2](https://wg21.link/P2438R2) | | | 19.34\* | | | | | | `views::as_rvalue` | [P2446R2](https://wg21.link/P2446R2) | | | 19.34\* | | | | | | Standard Library Modules | [P2465R3](https://wg21.link/P2465R3) | | | | | | | | | [`std::forward_like()`](utility/forward_like "cpp/utility/forward like") | [P2445R1](https://wg21.link/P2445R1) | | 16 | 19.34\* | | | | | | Support exclusive mode for `[std::fstream](io/basic_fstream "cpp/io/basic fstream")` | [P2467R1](https://wg21.link/P2467R1) | 12 | | | | | | | | `views::repeat` | [P2474R2](https://wg21.link/P2474R2) | | | | | | | | | Relaxing range adaptors to allow for move-only types | [P2494R2](https://wg21.link/P2494R2) | | | 19.34\* | | | | | | `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` range [constructor](string/basic_string_view/basic_string_view "cpp/string/basic string view/basic string view") should be explicit | [P2499R0](https://wg21.link/P2499R0) | 12.2 | 16 | 19.34\* | | | | | | `std::generator`: synchronous coroutine generator for ranges | [P2502R2](https://wg21.link/P2502R2) | | | | | | | | | `std::basic_format_string` | [P2508R1](https://wg21.link/P2508R1) | | 15 | | | | | | | Add a conditional noexcept specification to `[std::apply](utility/apply "cpp/utility/apply")` | [P2517R0](https://wg21.link/P2517R0) | 10 | | 19.34\* | | | | | | Improve default container formatting | [P2585R1](https://wg21.link/P2585R1) | | | | | | | | | Explicit lifetime management | [P2590R2](https://wg21.link/P2590R2) | | | | | | | | | Clarify handling of encodings in localized formatting of chrono types | [P2419R2](https://wg21.link/P2419R2) | | | 19.34\*\* | | | | | | `[std::move\_iterator](iterator/move_iterator "cpp/iterator/move iterator")` should not always be [`input_iterator`](iterator/input_iterator "cpp/iterator/input iterator") | [P2520R0](https://wg21.link/P2520R0) | | | 19.34\*\* | | | | | | Deduction guides update for [deducing `this`](language/member_functions#Explicit_object_parameter "cpp/language/member functions") | [LWG3617](https://cplusplus.github.io/LWG/issue3617) | | | 19.34\* | | | | | | Deduction guides update for `static operator()` | [P1169R4](https://wg21.link/P1169R4) | | | | | | | | | Standard names and library support for extended floating-point types | [P1467R9](https://wg21.link/P1467R9) | | | | | | | | | C++23 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | *\** - hover over a cell with the version 
number to see notes.
cpp Filesystem library (since C++17) Filesystem library (since C++17) ================================ The Filesystem library provides facilities for performing operations on file systems and their components, such as paths, regular files, and directories. The filesystem library was originally developed as [boost.filesystem](http://www.boost.org/doc/libs/release/libs/filesystem/doc/index.htm), was published as [the technical specification ISO/IEC TS 18822:2015](https://en.cppreference.com/w/cpp/experimental/fs "cpp/experimental/fs"), and finally merged to ISO C++ as of C++17. The boost implementation is currently available on more compilers and platforms than the C++17 library. The filesystem library facilities may be unavailable if a hierarchical file system is not accessible to the implementation, or if it does not provide the necessary capabilities. Some features may not be available if they are not supported by the underlying file system (e.g. the FAT filesystem lacks symbolic links and forbids multiple hardlinks). In those cases, errors must be reported. The behavior is [undefined](language/ub "cpp/language/ub") if the calls to functions in this library introduce a *file system race*, that is, when multiple threads, processes, or computers interleave access and modification to the same object in a file system. #### Library-wide definitions * *file*: a file system object that holds data, can be written to, read from, or both. Files have names, attributes, one of which is file type: + *directory*: a file that acts as a container of directory entries, which identify other files (some of which may be other, nested directories). When discussing a particular file, the directory in which it appears as an entry is its *parent directory*. The parent directory can be represented by the relative pathname `".."`. + *hard link*: a directory entry that associates a name with an existing file. If multiple hard links are supported, the file is removed after the last hard link to it is removed. + *symbolic link*: a directory entry that associates a name with a path, which may or may not exist. + other special file types: *block*, *character*, *fifo*, *socket*. * *file name*: a string of characters that names a file. Permissible characters, case sensitivity, maximum length, and the disallowed names are implementation-defined. Names `"."` (dot) and `".."` (dot-dot) have special meaning at library level. * *path*: sequence of elements that identifies a file. It begins with an optional root-name (e.g. `"C:"` or `"//server"` on Windows), followed by an optional root-directory (e.g. `"/"` on Unix), followed by a sequence of zero or more file names (all but the last of which have to be directories or links to directories). The native format (e.g. which characters are used as separators) and character encoding of the string representation of a path (the *pathname*) is implementation-defined; this library provides portable representation of paths. + *absolute path*: a path that unambiguously identifies the location of a file. + *canonical path*: an absolute path that includes no symlinks, `"."` or `".."` elements. + *relative path*: a path that identifies the location of a file relative to some location on the file system. The special path names `"."` (dot, "current directory") and `".."` (dot-dot, "parent directory") are relative paths. 
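As a rough illustration of the path terminology above, the sketch below decomposes a made-up relative path with `std::filesystem::path`; the values shown in comments assume a POSIX-style native format.

```
#include <filesystem>
#include <iostream>

int main() {
    namespace fs = std::filesystem;

    // A made-up relative path containing "." and ".." elements.
    fs::path p = "docs/../src/./main.cpp";

    std::cout << "is_absolute: " << p.is_absolute() << '\n';       // 0 (relative path)
    std::cout << "filename:    " << p.filename() << '\n';          // "main.cpp"
    std::cout << "parent path: " << p.parent_path() << '\n';       // "docs/../src/."

    // lexically_normal() removes "." and collapses ".." purely textually.
    std::cout << "normalized:  " << p.lexically_normal() << '\n';  // "src/main.cpp"

    // fs::absolute() composes an absolute path from the current directory;
    // fs::canonical() additionally resolves symlinks and requires the file to exist.
    std::cout << "absolute:    " << fs::absolute(p) << '\n';
}
```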
| | | --- | | Classes | | Defined in header `[<filesystem>](header/filesystem "cpp/header/filesystem")` | | Defined in namespace `std::filesystem` | | [path](filesystem/path "cpp/filesystem/path") (C++17) | represents a path (class) | | [filesystem\_error](filesystem/filesystem_error "cpp/filesystem/filesystem error") (C++17) | an exception thrown on file system errors (class) | | [directory\_entry](filesystem/directory_entry "cpp/filesystem/directory entry") (C++17) | a directory entry (class) | | [directory\_iterator](filesystem/directory_iterator "cpp/filesystem/directory iterator") (C++17) | an iterator to the contents of the directory (class) | | [recursive\_directory\_iterator](filesystem/recursive_directory_iterator "cpp/filesystem/recursive directory iterator") (C++17) | an iterator to the contents of a directory and its subdirectories (class) | | [file\_status](filesystem/file_status "cpp/filesystem/file status") (C++17) | represents file type and permissions (class) | | [space\_info](filesystem/space_info "cpp/filesystem/space info") (C++17) | information about free and available space on the filesystem (class) | | [file\_type](filesystem/file_type "cpp/filesystem/file type") (C++17) | the type of a file (enum) | | [perms](filesystem/perms "cpp/filesystem/perms") (C++17) | identifies file system permissions (enum) | | [perm\_options](filesystem/perm_options "cpp/filesystem/perm options") (C++17) | specifies semantics of permissions operations (enum) | | [copy\_options](filesystem/copy_options "cpp/filesystem/copy options") (C++17) | specifies semantics of copy operations (enum) | | [directory\_options](filesystem/directory_options "cpp/filesystem/directory options") (C++17) | options for iterating directory contents (enum) | | [file\_time\_type](filesystem/file_time_type "cpp/filesystem/file time type") (C++17) | represents file time values (typedef) | | Non-member functions | | Defined in header `[<filesystem>](header/filesystem "cpp/header/filesystem")` | | Defined in namespace `std::filesystem` | | [absolute](filesystem/absolute "cpp/filesystem/absolute") (C++17) | composes an absolute path (function) | | [canonicalweakly\_canonical](filesystem/canonical "cpp/filesystem/canonical") (C++17) | composes a canonical path (function) | | [relativeproximate](filesystem/relative "cpp/filesystem/relative") (C++17) | composes a relative path (function) | | [copy](filesystem/copy "cpp/filesystem/copy") (C++17) | copies files or directories (function) | | [copy\_file](filesystem/copy_file "cpp/filesystem/copy file") (C++17) | copies file contents (function) | | [copy\_symlink](filesystem/copy_symlink "cpp/filesystem/copy symlink") (C++17) | copies a symbolic link (function) | | [create\_directorycreate\_directories](filesystem/create_directory "cpp/filesystem/create directory") (C++17)(C++17) | creates new directory (function) | | [create\_hard\_link](filesystem/create_hard_link "cpp/filesystem/create hard link") (C++17) | creates a hard link (function) | | [create\_symlinkcreate\_directory\_symlink](filesystem/create_symlink "cpp/filesystem/create symlink") (C++17)(C++17) | creates a symbolic link (function) | | [current\_path](filesystem/current_path "cpp/filesystem/current path") (C++17) | returns or sets the current working directory (function) | | [exists](filesystem/exists "cpp/filesystem/exists") (C++17) | checks whether path refers to existing file system object (function) | | [equivalent](filesystem/equivalent "cpp/filesystem/equivalent") (C++17) | checks whether two paths refer 
to the same file system object (function) | | [file\_size](filesystem/file_size "cpp/filesystem/file size") (C++17) | returns the size of a file (function) | | [hard\_link\_count](filesystem/hard_link_count "cpp/filesystem/hard link count") (C++17) | returns the number of hard links referring to the specific file (function) | | [last\_write\_time](filesystem/last_write_time "cpp/filesystem/last write time") (C++17) | gets or sets the time of the last data modification (function) | | [permissions](filesystem/permissions "cpp/filesystem/permissions") (C++17) | modifies file access permissions (function) | | [read\_symlink](filesystem/read_symlink "cpp/filesystem/read symlink") (C++17) | obtains the target of a symbolic link (function) | | [removeremove\_all](filesystem/remove "cpp/filesystem/remove") (C++17)(C++17) | removes a file or empty directoryremoves a file or directory and all its contents, recursively (function) | | [rename](filesystem/rename "cpp/filesystem/rename") (C++17) | moves or renames a file or directory (function) | | [resize\_file](filesystem/resize_file "cpp/filesystem/resize file") (C++17) | changes the size of a regular file by truncation or zero-fill (function) | | [space](filesystem/space "cpp/filesystem/space") (C++17) | determines available free space on the file system (function) | | [statussymlink\_status](filesystem/status "cpp/filesystem/status") (C++17)(C++17) | determines file attributesdetermines file attributes, checking the symlink target (function) | | [temp\_directory\_path](filesystem/temp_directory_path "cpp/filesystem/temp directory path") (C++17) | returns a directory suitable for temporary files (function) | | File types | | [is\_block\_file](filesystem/is_block_file "cpp/filesystem/is block file") (C++17) | checks whether the given path refers to block device (function) | | [is\_character\_file](filesystem/is_character_file "cpp/filesystem/is character file") (C++17) | checks whether the given path refers to a character device (function) | | [is\_directory](filesystem/is_directory "cpp/filesystem/is directory") (C++17) | checks whether the given path refers to a directory (function) | | [is\_empty](filesystem/is_empty "cpp/filesystem/is empty") (C++17) | checks whether the given path refers to an empty file or directory (function) | | [is\_fifo](filesystem/is_fifo "cpp/filesystem/is fifo") (C++17) | checks whether the given path refers to a named pipe (function) | | [is\_other](filesystem/is_other "cpp/filesystem/is other") (C++17) | checks whether the argument refers to an *other* file (function) | | [is\_regular\_file](filesystem/is_regular_file "cpp/filesystem/is regular file") (C++17) | checks whether the argument refers to a regular file (function) | | [is\_socket](filesystem/is_socket "cpp/filesystem/is socket") (C++17) | checks whether the argument refers to a named IPC socket (function) | | [is\_symlink](filesystem/is_symlink "cpp/filesystem/is symlink") (C++17) | checks whether the argument refers to a symbolic link (function) | | [status\_known](filesystem/status_known "cpp/filesystem/status known") (C++17) | checks whether file status is known (function) | ### Notes Using this library may require additional compiler/linker options. GNU implementation prior to 9.1 requires linking with `-lstdc++fs` and LLVM implementation prior to LLVM 9.0 requires linking with `-lc++fs`. 
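A minimal sketch combining a few of the non-member functions listed above (create, query, iterate, remove); the directory and file names are made up, and error handling is omitted for brevity (most of these functions also provide `std::error_code` overloads).

```
#include <filesystem>
#include <fstream>
#include <iostream>

int main() {
    namespace fs = std::filesystem;

    // Create a directory under the system temporary directory (made-up name).
    fs::path dir = fs::temp_directory_path() / "fs_demo";
    fs::create_directories(dir);

    // Create a small regular file inside it.
    fs::path file = dir / "example.txt";
    std::ofstream(file) << "hello filesystem\n";

    std::cout << "size: " << fs::file_size(file) << " bytes\n";
    std::cout << "is regular file: " << fs::is_regular_file(file) << '\n';

    // Iterate over the directory contents.
    for (const fs::directory_entry& entry : fs::directory_iterator(dir))
        std::cout << entry.path() << '\n';

    // Remove the directory and everything in it.
    fs::remove_all(dir);
}
```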
| [Feature-test](utility/feature_test "cpp/utility/feature test") macro | Value | Std | | --- | --- | --- | | [`__cpp_lib_filesystem`](feature_test#Library_features "cpp/feature test") | `201703L` | (C++17) | ### See also | | | --- | | [C++ documentation](https://en.cppreference.com/w/cpp/experimental/fs "cpp/experimental/fs") for File System TS | cpp Numerics library Numerics library ================ The C++ numerics library includes common mathematical functions and types, as well as optimized numeric arrays and support for random number generation. Mathematical functions and types --------------------------------- ### [Common mathematical functions](numeric/math "cpp/numeric/math") The header [`<cmath>`](header/cmath "cpp/header/cmath") provides [standard C library mathematical functions](numeric/math "cpp/numeric/math") such as `[std::fabs](numeric/math/fabs "cpp/numeric/math/fabs")`, `[std::sqrt](numeric/math/sqrt "cpp/numeric/math/sqrt")`, and `[std::sin](numeric/math/sin "cpp/numeric/math/sin")`. ### [Mathematical special functions](numeric/special_functions "cpp/numeric/special functions") (since C++17) The header [`<cmath>`](header/cmath "cpp/header/cmath") also provides several mathematical special functions such as `[std::beta](numeric/special_functions/beta "cpp/numeric/special functions/beta")`, `[std::hermite](numeric/special_functions/hermite "cpp/numeric/special functions/hermite")`, and `[std::cyl\_bessel\_i](numeric/special_functions/cyl_bessel_i "cpp/numeric/special functions/cyl bessel i")`. ### [Mathematical constants](numeric/constants "cpp/numeric/constants") (since C++20) The header [`<numbers>`](header/numbers "cpp/header/numbers") provides several mathematical constants, such as `[std::numbers::pi](numeric/constants "cpp/numeric/constants")` or `[std::numbers::sqrt2](numeric/constants "cpp/numeric/constants")`. 
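A short, illustrative sketch of the `<cmath>` functions and `<numbers>` constants mentioned above (a C++20 compiler is assumed for `<numbers>`):

```
#include <cmath>
#include <iostream>
#include <numbers>   // C++20 mathematical constants

int main() {
    // Common mathematical functions from <cmath>.
    std::cout << std::sqrt(2.0) << '\n';               // 1.41421
    std::cout << std::fabs(-3.5) << '\n';              // 3.5
    std::cout << std::sin(std::numbers::pi) << '\n';   // ~0 (subject to rounding)

    // C++20 constants from <numbers>.
    std::cout << std::numbers::pi << '\n';              // 3.14159
    std::cout << std::numbers::sqrt2 << '\n';           // 1.41421
}
```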
### Complex number arithmetic | Defined in header `[<complex>](header/complex "cpp/header/complex")` | | --- | | [complex](numeric/complex "cpp/numeric/complex") | a complex number type (class template) | ### Numeric arrays | Defined in header `[<valarray>](header/valarray "cpp/header/valarray")` | | --- | | [valarray](numeric/valarray "cpp/numeric/valarray") | numeric arrays, array masks and array slices (class template) | Numeric algorithms ------------------- The header [`<numeric>`](header/numeric "cpp/header/numeric") provides numeric algorithms below: ### Factor operations | Defined in header `[<numeric>](header/numeric "cpp/header/numeric")` | | --- | | [gcd](numeric/gcd "cpp/numeric/gcd") (C++17) | `constexpr` function template returning the greatest common divisor of two integers (function template) | | [lcm](numeric/lcm "cpp/numeric/lcm") (C++17) | `constexpr` function template returning the least common multiple of two integers (function template) | ### Interpolation operations | Defined in header `[<numeric>](header/numeric "cpp/header/numeric")` | | --- | | [midpoint](numeric/midpoint "cpp/numeric/midpoint") (C++20) | midpoint between two numbers or pointers (function template) | | Defined in header `[<cmath>](header/cmath "cpp/header/cmath")` | | [lerp](numeric/lerp "cpp/numeric/lerp") (C++20) | linear interpolation function (function) | ### Numeric operations | Defined in header `[<numeric>](header/numeric "cpp/header/numeric")` | | --- | | [iota](algorithm/iota "cpp/algorithm/iota") (C++11) | fills a range with successive increments of the starting value (function template) | | [ranges::iota](algorithm/ranges/iota "cpp/algorithm/ranges/iota") (C++23) | fills a range with successive increments of the starting value (niebloid) | | [accumulate](algorithm/accumulate "cpp/algorithm/accumulate") | sums up a range of elements (function template) | | [reduce](algorithm/reduce "cpp/algorithm/reduce") (C++17) | similar to `[std::accumulate](algorithm/accumulate "cpp/algorithm/accumulate")`, except out of order (function template) | | [transform\_reduce](algorithm/transform_reduce "cpp/algorithm/transform reduce") (C++17) | applies an invocable, then reduces out of order (function template) | | [inner\_product](algorithm/inner_product "cpp/algorithm/inner product") | computes the inner product of two ranges of elements (function template) | | [adjacent\_difference](algorithm/adjacent_difference "cpp/algorithm/adjacent difference") | computes the differences between adjacent elements in a range (function template) | | [partial\_sum](algorithm/partial_sum "cpp/algorithm/partial sum") | computes the partial sum of a range of elements (function template) | | [inclusive\_scan](algorithm/inclusive_scan "cpp/algorithm/inclusive scan") (C++17) | similar to `[std::partial\_sum](algorithm/partial_sum "cpp/algorithm/partial sum")`, includes the ith input element in the ith sum (function template) | | [exclusive\_scan](algorithm/exclusive_scan "cpp/algorithm/exclusive scan") (C++17) | similar to `[std::partial\_sum](algorithm/partial_sum "cpp/algorithm/partial sum")`, excludes the ith input element from the ith sum (function template) | | [transform\_inclusive\_scan](algorithm/transform_inclusive_scan "cpp/algorithm/transform inclusive scan") (C++17) | applies an invocable, then calculates inclusive scan (function template) | | [transform\_exclusive\_scan](algorithm/transform_exclusive_scan "cpp/algorithm/transform exclusive scan") (C++17) | applies an invocable, then calculates exclusive scan 
(function template) | Miscellaneous -------------- ### [Pseudo-random number generation](numeric/random "cpp/numeric/random") The header [`<random>`](header/random "cpp/header/random") defines [pseudo-random number generators and numerical distributions](numeric/random "cpp/numeric/random"). The header [`<cstdlib>`](header/cstdlib "cpp/header/cstdlib") also includes C-style random number generation via `[std::srand](numeric/random/srand "cpp/numeric/random/srand")` and `[std::rand](numeric/random/rand "cpp/numeric/random/rand")`. ### [Floating-point environment](numeric/fenv "cpp/numeric/fenv") (since C++11) The header [`<cfenv>`](header/cfenv "cpp/header/cfenv") defines [flags and functions related to exceptional floating-point state](numeric/fenv "cpp/numeric/fenv"), such as overflow and division by zero. ### Bit manipulation (since C++20) The header [`<bit>`](header/bit "cpp/header/bit") provides several function templates to access, manipulate, and process individual bits and bit sequences. | Defined in header `[<bit>](header/bit "cpp/header/bit")` | | --- | | Defined in namespace `std` | | [bit\_cast](numeric/bit_cast "cpp/numeric/bit cast") (C++20) | reinterpret the object representation of one type as that of another (function template) | | [byteswap](numeric/byteswap "cpp/numeric/byteswap") (C++23) | reverses the bytes in the given integer value (function template) | | [has\_single\_bit](numeric/has_single_bit "cpp/numeric/has single bit") (C++20) | checks if a number is an integral power of two (function template) | | [bit\_ceil](numeric/bit_ceil "cpp/numeric/bit ceil") (C++20) | finds the smallest integral power of two not less than the given value (function template) | | [bit\_floor](numeric/bit_floor "cpp/numeric/bit floor") (C++20) | finds the largest integral power of two not greater than the given value (function template) | | [bit\_width](numeric/bit_width "cpp/numeric/bit width") (C++20) | finds the smallest number of bits needed to represent the given value (function template) | | [rotl](numeric/rotl "cpp/numeric/rotl") (C++20) | computes the result of bitwise left-rotation (function template) | | [rotr](numeric/rotr "cpp/numeric/rotr") (C++20) | computes the result of bitwise right-rotation (function template) | | [countl\_zero](numeric/countl_zero "cpp/numeric/countl zero") (C++20) | counts the number of consecutive 0 bits, starting from the most significant bit (function template) | | [countl\_one](numeric/countl_one "cpp/numeric/countl one") (C++20) | counts the number of consecutive 1 bits, starting from the most significant bit (function template) | | [countr\_zero](numeric/countr_zero "cpp/numeric/countr zero") (C++20) | counts the number of consecutive 0 bits, starting from the least significant bit (function template) | | [countr\_one](numeric/countr_one "cpp/numeric/countr one") (C++20) | counts the number of consecutive 1 bits, starting from the least significant bit (function template) | | [popcount](numeric/popcount "cpp/numeric/popcount") (C++20) | counts the number of 1 bits in an unsigned integer (function template) | | [endian](types/endian "cpp/types/endian") (C++20) | indicates the endianness of scalar types (enum) | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/numeric "c/numeric") for Numerics | cpp Regular expressions library (since C++11) Regular expressions library (since C++11) ========================================= The regular expressions library provides a class that represents [regular 
expressions](https://en.wikipedia.org/wiki/Regular_expression "enwiki:Regular expression"), which are a kind of mini-language used to perform pattern matching within strings. Almost all operations with regexes can be characterized by operating on several of the following objects: * **Target sequence**. The character sequence that is searched for a pattern. This may be a range specified by two iterators, a null-terminated character string or a `[std::string](string/basic_string "cpp/string/basic string")`. * **Pattern**. This is the regular expression itself. It determines what constitutes a match. It is an object of type `[std::basic\_regex](regex/basic_regex "cpp/regex/basic regex")`, constructed from a string with special syntax. See `[regex\_constants::syntax\_option\_type](regex/syntax_option_type "cpp/regex/syntax option type")` for the description of supported syntax variations. * **Matched array**. The information about matches may be retrieved as an object of type `[std::match\_results](regex/match_results "cpp/regex/match results")`. * **Replacement string**. This is a string that determines how to replace the matches, see `[regex\_constants::match\_flag\_type](regex/match_flag_type "cpp/regex/match flag type")` for the description of supported syntax variations. ### Main classes These classes encapsulate a regular expression and the results of matching a regular expression within a target sequence of characters. | | | | --- | --- | | [basic\_regex](regex/basic_regex "cpp/regex/basic regex") (C++11) | regular expression object (class template) | | [sub\_match](regex/sub_match "cpp/regex/sub match") (C++11) | identifies the sequence of characters matched by a sub-expression (class template) | | [match\_results](regex/match_results "cpp/regex/match results") (C++11) | identifies one regular expression match, including all sub-expression matches (class template) | ### Algorithms These functions are used to apply the regular expression encapsulated in a regex to a target sequence of characters. | | | | --- | --- | | [regex\_match](regex/regex_match "cpp/regex/regex match") (C++11) | attempts to match a regular expression to an entire character sequence (function template) | | [regex\_search](regex/regex_search "cpp/regex/regex search") (C++11) | attempts to match a regular expression to any part of a character sequence (function template) | | [regex\_replace](regex/regex_replace "cpp/regex/regex replace") (C++11) | replaces occurrences of a regular expression with formatted replacement text (function template) | ### Iterators The regex iterators are used to traverse the entire set of regular expression matches found within a sequence. | | | | --- | --- | | [regex\_iterator](regex/regex_iterator "cpp/regex/regex iterator") (C++11) | iterates through all regex matches within a character sequence (class template) | | [regex\_token\_iterator](regex/regex_token_iterator "cpp/regex/regex token iterator") (C++11) | iterates through the specified sub-expressions within all regex matches in a given string or through unmatched substrings (class template) | ### Exceptions This class defines the type of objects thrown as exceptions to report errors from the regular expressions library. | | | | --- | --- | | [regex\_error](regex/regex_error "cpp/regex/regex error") (C++11) | reports errors generated by the regular expressions library (class) | ### Traits The regex traits class is used to encapsulate the localizable aspects of a regex. 
| | | | --- | --- | | [regex\_traits](regex/regex_traits "cpp/regex/regex traits") (C++11) | provides metainformation about a character type, required by the regex library (class template) | ### Constants | Defined in namespace `std::regex_constants` | | --- | | [syntax\_option\_type](regex/syntax_option_type "cpp/regex/syntax option type") (C++11) | general options controlling regex behavior (typedef) | | [match\_flag\_type](regex/match_flag_type "cpp/regex/match flag type") (C++11) | options specific to matching (typedef) | | [error\_type](regex/error_type "cpp/regex/error type") (C++11) | describes different types of matching errors (typedef) | ### Example ``` #include <iostream> #include <iterator> #include <string> #include <regex> int main() { std::string s = "Some people, when confronted with a problem, think " "\"I know, I'll use regular expressions.\" " "Now they have two problems."; std::regex self_regex("REGULAR EXPRESSIONS", std::regex_constants::ECMAScript | std::regex_constants::icase); if (std::regex_search(s, self_regex)) { std::cout << "Text contains the phrase 'regular expressions'\n"; } std::regex word_regex("(\\w+)"); auto words_begin = std::sregex_iterator(s.begin(), s.end(), word_regex); auto words_end = std::sregex_iterator(); std::cout << "Found " << std::distance(words_begin, words_end) << " words\n"; const int N = 6; std::cout << "Words longer than " << N << " characters:\n"; for (std::sregex_iterator i = words_begin; i != words_end; ++i) { std::smatch match = *i; std::string match_str = match.str(); if (match_str.size() > N) { std::cout << " " << match_str << '\n'; } } std::regex long_word_regex("(\\w{7,})"); std::string new_s = std::regex_replace(s, long_word_regex, "[$&]"); std::cout << new_s << '\n'; } ``` Output: ``` Text contains the phrase 'regular expressions' Found 20 words Words longer than 6 characters: confronted problem regular expressions problems Some people, when [confronted] with a [problem], think "I know, I'll use [regular] [expressions]." Now they have two [problems]. ```
programming_docs
cpp Algorithms library Algorithms library ================== The algorithms library defines functions for a variety of purposes (e.g. searching, sorting, counting, manipulating) that operate on ranges of elements. Note that a range is defined as `[first, last)` where `last` refers to the element *past* the last element to inspect or modify. | | | | --- | --- | | [Constrained algorithms](algorithm/ranges "cpp/algorithm/ranges") C++20 provides [constrained](language/constraints "cpp/language/constraints") versions of most algorithms in the namespace `std::ranges`. In these algorithms, a range can be specified as either an [iterator](iterator/input_or_output_iterator "cpp/iterator/input or output iterator")-[sentinel](iterator/sentinel_for "cpp/iterator/sentinel for") pair or as a single [`range`](ranges/range "cpp/ranges/range") argument, and projections and pointer-to-member callables are supported. Additionally, the [return types](algorithm/ranges#Return_types "cpp/algorithm/ranges") of most algorithms have been changed to return all potentially useful information computed during the execution of the algorithm. ``` std::vector<int> v = {7, 1, 4, 0, -1}; std::ranges::sort(v); // constrained algorithm ``` | (since C++20) | | | | | | | | | | | | | | | | | | | | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | Execution policies Most algorithms have overloads that accept execution policies. The standard library algorithms support several [execution policies](algorithm/execution_policy_tag_t "cpp/algorithm/execution policy tag t"), and the library provides corresponding execution policy types and objects. Users may select an execution policy statically by invoking a parallel algorithm with an [execution policy object](algorithm/execution_policy_tag "cpp/algorithm/execution policy tag") of the corresponding type. Standard library implementations (but not the users) may define additional execution policies as an extension. The semantics of parallel algorithms invoked with an execution policy object of implementation-defined type is implementation-defined. Parallel versions of algorithms (except for `[std::for\_each](algorithm/for_each "cpp/algorithm/for each")` and `[std::for\_each\_n](algorithm/for_each_n "cpp/algorithm/for each n")`) are allowed to make arbitrary copies of elements from ranges, as long as both `[std::is\_trivially\_copy\_constructible\_v](http://en.cppreference.com/w/cpp/types/is_copy_constructible)<T>` and `[std::is\_trivially\_destructible\_v](http://en.cppreference.com/w/cpp/types/is_destructible)<T>` are `true`, where `T` is the type of elements. A brief usage sketch appears at the end of this page, after the C library section. 
| Defined in header `[<execution>](header/execution "cpp/header/execution")` | | --- | | Defined in namespace `std::execution` | | [sequenced\_policyparallel\_policyparallel\_unsequenced\_policyunsequenced\_policy](algorithm/execution_policy_tag_t "cpp/algorithm/execution policy tag t") (C++17)(C++17)(C++17)(C++20) | execution policy types (class) | | [seqparpar\_unsequnseq](algorithm/execution_policy_tag "cpp/algorithm/execution policy tag") (C++17)(C++17)(C++17)(C++20) | global execution policy objects (constant) | | Defined in namespace `std` | | [is\_execution\_policy](algorithm/is_execution_policy "cpp/algorithm/is execution policy") (C++17) | test whether a class represents an execution policy (class template) | | [Feature-test](utility/feature_test "cpp/utility/feature test") macro | Comment | | --- | --- | | [`__cpp_lib_parallel_algorithm`](feature_test#Library_features "cpp/feature test") | for parallel version of algorithms | | [Feature-test](utility/feature_test "cpp/utility/feature test") macro | Comment | | --- | --- | | [`__cpp_lib_execution`](feature_test#Library_features "cpp/feature test") | for execution policies | | (since C++17) | | | | --- | | Non-modifying sequence operations | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [all\_ofany\_ofnone\_of](algorithm/all_any_none_of "cpp/algorithm/all any none of") (C++11)(C++11)(C++11) | checks if a predicate is `true` for all, any or none of the elements in a range (function template) | | [ranges::all\_ofranges::any\_ofranges::none\_of](algorithm/ranges/all_any_none_of "cpp/algorithm/ranges/all any none of") (C++20)(C++20)(C++20) | checks if a predicate is `true` for all, any or none of the elements in a range (niebloid) | | [for\_each](algorithm/for_each "cpp/algorithm/for each") | applies a function to a range of elements (function template) | | [ranges::for\_each](algorithm/ranges/for_each "cpp/algorithm/ranges/for each") (C++20) | applies a function to a range of elements (niebloid) | | [for\_each\_n](algorithm/for_each_n "cpp/algorithm/for each n") (C++17) | applies a function object to the first n elements of a sequence (function template) | | [ranges::for\_each\_n](algorithm/ranges/for_each_n "cpp/algorithm/ranges/for each n") (C++20) | applies a function object to the first n elements of a sequence (niebloid) | | [countcount\_if](algorithm/count "cpp/algorithm/count") | returns the number of elements satisfying specific criteria (function template) | | [ranges::countranges::count\_if](algorithm/ranges/count "cpp/algorithm/ranges/count") (C++20)(C++20) | returns the number of elements satisfying specific criteria (niebloid) | | [mismatch](algorithm/mismatch "cpp/algorithm/mismatch") | finds the first position where two ranges differ (function template) | | [ranges::mismatch](algorithm/ranges/mismatch "cpp/algorithm/ranges/mismatch") (C++20) | finds the first position where two ranges differ (niebloid) | | [findfind\_iffind\_if\_not](algorithm/find "cpp/algorithm/find") (C++11) | finds the first element satisfying specific criteria (function template) | | [ranges::findranges::find\_ifranges::find\_if\_not](algorithm/ranges/find "cpp/algorithm/ranges/find") (C++20)(C++20)(C++20) | finds the first element satisfying specific criteria (niebloid) | | [ranges::find\_lastranges::find\_last\_ifranges::find\_last\_if\_not](algorithm/ranges/find_last "cpp/algorithm/ranges/find last") (C++23)(C++23)(C++23) | finds the last element satisfying specific criteria (niebloid) | | 
[find\_end](algorithm/find_end "cpp/algorithm/find end") | finds the last sequence of elements in a certain range (function template) | | [ranges::find\_end](algorithm/ranges/find_end "cpp/algorithm/ranges/find end") (C++20) | finds the last sequence of elements in a certain range (niebloid) | | [find\_first\_of](algorithm/find_first_of "cpp/algorithm/find first of") | searches for any one of a set of elements (function template) | | [ranges::find\_first\_of](algorithm/ranges/find_first_of "cpp/algorithm/ranges/find first of") (C++20) | searches for any one of a set of elements (niebloid) | | [adjacent\_find](algorithm/adjacent_find "cpp/algorithm/adjacent find") | finds the first two adjacent items that are equal (or satisfy a given predicate) (function template) | | [ranges::adjacent\_find](algorithm/ranges/adjacent_find "cpp/algorithm/ranges/adjacent find") (C++20) | finds the first two adjacent items that are equal (or satisfy a given predicate) (niebloid) | | [search](algorithm/search "cpp/algorithm/search") | searches for a range of elements (function template) | | [ranges::search](algorithm/ranges/search "cpp/algorithm/ranges/search") (C++20) | searches for a range of elements (niebloid) | | [search\_n](algorithm/search_n "cpp/algorithm/search n") | searches a range for a number of consecutive copies of an element (function template) | | [ranges::search\_n](algorithm/ranges/search_n "cpp/algorithm/ranges/search n") (C++20) | searches for a number of consecutive copies of an element in a range (niebloid) | | [ranges::starts\_with](algorithm/ranges/starts_with "cpp/algorithm/ranges/starts with") (C++23) | checks whether a range starts with another range (niebloid) | | [ranges::ends\_with](algorithm/ranges/ends_with "cpp/algorithm/ranges/ends with") (C++23) | checks whether a range ends with another range (niebloid) | | Modifying sequence operations | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [copycopy\_if](algorithm/copy "cpp/algorithm/copy") (C++11) | copies a range of elements to a new location (function template) | | [ranges::copyranges::copy\_if](algorithm/ranges/copy "cpp/algorithm/ranges/copy") (C++20)(C++20) | copies a range of elements to a new location (niebloid) | | [copy\_n](algorithm/copy_n "cpp/algorithm/copy n") (C++11) | copies a number of elements to a new location (function template) | | [ranges::copy\_n](algorithm/ranges/copy_n "cpp/algorithm/ranges/copy n") (C++20) | copies a number of elements to a new location (niebloid) | | [copy\_backward](algorithm/copy_backward "cpp/algorithm/copy backward") | copies a range of elements in backwards order (function template) | | [ranges::copy\_backward](algorithm/ranges/copy_backward "cpp/algorithm/ranges/copy backward") (C++20) | copies a range of elements in backwards order (niebloid) | | [move](algorithm/move "cpp/algorithm/move") (C++11) | moves a range of elements to a new location (function template) | | [ranges::move](algorithm/ranges/move "cpp/algorithm/ranges/move") (C++20) | moves a range of elements to a new location (niebloid) | | [move\_backward](algorithm/move_backward "cpp/algorithm/move backward") (C++11) | moves a range of elements to a new location in backwards order (function template) | | [ranges::move\_backward](algorithm/ranges/move_backward "cpp/algorithm/ranges/move backward") (C++20) | moves a range of elements to a new location in backwards order (niebloid) | | [fill](algorithm/fill "cpp/algorithm/fill") | copy-assigns the given value to every element in a range 
(function template) | | [ranges::fill](algorithm/ranges/fill "cpp/algorithm/ranges/fill") (C++20) | assigns a range of elements a certain value (niebloid) | | [fill\_n](algorithm/fill_n "cpp/algorithm/fill n") | copy-assigns the given value to N elements in a range (function template) | | [ranges::fill\_n](algorithm/ranges/fill_n "cpp/algorithm/ranges/fill n") (C++20) | assigns a value to a number of elements (niebloid) | | [transform](algorithm/transform "cpp/algorithm/transform") | applies a function to a range of elements, storing results in a destination range (function template) | | [ranges::transform](algorithm/ranges/transform "cpp/algorithm/ranges/transform") (C++20) | applies a function to a range of elements (niebloid) | | [generate](algorithm/generate "cpp/algorithm/generate") | assigns the results of successive function calls to every element in a range (function template) | | [ranges::generate](algorithm/ranges/generate "cpp/algorithm/ranges/generate") (C++20) | saves the result of a function in a range (niebloid) | | [generate\_n](algorithm/generate_n "cpp/algorithm/generate n") | assigns the results of successive function calls to N elements in a range (function template) | | [ranges::generate\_n](algorithm/ranges/generate_n "cpp/algorithm/ranges/generate n") (C++20) | saves the result of N applications of a function (niebloid) | | [removeremove\_if](algorithm/remove "cpp/algorithm/remove") | removes elements satisfying specific criteria (function template) | | [ranges::removeranges::remove\_if](algorithm/ranges/remove "cpp/algorithm/ranges/remove") (C++20)(C++20) | removes elements satisfying specific criteria (niebloid) | | [remove\_copyremove\_copy\_if](algorithm/remove_copy "cpp/algorithm/remove copy") | copies a range of elements omitting those that satisfy specific criteria (function template) | | [ranges::remove\_copyranges::remove\_copy\_if](algorithm/ranges/remove_copy "cpp/algorithm/ranges/remove copy") (C++20)(C++20) | copies a range of elements omitting those that satisfy specific criteria (niebloid) | | [replacereplace\_if](algorithm/replace "cpp/algorithm/replace") | replaces all values satisfying specific criteria with another value (function template) | | [ranges::replaceranges::replace\_if](algorithm/ranges/replace "cpp/algorithm/ranges/replace") (C++20)(C++20) | replaces all values satisfying specific criteria with another value (niebloid) | | [replace\_copyreplace\_copy\_if](algorithm/replace_copy "cpp/algorithm/replace copy") | copies a range, replacing elements satisfying specific criteria with another value (function template) | | [ranges::replace\_copyranges::replace\_copy\_if](algorithm/ranges/replace_copy "cpp/algorithm/ranges/replace copy") (C++20)(C++20) | copies a range, replacing elements satisfying specific criteria with another value (niebloid) | | [swap](algorithm/swap "cpp/algorithm/swap") | swaps the values of two objects (function template) | | [swap\_ranges](algorithm/swap_ranges "cpp/algorithm/swap ranges") | swaps two ranges of elements (function template) | | [ranges::swap\_ranges](algorithm/ranges/swap_ranges "cpp/algorithm/ranges/swap ranges") (C++20) | swaps two ranges of elements (niebloid) | | [iter\_swap](algorithm/iter_swap "cpp/algorithm/iter swap") | swaps the elements pointed to by two iterators (function template) | | [reverse](algorithm/reverse "cpp/algorithm/reverse") | reverses the order of elements in a range (function template) | | [ranges::reverse](algorithm/ranges/reverse "cpp/algorithm/ranges/reverse") (C++20) | 
reverses the order of elements in a range (niebloid) | | [reverse\_copy](algorithm/reverse_copy "cpp/algorithm/reverse copy") | creates a copy of a range that is reversed (function template) | | [ranges::reverse\_copy](algorithm/ranges/reverse_copy "cpp/algorithm/ranges/reverse copy") (C++20) | creates a copy of a range that is reversed (niebloid) | | [rotate](algorithm/rotate "cpp/algorithm/rotate") | rotates the order of elements in a range (function template) | | [ranges::rotate](algorithm/ranges/rotate "cpp/algorithm/ranges/rotate") (C++20) | rotates the order of elements in a range (niebloid) | | [rotate\_copy](algorithm/rotate_copy "cpp/algorithm/rotate copy") | copies and rotates a range of elements (function template) | | [ranges::rotate\_copy](algorithm/ranges/rotate_copy "cpp/algorithm/ranges/rotate copy") (C++20) | copies and rotates a range of elements (niebloid) | | [shift\_leftshift\_right](algorithm/shift "cpp/algorithm/shift") (C++20) | shifts elements in a range (function template) | | [ranges::shift\_leftranges::shift\_right](algorithm/ranges/shift "cpp/algorithm/ranges/shift") (C++23) | shifts elements in a range (niebloid) | | [random\_shuffleshuffle](algorithm/random_shuffle "cpp/algorithm/random shuffle") (until C++17)(C++11) | randomly re-orders elements in a range (function template) | | [ranges::shuffle](algorithm/ranges/shuffle "cpp/algorithm/ranges/shuffle") (C++20) | randomly re-orders elements in a range (niebloid) | | [sample](algorithm/sample "cpp/algorithm/sample") (C++17) | selects n random elements from a sequence (function template) | | [ranges::sample](algorithm/ranges/sample "cpp/algorithm/ranges/sample") (C++20) | selects n random elements from a sequence (niebloid) | | [unique](algorithm/unique "cpp/algorithm/unique") | removes consecutive duplicate elements in a range (function template) | | [ranges::unique](algorithm/ranges/unique "cpp/algorithm/ranges/unique") (C++20) | removes consecutive duplicate elements in a range (niebloid) | | [unique\_copy](algorithm/unique_copy "cpp/algorithm/unique copy") | creates a copy of some range of elements that contains no consecutive duplicates (function template) | | [ranges::unique\_copy](algorithm/ranges/unique_copy "cpp/algorithm/ranges/unique copy") (C++20) | creates a copy of some range of elements that contains no consecutive duplicates (niebloid) | | Partitioning operations | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [is\_partitioned](algorithm/is_partitioned "cpp/algorithm/is partitioned") (C++11) | determines if the range is partitioned by the given predicate (function template) | | [ranges::is\_partitioned](algorithm/ranges/is_partitioned "cpp/algorithm/ranges/is partitioned") (C++20) | determines if the range is partitioned by the given predicate (niebloid) | | [partition](algorithm/partition "cpp/algorithm/partition") | divides a range of elements into two groups (function template) | | [ranges::partition](algorithm/ranges/partition "cpp/algorithm/ranges/partition") (C++20) | divides a range of elements into two groups (niebloid) | | [partition\_copy](algorithm/partition_copy "cpp/algorithm/partition copy") (C++11) | copies a range dividing the elements into two groups (function template) | | [ranges::partition\_copy](algorithm/ranges/partition_copy "cpp/algorithm/ranges/partition copy") (C++20) | copies a range dividing the elements into two groups (niebloid) | | [stable\_partition](algorithm/stable_partition "cpp/algorithm/stable partition") | divides elements 
into two groups while preserving their relative order (function template) | | [ranges::stable\_partition](algorithm/ranges/stable_partition "cpp/algorithm/ranges/stable partition") (C++20) | divides elements into two groups while preserving their relative order (niebloid) | | [partition\_point](algorithm/partition_point "cpp/algorithm/partition point") (C++11) | locates the partition point of a partitioned range (function template) | | [ranges::partition\_point](algorithm/ranges/partition_point "cpp/algorithm/ranges/partition point") (C++20) | locates the partition point of a partitioned range (niebloid) | | Sorting operations | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [is\_sorted](algorithm/is_sorted "cpp/algorithm/is sorted") (C++11) | checks whether a range is sorted into ascending order (function template) | | [ranges::is\_sorted](algorithm/ranges/is_sorted "cpp/algorithm/ranges/is sorted") (C++20) | checks whether a range is sorted into ascending order (niebloid) | | [is\_sorted\_until](algorithm/is_sorted_until "cpp/algorithm/is sorted until") (C++11) | finds the largest sorted subrange (function template) | | [ranges::is\_sorted\_until](algorithm/ranges/is_sorted_until "cpp/algorithm/ranges/is sorted until") (C++20) | finds the largest sorted subrange (niebloid) | | [sort](algorithm/sort "cpp/algorithm/sort") | sorts a range into ascending order (function template) | | [ranges::sort](algorithm/ranges/sort "cpp/algorithm/ranges/sort") (C++20) | sorts a range into ascending order (niebloid) | | [partial\_sort](algorithm/partial_sort "cpp/algorithm/partial sort") | sorts the first N elements of a range (function template) | | [ranges::partial\_sort](algorithm/ranges/partial_sort "cpp/algorithm/ranges/partial sort") (C++20) | sorts the first N elements of a range (niebloid) | | [partial\_sort\_copy](algorithm/partial_sort_copy "cpp/algorithm/partial sort copy") | copies and partially sorts a range of elements (function template) | | [ranges::partial\_sort\_copy](algorithm/ranges/partial_sort_copy "cpp/algorithm/ranges/partial sort copy") (C++20) | copies and partially sorts a range of elements (niebloid) | | [stable\_sort](algorithm/stable_sort "cpp/algorithm/stable sort") | sorts a range of elements while preserving order between equal elements (function template) | | [ranges::stable\_sort](algorithm/ranges/stable_sort "cpp/algorithm/ranges/stable sort") (C++20) | sorts a range of elements while preserving order between equal elements (niebloid) | | [nth\_element](algorithm/nth_element "cpp/algorithm/nth element") | partially sorts the given range making sure that it is partitioned by the given element (function template) | | [ranges::nth\_element](algorithm/ranges/nth_element "cpp/algorithm/ranges/nth element") (C++20) | partially sorts the given range making sure that it is partitioned by the given element (niebloid) | | Binary search operations (on sorted ranges) | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [lower\_bound](algorithm/lower_bound "cpp/algorithm/lower bound") | returns an iterator to the first element *not less* than the given value (function template) | | [ranges::lower\_bound](algorithm/ranges/lower_bound "cpp/algorithm/ranges/lower bound") (C++20) | returns an iterator to the first element *not less* than the given value (niebloid) | | [upper\_bound](algorithm/upper_bound "cpp/algorithm/upper bound") | returns an iterator to the first element *greater* than a certain value (function 
template) | | [ranges::upper\_bound](algorithm/ranges/upper_bound "cpp/algorithm/ranges/upper bound") (C++20) | returns an iterator to the first element *greater* than a certain value (niebloid) | | [binary\_search](algorithm/binary_search "cpp/algorithm/binary search") | determines if an element exists in a partially-ordered range (function template) | | [ranges::binary\_search](algorithm/ranges/binary_search "cpp/algorithm/ranges/binary search") (C++20) | determines if an element exists in a partially-ordered range (niebloid) | | [equal\_range](algorithm/equal_range "cpp/algorithm/equal range") | returns range of elements matching a specific key (function template) | | [ranges::equal\_range](algorithm/ranges/equal_range "cpp/algorithm/ranges/equal range") (C++20) | returns range of elements matching a specific key (niebloid) | | Other operations on sorted ranges | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [merge](algorithm/merge "cpp/algorithm/merge") | merges two sorted ranges (function template) | | [ranges::merge](algorithm/ranges/merge "cpp/algorithm/ranges/merge") (C++20) | merges two sorted ranges (niebloid) | | [inplace\_merge](algorithm/inplace_merge "cpp/algorithm/inplace merge") | merges two ordered ranges in-place (function template) | | [ranges::inplace\_merge](algorithm/ranges/inplace_merge "cpp/algorithm/ranges/inplace merge") (C++20) | merges two ordered ranges in-place (niebloid) | | Set operations (on sorted ranges) | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [includes](algorithm/includes "cpp/algorithm/includes") | returns true if one sequence is a subsequence of another (function template) | | [ranges::includes](algorithm/ranges/includes "cpp/algorithm/ranges/includes") (C++20) | returns true if one sequence is a subsequence of another (niebloid) | | [set\_difference](algorithm/set_difference "cpp/algorithm/set difference") | computes the difference between two sets (function template) | | [ranges::set\_difference](algorithm/ranges/set_difference "cpp/algorithm/ranges/set difference") (C++20) | computes the difference between two sets (niebloid) | | [set\_intersection](algorithm/set_intersection "cpp/algorithm/set intersection") | computes the intersection of two sets (function template) | | [ranges::set\_intersection](algorithm/ranges/set_intersection "cpp/algorithm/ranges/set intersection") (C++20) | computes the intersection of two sets (niebloid) | | [set\_symmetric\_difference](algorithm/set_symmetric_difference "cpp/algorithm/set symmetric difference") | computes the symmetric difference between two sets (function template) | | [ranges::set\_symmetric\_difference](algorithm/ranges/set_symmetric_difference "cpp/algorithm/ranges/set symmetric difference") (C++20) | computes the symmetric difference between two sets (niebloid) | | [set\_union](algorithm/set_union "cpp/algorithm/set union") | computes the union of two sets (function template) | | [ranges::set\_union](algorithm/ranges/set_union "cpp/algorithm/ranges/set union") (C++20) | computes the union of two sets (niebloid) | | Heap operations | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [is\_heap](algorithm/is_heap "cpp/algorithm/is heap") (C++11) | checks if the given range is a max heap (function template) | | [ranges::is\_heap](algorithm/ranges/is_heap "cpp/algorithm/ranges/is heap") (C++20) | checks if the given range is a max heap (niebloid) | | [is\_heap\_until](algorithm/is_heap_until 
"cpp/algorithm/is heap until") (C++11) | finds the largest subrange that is a max heap (function template) | | [ranges::is\_heap\_until](algorithm/ranges/is_heap_until "cpp/algorithm/ranges/is heap until") (C++20) | finds the largest subrange that is a max heap (niebloid) | | [make\_heap](algorithm/make_heap "cpp/algorithm/make heap") | creates a max heap out of a range of elements (function template) | | [ranges::make\_heap](algorithm/ranges/make_heap "cpp/algorithm/ranges/make heap") (C++20) | creates a max heap out of a range of elements (niebloid) | | [push\_heap](algorithm/push_heap "cpp/algorithm/push heap") | adds an element to a max heap (function template) | | [ranges::push\_heap](algorithm/ranges/push_heap "cpp/algorithm/ranges/push heap") (C++20) | adds an element to a max heap (niebloid) | | [pop\_heap](algorithm/pop_heap "cpp/algorithm/pop heap") | removes the largest element from a max heap (function template) | | [ranges::pop\_heap](algorithm/ranges/pop_heap "cpp/algorithm/ranges/pop heap") (C++20) | removes the largest element from a max heap (niebloid) | | [sort\_heap](algorithm/sort_heap "cpp/algorithm/sort heap") | turns a max heap into a range of elements sorted in ascending order (function template) | | [ranges::sort\_heap](algorithm/ranges/sort_heap "cpp/algorithm/ranges/sort heap") (C++20) | turns a max heap into a range of elements sorted in ascending order (niebloid) | | Minimum/maximum operations | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [max](algorithm/max "cpp/algorithm/max") | returns the greater of the given values (function template) | | [ranges::max](algorithm/ranges/max "cpp/algorithm/ranges/max") (C++20) | returns the greater of the given values (niebloid) | | [max\_element](algorithm/max_element "cpp/algorithm/max element") | returns the largest element in a range (function template) | | [ranges::max\_element](algorithm/ranges/max_element "cpp/algorithm/ranges/max element") (C++20) | returns the largest element in a range (niebloid) | | [min](algorithm/min "cpp/algorithm/min") | returns the smaller of the given values (function template) | | [ranges::min](algorithm/ranges/min "cpp/algorithm/ranges/min") (C++20) | returns the smaller of the given values (niebloid) | | [min\_element](algorithm/min_element "cpp/algorithm/min element") | returns the smallest element in a range (function template) | | [ranges::min\_element](algorithm/ranges/min_element "cpp/algorithm/ranges/min element") (C++20) | returns the smallest element in a range (niebloid) | | [minmax](algorithm/minmax "cpp/algorithm/minmax") (C++11) | returns the smaller and larger of two elements (function template) | | [ranges::minmax](algorithm/ranges/minmax "cpp/algorithm/ranges/minmax") (C++20) | returns the smaller and larger of two elements (niebloid) | | [minmax\_element](algorithm/minmax_element "cpp/algorithm/minmax element") (C++11) | returns the smallest and the largest elements in a range (function template) | | [ranges::minmax\_element](algorithm/ranges/minmax_element "cpp/algorithm/ranges/minmax element") (C++20) | returns the smallest and the largest elements in a range (niebloid) | | [clamp](algorithm/clamp "cpp/algorithm/clamp") (C++17) | clamps a value between a pair of boundary values (function template) | | [ranges::clamp](algorithm/ranges/clamp "cpp/algorithm/ranges/clamp") (C++20) | clamps a value between a pair of boundary values (niebloid) | | Comparison operations | | Defined in header `[<algorithm>](header/algorithm 
"cpp/header/algorithm")` | | [equal](algorithm/equal "cpp/algorithm/equal") | determines if two sets of elements are the same (function template) | | [ranges::equal](algorithm/ranges/equal "cpp/algorithm/ranges/equal") (C++20) | determines if two sets of elements are the same (niebloid) | | [lexicographical\_compare](algorithm/lexicographical_compare "cpp/algorithm/lexicographical compare") | returns true if one range is lexicographically less than another (function template) | | [ranges::lexicographical\_compare](algorithm/ranges/lexicographical_compare "cpp/algorithm/ranges/lexicographical compare") (C++20) | returns true if one range is lexicographically less than another (niebloid) | | [lexicographical\_compare\_three\_way](algorithm/lexicographical_compare_three_way "cpp/algorithm/lexicographical compare three way") (C++20) | compares two ranges using three-way comparison (function template) | | Permutation operations | | Defined in header `[<algorithm>](header/algorithm "cpp/header/algorithm")` | | [is\_permutation](algorithm/is_permutation "cpp/algorithm/is permutation") (C++11) | determines if a sequence is a permutation of another sequence (function template) | | [ranges::is\_permutation](algorithm/ranges/is_permutation "cpp/algorithm/ranges/is permutation") (C++20) | determines if a sequence is a permutation of another sequence (niebloid) | | [next\_permutation](algorithm/next_permutation "cpp/algorithm/next permutation") | generates the next greater lexicographic permutation of a range of elements (function template) | | [ranges::next\_permutation](algorithm/ranges/next_permutation "cpp/algorithm/ranges/next permutation") (C++20) | generates the next greater lexicographic permutation of a range of elements (niebloid) | | [prev\_permutation](algorithm/prev_permutation "cpp/algorithm/prev permutation") | generates the next smaller lexicographic permutation of a range of elements (function template) | | [ranges::prev\_permutation](algorithm/ranges/prev_permutation "cpp/algorithm/ranges/prev permutation") (C++20) | generates the next smaller lexicographic permutation of a range of elements (niebloid) | | Numeric operations | | Defined in header `[<numeric>](header/numeric "cpp/header/numeric")` | | [iota](algorithm/iota "cpp/algorithm/iota") (C++11) | fills a range with successive increments of the starting value (function template) | | [ranges::iota](algorithm/ranges/iota "cpp/algorithm/ranges/iota") (C++23) | fills a range with successive increments of the starting value (niebloid) | | [accumulate](algorithm/accumulate "cpp/algorithm/accumulate") | sums up a range of elements (function template) | | [inner\_product](algorithm/inner_product "cpp/algorithm/inner product") | computes the inner product of two ranges of elements (function template) | | [adjacent\_difference](algorithm/adjacent_difference "cpp/algorithm/adjacent difference") | computes the differences between adjacent elements in a range (function template) | | [partial\_sum](algorithm/partial_sum "cpp/algorithm/partial sum") | computes the partial sum of a range of elements (function template) | | [reduce](algorithm/reduce "cpp/algorithm/reduce") (C++17) | similar to `[std::accumulate](algorithm/accumulate "cpp/algorithm/accumulate")`, except out of order (function template) | | [exclusive\_scan](algorithm/exclusive_scan "cpp/algorithm/exclusive scan") (C++17) | similar to `[std::partial\_sum](algorithm/partial_sum "cpp/algorithm/partial sum")`, excludes the ith input element from the ith sum (function template) | | 
[inclusive\_scan](algorithm/inclusive_scan "cpp/algorithm/inclusive scan") (C++17) | similar to `[std::partial\_sum](algorithm/partial_sum "cpp/algorithm/partial sum")`, includes the ith input element in the ith sum (function template) | | [transform\_reduce](algorithm/transform_reduce "cpp/algorithm/transform reduce") (C++17) | applies an invocable, then reduces out of order (function template) | | [transform\_exclusive\_scan](algorithm/transform_exclusive_scan "cpp/algorithm/transform exclusive scan") (C++17) | applies an invocable, then calculates exclusive scan (function template) | | [transform\_inclusive\_scan](algorithm/transform_inclusive_scan "cpp/algorithm/transform inclusive scan") (C++17) | applies an invocable, then calculates inclusive scan (function template) | | Operations on uninitialized memory | | Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | [uninitialized\_copy](memory/uninitialized_copy "cpp/memory/uninitialized copy") | copies a range of objects to an uninitialized area of memory (function template) | | [ranges::uninitialized\_copy](memory/ranges/uninitialized_copy "cpp/memory/ranges/uninitialized copy") (C++20) | copies a range of objects to an uninitialized area of memory (niebloid) | | [uninitialized\_copy\_n](memory/uninitialized_copy_n "cpp/memory/uninitialized copy n") (C++11) | copies a number of objects to an uninitialized area of memory (function template) | | [ranges::uninitialized\_copy\_n](memory/ranges/uninitialized_copy_n "cpp/memory/ranges/uninitialized copy n") (C++20) | copies a number of objects to an uninitialized area of memory (niebloid) | | [uninitialized\_fill](memory/uninitialized_fill "cpp/memory/uninitialized fill") | copies an object to an uninitialized area of memory, defined by a range (function template) | | [ranges::uninitialized\_fill](memory/ranges/uninitialized_fill "cpp/memory/ranges/uninitialized fill") (C++20) | copies an object to an uninitialized area of memory, defined by a range (niebloid) | | [uninitialized\_fill\_n](memory/uninitialized_fill_n "cpp/memory/uninitialized fill n") | copies an object to an uninitialized area of memory, defined by a start and a count (function template) | | [ranges::uninitialized\_fill\_n](memory/ranges/uninitialized_fill_n "cpp/memory/ranges/uninitialized fill n") (C++20) | copies an object to an uninitialized area of memory, defined by a start and a count (niebloid) | | [uninitialized\_move](memory/uninitialized_move "cpp/memory/uninitialized move") (C++17) | moves a range of objects to an uninitialized area of memory (function template) | | [ranges::uninitialized\_move](memory/ranges/uninitialized_move "cpp/memory/ranges/uninitialized move") (C++20) | moves a range of objects to an uninitialized area of memory (niebloid) | | [uninitialized\_move\_n](memory/uninitialized_move_n "cpp/memory/uninitialized move n") (C++17) | moves a number of objects to an uninitialized area of memory (function template) | | [ranges::uninitialized\_move\_n](memory/ranges/uninitialized_move_n "cpp/memory/ranges/uninitialized move n") (C++20) | moves a number of objects to an uninitialized area of memory (niebloid) | | [uninitialized\_default\_construct](memory/uninitialized_default_construct "cpp/memory/uninitialized default construct") (C++17) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a range (function template) | | 
[ranges::uninitialized\_default\_construct](memory/ranges/uninitialized_default_construct "cpp/memory/ranges/uninitialized default construct") (C++20) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a range (niebloid) | | [uninitialized\_default\_construct\_n](memory/uninitialized_default_construct_n "cpp/memory/uninitialized default construct n") (C++17) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a start and a count (function template) | | [ranges::uninitialized\_default\_construct\_n](memory/ranges/uninitialized_default_construct_n "cpp/memory/ranges/uninitialized default construct n") (C++20) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a start and count (niebloid) | | [uninitialized\_value\_construct](memory/uninitialized_value_construct "cpp/memory/uninitialized value construct") (C++17) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a range (function template) | | [ranges::uninitialized\_value\_construct](memory/ranges/uninitialized_value_construct "cpp/memory/ranges/uninitialized value construct") (C++20) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a range (niebloid) | | [uninitialized\_value\_construct\_n](memory/uninitialized_value_construct_n "cpp/memory/uninitialized value construct n") (C++17) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a start and a count (function template) | | [ranges::uninitialized\_value\_construct\_n](memory/ranges/uninitialized_value_construct_n "cpp/memory/ranges/uninitialized value construct n") (C++20) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a start and a count (niebloid) | | [destroy](memory/destroy "cpp/memory/destroy") (C++17) | destroys a range of objects (function template) | | [ranges::destroy](memory/ranges/destroy "cpp/memory/ranges/destroy") (C++20) | destroys a range of objects (niebloid) | | [destroy\_n](memory/destroy_n "cpp/memory/destroy n") (C++17) | destroys a number of objects in a range (function template) | | [ranges::destroy\_n](memory/ranges/destroy_n "cpp/memory/ranges/destroy n") (C++20) | destroys a number of objects in a range (niebloid) | | [destroy\_at](memory/destroy_at "cpp/memory/destroy at") (C++17) | destroys an object at a given address (function template) | | [ranges::destroy\_at](memory/ranges/destroy_at "cpp/memory/ranges/destroy at") (C++20) | destroys an object at a given address (niebloid) | | [construct\_at](memory/construct_at "cpp/memory/construct at") (C++20) | creates an object at a given address (function template) | | [ranges::construct\_at](memory/ranges/construct_at "cpp/memory/ranges/construct at") (C++20) | creates an object at a given address (niebloid) | | C library | | Defined in header `[<cstdlib>](header/cstdlib "cpp/header/cstdlib")` | | [qsort](algorithm/qsort 
"cpp/algorithm/qsort") | sorts a range of elements with unspecified type (function) | | [bsearch](algorithm/bsearch "cpp/algorithm/bsearch") | searches an array for an element of unspecified type (function) | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/algorithm "c/algorithm") for Algorithms |
programming_docs
cpp C++ Programming Language C++ Programming Language ======================== The interface of the C++ standard library is defined by the following collection of headers. | | | --- | | Concepts library | | [<concepts>](header/concepts "cpp/header/concepts") (C++20) | [Fundamental library concepts](concepts "cpp/concepts") | | Coroutines library | | [<coroutine>](header/coroutine "cpp/header/coroutine") (C++20) | [Coroutine support library](coroutine "cpp/coroutine") | | Utilities library | | [<any>](header/any "cpp/header/any") (C++17) | `[std::any](utility/any "cpp/utility/any")` class | | [<bitset>](header/bitset "cpp/header/bitset") | `[std::bitset](utility/bitset "cpp/utility/bitset")` class template | | [<chrono>](header/chrono "cpp/header/chrono") (C++11) | [C++ time utilities](chrono "cpp/chrono") | | [<compare>](header/compare "cpp/header/compare") (C++20) | [Three-way comparison operator](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") support | | [<csetjmp>](header/csetjmp "cpp/header/csetjmp") | [Macro (and function) that saves (and jumps) to an execution context](utility/program/setjmp "cpp/utility/program/setjmp") | | [<csignal>](header/csignal "cpp/header/csignal") | [Functions and macro constants for signal management](utility/program "cpp/utility/program") | | [<cstdarg>](header/cstdarg "cpp/header/cstdarg") | [Handling of variable length argument lists](utility/variadic "cpp/utility/variadic") | | [<cstddef>](header/cstddef "cpp/header/cstddef") | [Standard macros and typedefs](types "cpp/types") | | [<cstdlib>](header/cstdlib "cpp/header/cstdlib") | General purpose utilities: [program control](utility/program "cpp/utility/program"), [dynamic memory allocation](memory/c "cpp/memory/c"), [random numbers](numeric/random#C_random_library "cpp/numeric/random"), [sort and search](algorithm#C_library "cpp/algorithm") | | [<ctime>](header/ctime "cpp/header/ctime") | [C-style time/date utilities](chrono/c "cpp/chrono/c") | | [<expected>](header/expected "cpp/header/expected") (C++23) | `std::expected` class template | | [<functional>](header/functional "cpp/header/functional") | [Function objects, Function invocations, Bind operations and Reference wrappers](utility/functional "cpp/utility/functional") | | [<initializer\_list>](header/initializer_list "cpp/header/initializer list") (C++11) | `[std::initializer\_list](utility/initializer_list "cpp/utility/initializer list")` class template | | [<optional>](header/optional "cpp/header/optional") (C++17) | `[std::optional](utility/optional "cpp/utility/optional")` class template | | [<source\_location>](header/source_location "cpp/header/source location") (C++20) | Supplies means to obtain [source code location](utility/source_location "cpp/utility/source location") | | [<tuple>](header/tuple "cpp/header/tuple") (C++11) | `[std::tuple](utility/tuple "cpp/utility/tuple")` class template | | [<type\_traits>](header/type_traits "cpp/header/type traits") (C++11) | [Compile-time type information](types "cpp/types") | | [<typeindex>](header/typeindex "cpp/header/typeindex") (C++11) | `[std::type\_index](types/type_index "cpp/types/type index")` | | [<typeinfo>](header/typeinfo "cpp/header/typeinfo") | [Runtime type information utilities](types "cpp/types") | | [<utility>](header/utility "cpp/header/utility") | Various [utility components](utility "cpp/utility") | | [<variant>](header/variant "cpp/header/variant") (C++17) | `[std::variant](utility/variant "cpp/utility/variant")` class template | | 
[<version>](header/version "cpp/header/version") (C++20) | Supplies implementation-dependent library information | | Dynamic memory management | | [<memory>](header/memory "cpp/header/memory") | [High-level memory management utilities](memory "cpp/memory") | | [<memory\_resource>](header/memory_resource "cpp/header/memory resource") (C++17) | [Polymorphic allocators and memory resources](memory/memory_resource "cpp/memory/memory resource") | | [<new>](header/new "cpp/header/new") | [Low-level memory management utilities](memory/new "cpp/memory/new") | | [<scoped\_allocator>](header/scoped_allocator "cpp/header/scoped allocator") (C++11) | [Nested allocator class](memory/scoped_allocator_adaptor "cpp/memory/scoped allocator adaptor") | | Numeric limits | | [<cfloat>](header/cfloat "cpp/header/cfloat") | [Limits of floating-point types](types/climits "cpp/types/climits") | | [<cinttypes>](header/cinttypes "cpp/header/cinttypes") (C++11) | [Formatting macros](types/integer#Format_macro_constants "cpp/types/integer"), `intmax_t` and `uintmax_t` math and conversions | | [<climits>](header/climits "cpp/header/climits") | [Limits of integral types](types/climits "cpp/types/climits") | | [<cstdint>](header/cstdint "cpp/header/cstdint") (C++11) | [Fixed-width integer types](types/integer "cpp/types/integer") and [limits of other types](types/climits "cpp/types/climits") | | [<limits>](header/limits "cpp/header/limits") | [Uniform way to query properties of arithmetic types](types/numeric_limits "cpp/types/numeric limits") | | [<stdfloat>](header/stdfloat "cpp/header/stdfloat") (C++23) | Optional extended floating-point types | | Error handling | | [<cassert>](header/cassert "cpp/header/cassert") | [Conditionally compiled macro that compares its argument to zero](error/assert "cpp/error/assert") | | [<cerrno>](header/cerrno "cpp/header/cerrno") | [Macro containing the last error number](error/errno "cpp/error/errno") | | [<exception>](header/exception "cpp/header/exception") | [Exception handling utilities](error#Exception_handling "cpp/error") | | [<stacktrace>](header/stacktrace "cpp/header/stacktrace") (C++23) | [Stacktrace](utility/basic_stacktrace "cpp/utility/basic stacktrace") library | | [<stdexcept>](header/stdexcept "cpp/header/stdexcept") | [Standard exception objects](error#Exception_categories "cpp/error") | | [<system\_error>](header/system_error "cpp/header/system error") (C++11) | Defines `[std::error\_code](error/error_code "cpp/error/error code")`, a platform-dependent error code | | Strings library | | [<cctype>](header/cctype "cpp/header/cctype") | [Functions to determine the category of narrow characters](string/byte "cpp/string/byte") | | [<charconv>](header/charconv "cpp/header/charconv") (C++17) | `std::to_chars` and `std::from_chars` | | [<cstring>](header/cstring "cpp/header/cstring") | Various [narrow character string handling functions](string/byte "cpp/string/byte") | | [<cuchar>](header/cuchar "cpp/header/cuchar") (C++11) | C-style [Unicode character conversion functions](string/multibyte "cpp/string/multibyte") | | [<cwchar>](header/cwchar "cpp/header/cwchar") | Various [wide](string/wide "cpp/string/wide") and [multibyte](string/multibyte "cpp/string/multibyte") string handling functions | | [<cwctype>](header/cwctype "cpp/header/cwctype") | [Functions to determine the category of wide characters](string/wide "cpp/string/wide") | | [<format>](header/format "cpp/header/format") (C++20) | [Formatting library](utility/format "cpp/utility/format") including 
`[std::format](utility/format/format "cpp/utility/format/format")` | | [<string>](header/string "cpp/header/string") | `[std::basic\_string](string/basic_string "cpp/string/basic string")` class template | | [<string\_view>](header/string_view "cpp/header/string view") (C++17) | `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` class template | | Containers library | | [<array>](header/array "cpp/header/array") (C++11) | `[std::array](container/array "cpp/container/array")` container | | [<deque>](header/deque "cpp/header/deque") | `[std::deque](container/deque "cpp/container/deque")` container | | [<flat\_map>](header/flat_map "cpp/header/flat map") (C++23) | `std::flat_map` and `std::flat_multimap` container adaptors | | [<flat\_set>](header/flat_set "cpp/header/flat set") (C++23) | `std::flat_set` and `std::flat_multiset` container adaptors | | [<forward\_list>](header/forward_list "cpp/header/forward list") (C++11) | `[std::forward\_list](container/forward_list "cpp/container/forward list")` container | | [<list>](header/list "cpp/header/list") | `[std::list](container/list "cpp/container/list")` container | | [<map>](header/map "cpp/header/map") | `[std::map](container/map "cpp/container/map")` and `[std::multimap](container/multimap "cpp/container/multimap")` associative containers | | [<mdspan>](header/mdspan "cpp/header/mdspan") (C++23) | `std::mdspan` view | | [<queue>](header/queue "cpp/header/queue") | `[std::queue](container/queue "cpp/container/queue")` and `[std::priority\_queue](container/priority_queue "cpp/container/priority queue")` container adaptors | | [<set>](header/set "cpp/header/set") | `[std::set](container/set "cpp/container/set")` and `[std::multiset](container/multiset "cpp/container/multiset")` associative containers | | [<span>](header/span "cpp/header/span") (C++20) | `std::span` view | | [<stack>](header/stack "cpp/header/stack") | `[std::stack](container/stack "cpp/container/stack")` container adaptor | | [<unordered\_map>](header/unordered_map "cpp/header/unordered map") (C++11) | `[std::unordered\_map](container/unordered_map "cpp/container/unordered map")` and `[std::unordered\_multimap](container/unordered_multimap "cpp/container/unordered multimap")` unordered associative containers | | [<unordered\_set>](header/unordered_set "cpp/header/unordered set") (C++11) | `[std::unordered\_set](container/unordered_set "cpp/container/unordered set")` and `[std::unordered\_multiset](container/unordered_multiset "cpp/container/unordered multiset")` unordered associative containers | | [<vector>](header/vector "cpp/header/vector") | `[std::vector](container/vector "cpp/container/vector")` container | | Iterators library | | [<iterator>](header/iterator "cpp/header/iterator") | [Range iterators](iterator "cpp/iterator") | | Ranges library | | [<generator>](header/generator "cpp/header/generator") (C++23) | `std::generator` class template | | [<ranges>](header/ranges "cpp/header/ranges") (C++20) | [Range access, primitives, requirements, utilities and adaptors](ranges "cpp/ranges") | | Algorithms library | | [<algorithm>](header/algorithm "cpp/header/algorithm") | [Algorithms that operate on ranges](algorithm "cpp/algorithm") | | [<execution>](header/execution "cpp/header/execution") (C++17) | Predefined execution policies for parallel versions of the algorithms | | Numerics library | | [<bit>](header/bit "cpp/header/bit") (C++20) | [Bit manipulation](numeric#Bit_manipulation_.28since_C.2B.2B20.29 "cpp/numeric") functions | | 
[<cfenv>](header/cfenv "cpp/header/cfenv") (C++11) | [Floating-point environment](numeric/fenv "cpp/numeric/fenv") access functions | | [<cmath>](header/cmath "cpp/header/cmath") | [Common mathematics functions](numeric/math "cpp/numeric/math") | | [<complex>](header/complex "cpp/header/complex") | [Complex number type](numeric/complex "cpp/numeric/complex") | | [<numbers>](header/numbers "cpp/header/numbers") (C++20) | [Math constants](numeric/constants "cpp/numeric/constants") | | [<numeric>](header/numeric "cpp/header/numeric") | [Numeric operations on values in ranges](numeric "cpp/numeric") | | [<random>](header/random "cpp/header/random") (C++11) | [Random number generators and distributions](numeric/random "cpp/numeric/random") | | [<ratio>](header/ratio "cpp/header/ratio") (C++11) | [Compile-time rational arithmetic](numeric/ratio "cpp/numeric/ratio") | | [<valarray>](header/valarray "cpp/header/valarray") | [Class for representing and manipulating arrays of values](numeric/valarray "cpp/numeric/valarray") | | Localization library | | [<clocale>](header/clocale "cpp/header/clocale") | [C localization utilities](locale#C_library_locales "cpp/locale") | | [<codecvt>](header/codecvt "cpp/header/codecvt") (C++11)(deprecated in C++17) | [Unicode conversion facilities](locale "cpp/locale") | | [<locale>](header/locale "cpp/header/locale") | [Localization utilities](locale "cpp/locale") | | Input/output library | | [<cstdio>](header/cstdio "cpp/header/cstdio") | [C-style input-output functions](io/c "cpp/io/c") | | [<fstream>](header/fstream "cpp/header/fstream") | `[std::basic\_fstream](io/basic_fstream "cpp/io/basic fstream")`, `[std::basic\_ifstream](io/basic_ifstream "cpp/io/basic ifstream")`, `[std::basic\_ofstream](io/basic_ofstream "cpp/io/basic ofstream")` class templates and several typedefs | | [<iomanip>](header/iomanip "cpp/header/iomanip") | [Helper functions to control the format of input and output](io/manip "cpp/io/manip") | | [<ios>](header/ios "cpp/header/ios") | `[std::ios\_base](io/ios_base "cpp/io/ios base")` class, `[std::basic\_ios](io/basic_ios "cpp/io/basic ios")` class template and several typedefs | | [<iosfwd>](header/iosfwd "cpp/header/iosfwd") | Forward declarations of all classes in the input/output library | | [<iostream>](header/iostream "cpp/header/iostream") | Several standard stream objects | | [<istream>](header/istream "cpp/header/istream") | `[std::basic\_istream](io/basic_istream "cpp/io/basic istream")` class template and several typedefs | | [<ostream>](header/ostream "cpp/header/ostream") | `[std::basic\_ostream](io/basic_ostream "cpp/io/basic ostream")`, `[std::basic\_iostream](io/basic_iostream "cpp/io/basic iostream")` class templates and several typedefs | | [<print>](header/print "cpp/header/print") (C++23) | Formatted output library including `std::print` | | [<spanstream>](header/spanstream "cpp/header/spanstream") (C++23) | `std::basic_spanstream`, `std::basic_ispanstream`, `std::basic_ospanstream` class templates and typedefs | | [<sstream>](header/sstream "cpp/header/sstream") | `[std::basic\_stringstream](io/basic_stringstream "cpp/io/basic stringstream")`, `[std::basic\_istringstream](io/basic_istringstream "cpp/io/basic istringstream")`, `[std::basic\_ostringstream](io/basic_ostringstream "cpp/io/basic ostringstream")` class templates and several typedefs | | [<streambuf>](header/streambuf "cpp/header/streambuf") | `[std::basic\_streambuf](io/basic_streambuf "cpp/io/basic streambuf")` class template | | [<strstream>](header/strstream 
"cpp/header/strstream") (deprecated in C++98) | `[std::strstream](io/strstream "cpp/io/strstream")`, `[std::istrstream](io/istrstream "cpp/io/istrstream")`, `[std::ostrstream](io/ostrstream "cpp/io/ostrstream")` | | [<syncstream>](header/syncstream "cpp/header/syncstream") (C++20) | `std::basic_osyncstream`, `std::basic_syncbuf`, and typedefs | | Filesystem library | | [<filesystem>](header/filesystem "cpp/header/filesystem") (C++17) | `std::path` class and [supporting functions](filesystem "cpp/filesystem") | | Regular Expressions library | | [<regex>](header/regex "cpp/header/regex") (C++11) | [Classes, algorithms and iterators to support regular expression processing](regex "cpp/regex") | | Atomic Operations library | | [<atomic>](header/atomic "cpp/header/atomic") (C++11) | [Atomic operations library](thread#Atomic_operations "cpp/thread") | | Thread support library | | [<barrier>](header/barrier "cpp/header/barrier") (C++20) | [Barriers](thread/barrier "cpp/thread/barrier") | | [<condition\_variable>](header/condition_variable "cpp/header/condition variable") (C++11) | [Thread waiting conditions](thread "cpp/thread") | | [<future>](header/future "cpp/header/future") (C++11) | [Primitives for asynchronous computations](thread "cpp/thread") | | [<latch>](header/latch "cpp/header/latch") (C++20) | [Latches](thread/latch "cpp/thread/latch") | | [<mutex>](header/mutex "cpp/header/mutex") (C++11) | [Mutual exclusion primitives](thread "cpp/thread") | | [<semaphore>](header/semaphore "cpp/header/semaphore") (C++20) | [Semaphores](thread/counting_semaphore "cpp/thread/counting semaphore") | | [<shared\_mutex>](header/shared_mutex "cpp/header/shared mutex") (C++14) | [Shared mutual exclusion primitives](thread "cpp/thread") | | [<stop\_token>](header/stop_token "cpp/header/stop token") (C++20) | Stop tokens for `[std::jthread](thread/jthread "cpp/thread/jthread")` | | [<thread>](header/thread "cpp/header/thread") (C++11) | `[std::thread](thread/thread "cpp/thread/thread")` class and [supporting functions](thread "cpp/thread") | ### C compatibility headers For some of the C standard library headers of the form `*xxx*.h`, the C++ standard library both includes an identically-named header and another header of the form `c*xxx*` (all meaningful `c*xxx*` headers are listed above). The intended use of headers of form `*xxx*.h` is for interoperability only. It is possible that C++ source files need to include one of these headers in order to be valid ISO C. Source files that are not intended to also be valid ISO C should not use any of the C headers. With the exception of [`complex.h`](header/ccomplex "cpp/header/ccomplex") , each `*xxx*.h` header included in the C++ standard library places in the global namespace each name that the corresponding `c*xxx*` header would have placed in the `std` namespace. These headers are allowed to also declare the same names in the `std` namespace, and the corresponding `c*xxx*` headers are allowed to also declare the same names in the global namespace: including [`<cstdlib>`](header/cstdlib "cpp/header/cstdlib") definitely provides `[std::malloc](http://en.cppreference.com/w/cpp/memory/c/malloc)` and may also provide `::malloc`. Including `<stdlib.h>` definitely provides `::malloc` and may also provide `[std::malloc](http://en.cppreference.com/w/cpp/memory/c/malloc)`. This applies even to functions and function overloads that are not part of C standard library. Notes: `*xxx*.h` headers are deprecated in C++98 and undeprecated in C++23. 
These headers are discouraged for pure C++ code, but not subject to future removal. | | | | --- | --- | | [<assert.h>](header/cassert "cpp/header/cassert") | Behaves same as [`<cassert>`](header/cassert "cpp/header/cassert") | | [<ctype.h>](header/cctype "cpp/header/cctype") | Behaves as if each name from [`<cctype>`](header/cctype "cpp/header/cctype") is placed in global namespace | | [<errno.h>](header/cerrno "cpp/header/cerrno") | Behaves same as [`<cerrno>`](header/cerrno "cpp/header/cerrno") | | [<fenv.h>](header/cfenv "cpp/header/cfenv") (C++11) | Behaves as if each name from [`<cfenv>`](header/cfenv "cpp/header/cfenv") is placed in global namespace | | [<float.h>](header/cfloat "cpp/header/cfloat") | Behaves same as [`<cfloat>`](header/cfloat "cpp/header/cfloat") | | [<inttypes.h>](header/cinttypes "cpp/header/cinttypes") (C++11) | Behaves as if each name from [`<cinttypes>`](header/cinttypes "cpp/header/cinttypes") is placed in global namespace | | [<limits.h>](header/climits "cpp/header/climits") | Behaves same as [`<climits>`](header/climits "cpp/header/climits") | | [<locale.h>](header/clocale "cpp/header/clocale") | Behaves as if each name from [`<clocale>`](header/clocale "cpp/header/clocale") is placed in global namespace | | [<math.h>](header/cmath "cpp/header/cmath") | Behaves as if each name from [`<cmath>`](header/cmath "cpp/header/cmath") is placed in global namespace, except for names of [mathematical special functions](numeric/special_functions "cpp/numeric/special functions") | | [<setjmp.h>](header/csetjmp "cpp/header/csetjmp") | Behaves as if each name from [`<csetjmp>`](header/csetjmp "cpp/header/csetjmp") is placed in global namespace | | [<signal.h>](header/csignal "cpp/header/csignal") | Behaves as if each name from [`<csignal>`](header/csignal "cpp/header/csignal") is placed in global namespace | | [<stdarg.h>](header/cstdarg "cpp/header/cstdarg") | Behaves as if each name from [`<cstdarg>`](header/cstdarg "cpp/header/cstdarg") is placed in global namespace | | [<stddef.h>](header/cstddef "cpp/header/cstddef") | Behaves as if each name from [`<cstddef>`](header/cstddef "cpp/header/cstddef") is placed in global namespace, except for names of [`std::byte` and related functions](types/byte "cpp/types/byte") | | [<stdint.h>](header/cstdint "cpp/header/cstdint") (C++11) | Behaves as if each name from [`<cstdint>`](header/cstdint "cpp/header/cstdint") is placed in global namespace | | [<stdio.h>](header/cstdio "cpp/header/cstdio") | Behaves as if each name from [`<cstdio>`](header/cstdio "cpp/header/cstdio") is placed in global namespace | | [<stdlib.h>](header/cstdlib "cpp/header/cstdlib") | Behaves as if each name from [`<cstdlib>`](header/cstdlib "cpp/header/cstdlib") is placed in global namespace | | [<string.h>](header/cstring "cpp/header/cstring") | Behaves as if each name from [`<cstring>`](header/cstring "cpp/header/cstring") is placed in global namespace | | [<time.h>](header/ctime "cpp/header/ctime") | Behaves as if each name from [`<ctime>`](header/ctime "cpp/header/ctime") is placed in global namespace | | [<uchar.h>](header/cuchar "cpp/header/cuchar") (C++11) | Behaves as if each name from [`<cuchar>`](header/cuchar "cpp/header/cuchar") is placed in global namespace | | [<wchar.h>](header/cwchar "cpp/header/cwchar") | Behaves as if each name from [`<cwchar>`](header/cwchar "cpp/header/cwchar") is placed in global namespace | | [<wctype.h>](header/cwctype "cpp/header/cwctype") | Behaves as if each name from [`<cwctype>`](header/cwctype "cpp/header/cwctype") 
is placed in global namespace | #### Special C compatibility headers The header `<stdatomic.h>` declares names which are also provided in the C standard library, and defines the `_Atomic` macro which is a [keyword](https://en.cppreference.com/w/c/keyword/_Atomic "c/keyword/ Atomic") in C. Unlike other `*xxx*.h` headers, a corresponding `<cstdatomic>` header is not provided. | | | | --- | --- | | [<stdatomic.h>](header/stdatomic.h "cpp/header/stdatomic.h") (C++23) | Defines `_Atomic` and provides corresponding components in the C standard library | #### Empty C headers The headers `<complex.h>`, [`<ccomplex>`](header/ccomplex "cpp/header/ccomplex"), `<tgmath.h>`, and [`<ctgmath>`](header/ctgmath "cpp/header/ctgmath") do not contain any content from the C standard library and instead merely include other headers from the C++ standard library. | | | | --- | --- | | [<ccomplex>](header/ccomplex "cpp/header/ccomplex") (C++11)(deprecated in C++17)(removed in C++20) | Simply includes the header [`<complex>`](header/complex "cpp/header/complex") | | [<complex.h>](header/ccomplex "cpp/header/ccomplex") (C++11) | Simply includes the header [`<complex>`](header/complex "cpp/header/complex") | | [<ctgmath>](header/ctgmath "cpp/header/ctgmath") (C++11)(deprecated in C++17)(removed in C++20) | Simply includes the headers [`<complex>`](header/complex "cpp/header/complex") and [`<cmath>`](header/cmath "cpp/header/cmath"): the overloads equivalent to the contents of the C header `tgmath.h` are already provided by those headers | | [<tgmath.h>](header/ctgmath "cpp/header/ctgmath") (C++11) | Simply includes the headers [`<complex>`](header/complex "cpp/header/complex") and [`<cmath>`](header/cmath "cpp/header/cmath") | #### Meaningless C headers The headers [`<ciso646>`](header/ciso646 "cpp/header/ciso646"), [`<cstdalign>`](header/cstdalign "cpp/header/cstdalign"), and [`<cstdbool>`](header/cstdbool "cpp/header/cstdbool") are meaningless in C++ because the macros they provide in C are language keywords in C++. | | | | --- | --- | | [<ciso646>](header/ciso646 "cpp/header/ciso646") (removed in C++20) | Empty header. [The macros that appear in `iso646.h` in C](https://en.cppreference.com/w/c/language/operator_alternative "c/language/operator alternative") are [keywords in C++](language/operator_alternative "cpp/language/operator alternative") | | [<cstdalign>](header/cstdalign "cpp/header/cstdalign") (C++11)(deprecated in C++17)(removed in C++20) | Defines one [compatibility macro constant](types "cpp/types") | | [<cstdbool>](header/cstdbool "cpp/header/cstdbool") (C++11)(deprecated in C++17)(removed in C++20) | Defines one [compatibility macro constant](types "cpp/types") | | [<iso646.h>](header/ciso646 "cpp/header/ciso646") | Has no effect | | [<stdalign.h>](header/cstdalign "cpp/header/cstdalign") (C++11) | Defines one [compatibility macro constant](types "cpp/types") | | [<stdbool.h>](header/cstdbool "cpp/header/cstdbool") (C++11) | Defines one [compatibility macro constant](types "cpp/types") | #### Unsupported C headers The C headers `<stdatomic.h>` (until C++23), `<stdnoreturn.h>`, and `<threads.h>` are not included in C++ and have no `c*xxx*` equivalents. ### [Experimental libraries](https://en.cppreference.com/w/cpp/header/experimental "cpp/header/experimental") [C++ TR's/TS's](https://en.cppreference.com/w/cpp/experimental "cpp/experimental") also define several collections of headers. 
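As a rough illustration of the C compatibility headers described above: the `<c*xxx*>` headers guarantee the declarations in namespace `std`, while the `<*xxx*.h>` headers guarantee them in the global namespace; whether each additionally provides the other spelling is unspecified. The function name `length` below is purely illustrative.

```cpp
// Preferred in new C++ code: include <cmath> and qualify with std::.
#include <cmath>

double length(double x, double y)
{
    return std::sqrt(x * x + y * y); // std::sqrt is guaranteed by <cmath>
}

// With the discouraged compatibility header, the global name is what is
// guaranteed (whether std::sqrt is also visible is unspecified):
//
//   #include <math.h>
//   double length(double x, double y) { return sqrt(x * x + y * y); }
```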
### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/header "c/header") for C Standard Library headers |
cpp C++14 C++14 ===== C++14 is a minor version after the major version C++11, featuring mainly minor improvements and defect fixes. Its approval was announced on August 18, 2014. It was released on December 15, 2014. Before its approval, the name C++1y was sometimes used for it, indicating its expected release in the 2010s. New language features ---------------------- * [variable templates](language/variable_template "cpp/language/variable template") * [generic lambdas](language/lambda "cpp/language/lambda") * lambda init-capture * new/delete elision * [relaxed restrictions on constexpr functions](language/constexpr "cpp/language/constexpr") * [binary literals](language/integer_literal "cpp/language/integer literal") * [digit separators](language/integer_literal#Single_quote "cpp/language/integer literal") * [return type deduction for functions](language/function#Return_type_deduction_.28since_C.2B.2B14.29 "cpp/language/function") * [aggregate classes](language/aggregate_initialization "cpp/language/aggregate initialization") with default non-static member initializers. New library features --------------------- * `[std::make\_unique](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique")` * `[std::shared\_timed\_mutex](thread/shared_timed_mutex "cpp/thread/shared timed mutex")` and `[std::shared\_lock](thread/shared_lock "cpp/thread/shared lock")` * `[std::integer\_sequence](utility/integer_sequence "cpp/utility/integer sequence")` * `[std::exchange](utility/exchange "cpp/utility/exchange")` * `[std::quoted](io/manip/quoted "cpp/io/manip/quoted")` * and many small improvements to existing library facilities, such as + two-range overloads for some algorithms + type alias versions of type traits + user-defined literals for [`basic_string`](string/basic_string "cpp/string/basic string"), [`duration`](chrono/duration "cpp/chrono/duration") and [`complex`](numeric/complex "cpp/numeric/complex") + etc. A brief example combining several of these features is sketched below. 
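The following minimal sketch exercises a few of the features listed above: binary literals with a digit separator, return type deduction for a normal function, a generic lambda with an init-capture, `std::make_unique`, and a standard chrono user-defined literal. Identifiers such as `scaled` and `report` are illustrative only.

```cpp
#include <chrono>
#include <iostream>
#include <memory>

// Return type deduction for a normal function (C++14).
auto scaled(int x) { return x * 2; }

int main()
{
    using namespace std::chrono_literals;            // standard UDLs for durations (C++14)

    auto mask   = 0b1010'0001;                       // binary literal + digit separator
    auto budget = 150ms;                             // std::chrono::milliseconds{150}

    auto owner = std::make_unique<int>(scaled(21));  // std::make_unique (C++14)

    // Generic lambda whose init-capture moves the unique_ptr into the closure.
    auto report = [p = std::move(owner)](auto label) {
        std::cout << label << ": " << *p << '\n';
    };

    report("value");
    std::cout << "mask = " << mask << ", budget = " << budget.count() << "ms\n";
}
```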
Defect reports --------------- | Defect Reports fixed in C++14 (276 core, 158 library) | | --- | | * [CWG#129](https://wg21.cmeerw.net/cwg/issue129) * [CWG#223](https://wg21.cmeerw.net/cwg/issue223) * [CWG#240](https://wg21.cmeerw.net/cwg/issue240) * [CWG#292](https://wg21.cmeerw.net/cwg/issue292) * [CWG#312](https://wg21.cmeerw.net/cwg/issue312) * [CWG#332](https://wg21.cmeerw.net/cwg/issue332) * [CWG#342](https://wg21.cmeerw.net/cwg/issue342) * [CWG#344](https://wg21.cmeerw.net/cwg/issue344) * [CWG#388](https://wg21.cmeerw.net/cwg/issue388) * [CWG#462](https://wg21.cmeerw.net/cwg/issue462) * [CWG#482](https://wg21.cmeerw.net/cwg/issue482) * [CWG#483](https://wg21.cmeerw.net/cwg/issue483) * [CWG#496](https://wg21.cmeerw.net/cwg/issue496) * [CWG#535](https://wg21.cmeerw.net/cwg/issue535) * [CWG#539](https://wg21.cmeerw.net/cwg/issue539) * [CWG#565](https://wg21.cmeerw.net/cwg/issue565) * [CWG#577](https://wg21.cmeerw.net/cwg/issue577) * [CWG#583](https://wg21.cmeerw.net/cwg/issue583) * [CWG#597](https://wg21.cmeerw.net/cwg/issue597) * [CWG#616](https://wg21.cmeerw.net/cwg/issue616) * [CWG#623](https://wg21.cmeerw.net/cwg/issue623) * [CWG#631](https://wg21.cmeerw.net/cwg/issue631) * [CWG#675](https://wg21.cmeerw.net/cwg/issue675) * [CWG#712](https://wg21.cmeerw.net/cwg/issue712) * [CWG#729](https://wg21.cmeerw.net/cwg/issue729) * [CWG#739](https://wg21.cmeerw.net/cwg/issue739) * [CWG#755](https://wg21.cmeerw.net/cwg/issue755) * [CWG#903](https://wg21.cmeerw.net/cwg/issue903) * [CWG#912](https://wg21.cmeerw.net/cwg/issue912) * [CWG#974](https://wg21.cmeerw.net/cwg/issue974) * [CWG#975](https://wg21.cmeerw.net/cwg/issue975) * [CWG#977](https://wg21.cmeerw.net/cwg/issue977) * [CWG#1003](https://wg21.cmeerw.net/cwg/issue1003) * [CWG#1013](https://wg21.cmeerw.net/cwg/issue1013) * [CWG#1024](https://wg21.cmeerw.net/cwg/issue1024) * [CWG#1048](https://wg21.cmeerw.net/cwg/issue1048) * [CWG#1059](https://wg21.cmeerw.net/cwg/issue1059) * [CWG#1093](https://wg21.cmeerw.net/cwg/issue1093) * [CWG#1213](https://wg21.cmeerw.net/cwg/issue1213) * [CWG#1226](https://wg21.cmeerw.net/cwg/issue1226) * [CWG#1227](https://wg21.cmeerw.net/cwg/issue1227) * [CWG#1250](https://wg21.cmeerw.net/cwg/issue1250) * [CWG#1251](https://wg21.cmeerw.net/cwg/issue1251) * [CWG#1260](https://wg21.cmeerw.net/cwg/issue1260) * [CWG#1261](https://wg21.cmeerw.net/cwg/issue1261) * [CWG#1262](https://wg21.cmeerw.net/cwg/issue1262) * [CWG#1264](https://wg21.cmeerw.net/cwg/issue1264) * [CWG#1265](https://wg21.cmeerw.net/cwg/issue1265) * [CWG#1267](https://wg21.cmeerw.net/cwg/issue1267) * [CWG#1268](https://wg21.cmeerw.net/cwg/issue1268) * [CWG#1269](https://wg21.cmeerw.net/cwg/issue1269) * [CWG#1270](https://wg21.cmeerw.net/cwg/issue1270) * [CWG#1275](https://wg21.cmeerw.net/cwg/issue1275) * [CWG#1282](https://wg21.cmeerw.net/cwg/issue1282) * [CWG#1287](https://wg21.cmeerw.net/cwg/issue1287) * [CWG#1288](https://wg21.cmeerw.net/cwg/issue1288) * [CWG#1290](https://wg21.cmeerw.net/cwg/issue1290) * [CWG#1293](https://wg21.cmeerw.net/cwg/issue1293) * [CWG#1295](https://wg21.cmeerw.net/cwg/issue1295) * [CWG#1296](https://wg21.cmeerw.net/cwg/issue1296) * [CWG#1297](https://wg21.cmeerw.net/cwg/issue1297) * [CWG#1298](https://wg21.cmeerw.net/cwg/issue1298) * [CWG#1301](https://wg21.cmeerw.net/cwg/issue1301) * [CWG#1302](https://wg21.cmeerw.net/cwg/issue1302) * [CWG#1305](https://wg21.cmeerw.net/cwg/issue1305) * [CWG#1306](https://wg21.cmeerw.net/cwg/issue1306) * [CWG#1307](https://wg21.cmeerw.net/cwg/issue1307) * 
[CWG#1308](https://wg21.cmeerw.net/cwg/issue1308) * [CWG#1310](https://wg21.cmeerw.net/cwg/issue1310) * [CWG#1311](https://wg21.cmeerw.net/cwg/issue1311) * [CWG#1312](https://wg21.cmeerw.net/cwg/issue1312) * [CWG#1313](https://wg21.cmeerw.net/cwg/issue1313) * [CWG#1318](https://wg21.cmeerw.net/cwg/issue1318) * [CWG#1320](https://wg21.cmeerw.net/cwg/issue1320) * [CWG#1321](https://wg21.cmeerw.net/cwg/issue1321) * [CWG#1324](https://wg21.cmeerw.net/cwg/issue1324) * [CWG#1327](https://wg21.cmeerw.net/cwg/issue1327) * [CWG#1328](https://wg21.cmeerw.net/cwg/issue1328) * [CWG#1329](https://wg21.cmeerw.net/cwg/issue1329) * [CWG#1330](https://wg21.cmeerw.net/cwg/issue1330) * [CWG#1333](https://wg21.cmeerw.net/cwg/issue1333) * [CWG#1336](https://wg21.cmeerw.net/cwg/issue1336) * [CWG#1340](https://wg21.cmeerw.net/cwg/issue1340) * [CWG#1344](https://wg21.cmeerw.net/cwg/issue1344) * [CWG#1345](https://wg21.cmeerw.net/cwg/issue1345) * [CWG#1346](https://wg21.cmeerw.net/cwg/issue1346) * [CWG#1347](https://wg21.cmeerw.net/cwg/issue1347) * [CWG#1350](https://wg21.cmeerw.net/cwg/issue1350) * [CWG#1352](https://wg21.cmeerw.net/cwg/issue1352) * [CWG#1354](https://wg21.cmeerw.net/cwg/issue1354) * [CWG#1355](https://wg21.cmeerw.net/cwg/issue1355) * [CWG#1357](https://wg21.cmeerw.net/cwg/issue1357) * [CWG#1358](https://wg21.cmeerw.net/cwg/issue1358) * [CWG#1359](https://wg21.cmeerw.net/cwg/issue1359) * [CWG#1361](https://wg21.cmeerw.net/cwg/issue1361) * [CWG#1362](https://wg21.cmeerw.net/cwg/issue1362) * [CWG#1363](https://wg21.cmeerw.net/cwg/issue1363) * [CWG#1364](https://wg21.cmeerw.net/cwg/issue1364) * [CWG#1365](https://wg21.cmeerw.net/cwg/issue1365) * [CWG#1366](https://wg21.cmeerw.net/cwg/issue1366) * [CWG#1367](https://wg21.cmeerw.net/cwg/issue1367) * [CWG#1368](https://wg21.cmeerw.net/cwg/issue1368) * [CWG#1369](https://wg21.cmeerw.net/cwg/issue1369) * [CWG#1370](https://wg21.cmeerw.net/cwg/issue1370) * [CWG#1372](https://wg21.cmeerw.net/cwg/issue1372) * [CWG#1374](https://wg21.cmeerw.net/cwg/issue1374) * [CWG#1375](https://wg21.cmeerw.net/cwg/issue1375) * [CWG#1376](https://wg21.cmeerw.net/cwg/issue1376) * [CWG#1380](https://wg21.cmeerw.net/cwg/issue1380) * [CWG#1381](https://wg21.cmeerw.net/cwg/issue1381) * [CWG#1382](https://wg21.cmeerw.net/cwg/issue1382) * [CWG#1383](https://wg21.cmeerw.net/cwg/issue1383) * [CWG#1385](https://wg21.cmeerw.net/cwg/issue1385) * [CWG#1387](https://wg21.cmeerw.net/cwg/issue1387) * [CWG#1388](https://wg21.cmeerw.net/cwg/issue1388) * [CWG#1392](https://wg21.cmeerw.net/cwg/issue1392) * [CWG#1394](https://wg21.cmeerw.net/cwg/issue1394) * [CWG#1398](https://wg21.cmeerw.net/cwg/issue1398) * [CWG#1399](https://wg21.cmeerw.net/cwg/issue1399) * [CWG#1401](https://wg21.cmeerw.net/cwg/issue1401) * [CWG#1402](https://wg21.cmeerw.net/cwg/issue1402) * [CWG#1405](https://wg21.cmeerw.net/cwg/issue1405) * [CWG#1406](https://wg21.cmeerw.net/cwg/issue1406) * [CWG#1408](https://wg21.cmeerw.net/cwg/issue1408) * [CWG#1409](https://wg21.cmeerw.net/cwg/issue1409) * [CWG#1410](https://wg21.cmeerw.net/cwg/issue1410) * [CWG#1411](https://wg21.cmeerw.net/cwg/issue1411) * [CWG#1412](https://wg21.cmeerw.net/cwg/issue1412) * [CWG#1413](https://wg21.cmeerw.net/cwg/issue1413) * [CWG#1415](https://wg21.cmeerw.net/cwg/issue1415) * [CWG#1416](https://wg21.cmeerw.net/cwg/issue1416) * [CWG#1417](https://wg21.cmeerw.net/cwg/issue1417) * [CWG#1418](https://wg21.cmeerw.net/cwg/issue1418) * [CWG#1423](https://wg21.cmeerw.net/cwg/issue1423) * [CWG#1424](https://wg21.cmeerw.net/cwg/issue1424) * 
[CWG#1425](https://wg21.cmeerw.net/cwg/issue1425) * [CWG#1428](https://wg21.cmeerw.net/cwg/issue1428) * [CWG#1431](https://wg21.cmeerw.net/cwg/issue1431) * [CWG#1435](https://wg21.cmeerw.net/cwg/issue1435) * [CWG#1437](https://wg21.cmeerw.net/cwg/issue1437) * [CWG#1438](https://wg21.cmeerw.net/cwg/issue1438) * [CWG#1439](https://wg21.cmeerw.net/cwg/issue1439) * [CWG#1440](https://wg21.cmeerw.net/cwg/issue1440) * [CWG#1441](https://wg21.cmeerw.net/cwg/issue1441) * [CWG#1442](https://wg21.cmeerw.net/cwg/issue1442) * [CWG#1447](https://wg21.cmeerw.net/cwg/issue1447) * [CWG#1449](https://wg21.cmeerw.net/cwg/issue1449) * [CWG#1450](https://wg21.cmeerw.net/cwg/issue1450) * [CWG#1453](https://wg21.cmeerw.net/cwg/issue1453) * [CWG#1454](https://wg21.cmeerw.net/cwg/issue1454) * [CWG#1455](https://wg21.cmeerw.net/cwg/issue1455) * [CWG#1456](https://wg21.cmeerw.net/cwg/issue1456) * [CWG#1457](https://wg21.cmeerw.net/cwg/issue1457) * [CWG#1458](https://wg21.cmeerw.net/cwg/issue1458) * [CWG#1460](https://wg21.cmeerw.net/cwg/issue1460) * [CWG#1462](https://wg21.cmeerw.net/cwg/issue1462) * [CWG#1464](https://wg21.cmeerw.net/cwg/issue1464) * [CWG#1466](https://wg21.cmeerw.net/cwg/issue1466) * [CWG#1471](https://wg21.cmeerw.net/cwg/issue1471) * [CWG#1472](https://wg21.cmeerw.net/cwg/issue1472) * [CWG#1473](https://wg21.cmeerw.net/cwg/issue1473) * [CWG#1475](https://wg21.cmeerw.net/cwg/issue1475) * [CWG#1476](https://wg21.cmeerw.net/cwg/issue1476) * [CWG#1477](https://wg21.cmeerw.net/cwg/issue1477) * [CWG#1479](https://wg21.cmeerw.net/cwg/issue1479) * [CWG#1480](https://wg21.cmeerw.net/cwg/issue1480) * [CWG#1481](https://wg21.cmeerw.net/cwg/issue1481) * [CWG#1482](https://wg21.cmeerw.net/cwg/issue1482) * [CWG#1487](https://wg21.cmeerw.net/cwg/issue1487) * [CWG#1489](https://wg21.cmeerw.net/cwg/issue1489) * [CWG#1491](https://wg21.cmeerw.net/cwg/issue1491) * [CWG#1493](https://wg21.cmeerw.net/cwg/issue1493) * [CWG#1494](https://wg21.cmeerw.net/cwg/issue1494) * [CWG#1495](https://wg21.cmeerw.net/cwg/issue1495) * [CWG#1502](https://wg21.cmeerw.net/cwg/issue1502) * [CWG#1503](https://wg21.cmeerw.net/cwg/issue1503) * [CWG#1504](https://wg21.cmeerw.net/cwg/issue1504) * [CWG#1506](https://wg21.cmeerw.net/cwg/issue1506) * [CWG#1507](https://wg21.cmeerw.net/cwg/issue1507) * [CWG#1508](https://wg21.cmeerw.net/cwg/issue1508) * [CWG#1509](https://wg21.cmeerw.net/cwg/issue1509) * [CWG#1510](https://wg21.cmeerw.net/cwg/issue1510) * [CWG#1511](https://wg21.cmeerw.net/cwg/issue1511) * [CWG#1512](https://wg21.cmeerw.net/cwg/issue1512) * [CWG#1514](https://wg21.cmeerw.net/cwg/issue1514) * [CWG#1515](https://wg21.cmeerw.net/cwg/issue1515) * [CWG#1516](https://wg21.cmeerw.net/cwg/issue1516) * [CWG#1522](https://wg21.cmeerw.net/cwg/issue1522) * [CWG#1527](https://wg21.cmeerw.net/cwg/issue1527) * [CWG#1528](https://wg21.cmeerw.net/cwg/issue1528) * [CWG#1531](https://wg21.cmeerw.net/cwg/issue1531) * [CWG#1532](https://wg21.cmeerw.net/cwg/issue1532) * [CWG#1533](https://wg21.cmeerw.net/cwg/issue1533) * [CWG#1535](https://wg21.cmeerw.net/cwg/issue1535) * [CWG#1537](https://wg21.cmeerw.net/cwg/issue1537) * [CWG#1538](https://wg21.cmeerw.net/cwg/issue1538) * [CWG#1539](https://wg21.cmeerw.net/cwg/issue1539) * [CWG#1541](https://wg21.cmeerw.net/cwg/issue1541) * [CWG#1543](https://wg21.cmeerw.net/cwg/issue1543) * [CWG#1544](https://wg21.cmeerw.net/cwg/issue1544) * [CWG#1550](https://wg21.cmeerw.net/cwg/issue1550) * [CWG#1551](https://wg21.cmeerw.net/cwg/issue1551) * [CWG#1553](https://wg21.cmeerw.net/cwg/issue1553) * 
[CWG#1556](https://wg21.cmeerw.net/cwg/issue1556) * [CWG#1557](https://wg21.cmeerw.net/cwg/issue1557) * [CWG#1559](https://wg21.cmeerw.net/cwg/issue1559) * [CWG#1560](https://wg21.cmeerw.net/cwg/issue1560) * [CWG#1562](https://wg21.cmeerw.net/cwg/issue1562) * [CWG#1563](https://wg21.cmeerw.net/cwg/issue1563) * [CWG#1567](https://wg21.cmeerw.net/cwg/issue1567) * [CWG#1569](https://wg21.cmeerw.net/cwg/issue1569) * [CWG#1570](https://wg21.cmeerw.net/cwg/issue1570) * [CWG#1575](https://wg21.cmeerw.net/cwg/issue1575) * [CWG#1576](https://wg21.cmeerw.net/cwg/issue1576) * [CWG#1579](https://wg21.cmeerw.net/cwg/issue1579) * [CWG#1583](https://wg21.cmeerw.net/cwg/issue1583) * [CWG#1587](https://wg21.cmeerw.net/cwg/issue1587) * [CWG#1588](https://wg21.cmeerw.net/cwg/issue1588) * [CWG#1592](https://wg21.cmeerw.net/cwg/issue1592) * [CWG#1593](https://wg21.cmeerw.net/cwg/issue1593) * [CWG#1595](https://wg21.cmeerw.net/cwg/issue1595) * [CWG#1597](https://wg21.cmeerw.net/cwg/issue1597) * [CWG#1598](https://wg21.cmeerw.net/cwg/issue1598) * [CWG#1601](https://wg21.cmeerw.net/cwg/issue1601) * [CWG#1604](https://wg21.cmeerw.net/cwg/issue1604) * [CWG#1605](https://wg21.cmeerw.net/cwg/issue1605) * [CWG#1607](https://wg21.cmeerw.net/cwg/issue1607) * [CWG#1608](https://wg21.cmeerw.net/cwg/issue1608) * [CWG#1611](https://wg21.cmeerw.net/cwg/issue1611) * [CWG#1612](https://wg21.cmeerw.net/cwg/issue1612) * [CWG#1613](https://wg21.cmeerw.net/cwg/issue1613) * [CWG#1618](https://wg21.cmeerw.net/cwg/issue1618) * [CWG#1629](https://wg21.cmeerw.net/cwg/issue1629) * [CWG#1648](https://wg21.cmeerw.net/cwg/issue1648) * [CWG#1649](https://wg21.cmeerw.net/cwg/issue1649) * [CWG#1658](https://wg21.cmeerw.net/cwg/issue1658) * [CWG#1660](https://wg21.cmeerw.net/cwg/issue1660) * [CWG#1662](https://wg21.cmeerw.net/cwg/issue1662) * [CWG#1664](https://wg21.cmeerw.net/cwg/issue1664) * [CWG#1666](https://wg21.cmeerw.net/cwg/issue1666) * [CWG#1669](https://wg21.cmeerw.net/cwg/issue1669) * [CWG#1673](https://wg21.cmeerw.net/cwg/issue1673) * [CWG#1674](https://wg21.cmeerw.net/cwg/issue1674) * [CWG#1681](https://wg21.cmeerw.net/cwg/issue1681) * [CWG#1684](https://wg21.cmeerw.net/cwg/issue1684) * [CWG#1687](https://wg21.cmeerw.net/cwg/issue1687) * [CWG#1689](https://wg21.cmeerw.net/cwg/issue1689) * [CWG#1690](https://wg21.cmeerw.net/cwg/issue1690) * [CWG#1691](https://wg21.cmeerw.net/cwg/issue1691) * [CWG#1692](https://wg21.cmeerw.net/cwg/issue1692) * [CWG#1693](https://wg21.cmeerw.net/cwg/issue1693) * [CWG#1707](https://wg21.cmeerw.net/cwg/issue1707) * [CWG#1716](https://wg21.cmeerw.net/cwg/issue1716) * [CWG#1717](https://wg21.cmeerw.net/cwg/issue1717) * [CWG#1732](https://wg21.cmeerw.net/cwg/issue1732) * [CWG#1737](https://wg21.cmeerw.net/cwg/issue1737) * [CWG#1738](https://wg21.cmeerw.net/cwg/issue1738) * [CWG#1739](https://wg21.cmeerw.net/cwg/issue1739) * [CWG#1740](https://wg21.cmeerw.net/cwg/issue1740) * [CWG#1741](https://wg21.cmeerw.net/cwg/issue1741) * [CWG#1746](https://wg21.cmeerw.net/cwg/issue1746) * [CWG#1747](https://wg21.cmeerw.net/cwg/issue1747) * [CWG#1759](https://wg21.cmeerw.net/cwg/issue1759) * [CWG#1760](https://wg21.cmeerw.net/cwg/issue1760) * [CWG#1762](https://wg21.cmeerw.net/cwg/issue1762) * [CWG#1764](https://wg21.cmeerw.net/cwg/issue1764) * [CWG#1765](https://wg21.cmeerw.net/cwg/issue1765) * [CWG#1767](https://wg21.cmeerw.net/cwg/issue1767) * [CWG#1769](https://wg21.cmeerw.net/cwg/issue1769) * [CWG#1770](https://wg21.cmeerw.net/cwg/issue1770) * [CWG#1772](https://wg21.cmeerw.net/cwg/issue1772) * 
[CWG#1773](https://wg21.cmeerw.net/cwg/issue1773) * [CWG#1775](https://wg21.cmeerw.net/cwg/issue1775) * [CWG#1778](https://wg21.cmeerw.net/cwg/issue1778) * [CWG#1786](https://wg21.cmeerw.net/cwg/issue1786) * [CWG#1787](https://wg21.cmeerw.net/cwg/issue1787) * [LWG#1214](http://wg21.link/lwg1214) * [LWG#1450](http://wg21.link/lwg1450) * [LWG#2003](http://wg21.link/lwg2003) * [LWG#2005](http://wg21.link/lwg2005) * [LWG#2009](http://wg21.link/lwg2009) * [LWG#2010](http://wg21.link/lwg2010) * [LWG#2011](http://wg21.link/lwg2011) * [LWG#2012](http://wg21.link/lwg2012) * [LWG#2013](http://wg21.link/lwg2013) * [LWG#2015](http://wg21.link/lwg2015) * [LWG#2018](http://wg21.link/lwg2018) * [LWG#2021](http://wg21.link/lwg2021) * [LWG#2028](http://wg21.link/lwg2028) * [LWG#2033](http://wg21.link/lwg2033) * [LWG#2039](http://wg21.link/lwg2039) * [LWG#2044](http://wg21.link/lwg2044) * [LWG#2045](http://wg21.link/lwg2045) * [LWG#2047](http://wg21.link/lwg2047) * [LWG#2048](http://wg21.link/lwg2048) * [LWG#2049](http://wg21.link/lwg2049) * [LWG#2050](http://wg21.link/lwg2050) * [LWG#2052](http://wg21.link/lwg2052) * [LWG#2054](http://wg21.link/lwg2054) * [LWG#2053](http://wg21.link/lwg2053) * [LWG#2056](http://wg21.link/lwg2056) * [LWG#2057](http://wg21.link/lwg2057) * [LWG#2058](http://wg21.link/lwg2058) * [LWG#2061](http://wg21.link/lwg2061) * [LWG#2064](http://wg21.link/lwg2064) * [LWG#2065](http://wg21.link/lwg2065) * [LWG#2066](http://wg21.link/lwg2066) * [LWG#2067](http://wg21.link/lwg2067) * [LWG#2069](http://wg21.link/lwg2069) * [LWG#2071](http://wg21.link/lwg2071) * [LWG#2074](http://wg21.link/lwg2074) * [LWG#2075](http://wg21.link/lwg2075) * [LWG#2078](http://wg21.link/lwg2078) * [LWG#2080](http://wg21.link/lwg2080) * [LWG#2081](http://wg21.link/lwg2081) * [LWG#2083](http://wg21.link/lwg2083) * [LWG#2085](http://wg21.link/lwg2085) * [LWG#2086](http://wg21.link/lwg2086) * [LWG#2087](http://wg21.link/lwg2087) * [LWG#2091](http://wg21.link/lwg2091) * [LWG#2092](http://wg21.link/lwg2092) * [LWG#2093](http://wg21.link/lwg2093) * [LWG#2094](http://wg21.link/lwg2094) * [LWG#2096](http://wg21.link/lwg2096) * [LWG#2097](http://wg21.link/lwg2097) * [LWG#2098](http://wg21.link/lwg2098) * [LWG#2099](http://wg21.link/lwg2099) * [LWG#2100](http://wg21.link/lwg2100) * [LWG#2102](http://wg21.link/lwg2102) * [LWG#2103](http://wg21.link/lwg2103) * [LWG#2104](http://wg21.link/lwg2104) * [LWG#2105](http://wg21.link/lwg2105) * [LWG#2109](http://wg21.link/lwg2109) * [LWG#2110](http://wg21.link/lwg2110) * [LWG#2112](http://wg21.link/lwg2112) * [LWG#2120](http://wg21.link/lwg2120) * [LWG#2122](http://wg21.link/lwg2122) * [LWG#2123](http://wg21.link/lwg2123) * [LWG#2128](http://wg21.link/lwg2128) * [LWG#2130](http://wg21.link/lwg2130) * [LWG#2132](http://wg21.link/lwg2132) * [LWG#2135](http://wg21.link/lwg2135) * [LWG#2138](http://wg21.link/lwg2138) * [LWG#2140](http://wg21.link/lwg2140) * [LWG#2141](http://wg21.link/lwg2141) * [LWG#2142](http://wg21.link/lwg2142) * [LWG#2143](http://wg21.link/lwg2143) * [LWG#2144](http://wg21.link/lwg2144) * [LWG#2145](http://wg21.link/lwg2145) * [LWG#2147](http://wg21.link/lwg2147) * [LWG#2148](http://wg21.link/lwg2148) * [LWG#2149](http://wg21.link/lwg2149) * [LWG#2150](http://wg21.link/lwg2150) * [LWG#2159](http://wg21.link/lwg2159) * [LWG#2162](http://wg21.link/lwg2162) * [LWG#2163](http://wg21.link/lwg2163) * [LWG#2165](http://wg21.link/lwg2165) * [LWG#2169](http://wg21.link/lwg2169) * [LWG#2172](http://wg21.link/lwg2172) * [LWG#2174](http://wg21.link/lwg2174) * 
[LWG#2175](http://wg21.link/lwg2175) * [LWG#2176](http://wg21.link/lwg2176) * [LWG#2177](http://wg21.link/lwg2177) * [LWG#2180](http://wg21.link/lwg2180) * [LWG#2182](http://wg21.link/lwg2182) * [LWG#2185](http://wg21.link/lwg2185) * [LWG#2186](http://wg21.link/lwg2186) * [LWG#2187](http://wg21.link/lwg2187) * [LWG#2188](http://wg21.link/lwg2188) * [LWG#2190](http://wg21.link/lwg2190) * [LWG#2193](http://wg21.link/lwg2193) * [LWG#2194](http://wg21.link/lwg2194) * [LWG#2196](http://wg21.link/lwg2196) * [LWG#2197](http://wg21.link/lwg2197) * [LWG#2200](http://wg21.link/lwg2200) * [LWG#2203](http://wg21.link/lwg2203) * [LWG#2205](http://wg21.link/lwg2205) * [LWG#2207](http://wg21.link/lwg2207) * [LWG#2209](http://wg21.link/lwg2209) * [LWG#2210](http://wg21.link/lwg2210) * [LWG#2211](http://wg21.link/lwg2211) * [LWG#2213](http://wg21.link/lwg2213) * [LWG#2222](http://wg21.link/lwg2222) * [LWG#2225](http://wg21.link/lwg2225) * [LWG#2229](http://wg21.link/lwg2229) * [LWG#2231](http://wg21.link/lwg2231) * [LWG#2235](http://wg21.link/lwg2235) * [LWG#2240](http://wg21.link/lwg2240) * [LWG#2246](http://wg21.link/lwg2246) * [LWG#2247](http://wg21.link/lwg2247) * [LWG#2249](http://wg21.link/lwg2249) * [LWG#2252](http://wg21.link/lwg2252) * [LWG#2257](http://wg21.link/lwg2257) * [LWG#2258](http://wg21.link/lwg2258) * [LWG#2263](http://wg21.link/lwg2263) * [LWG#2268](http://wg21.link/lwg2268) * [LWG#2271](http://wg21.link/lwg2271) * [LWG#2272](http://wg21.link/lwg2272) * [LWG#2275](http://wg21.link/lwg2275) * [LWG#2278](http://wg21.link/lwg2278) * [LWG#2280](http://wg21.link/lwg2280) * [LWG#2284](http://wg21.link/lwg2284) * [LWG#2285](http://wg21.link/lwg2285) * [LWG#2288](http://wg21.link/lwg2288) * [LWG#2291](http://wg21.link/lwg2291) * [LWG#2293](http://wg21.link/lwg2293) * [LWG#2298](http://wg21.link/lwg2298) * [LWG#2299](http://wg21.link/lwg2299) * [LWG#2300](http://wg21.link/lwg2300) * [LWG#2301](http://wg21.link/lwg2301) * [LWG#2304](http://wg21.link/lwg2304) * [LWG#2306](http://wg21.link/lwg2306) * [LWG#2308](http://wg21.link/lwg2308) * [LWG#2313](http://wg21.link/lwg2313) * [LWG#2314](http://wg21.link/lwg2314) * [LWG#2315](http://wg21.link/lwg2315) * [LWG#2316](http://wg21.link/lwg2316) * [LWG#2317](http://wg21.link/lwg2317) * [LWG#2320](http://wg21.link/lwg2320) * [LWG#2322](http://wg21.link/lwg2322) * [LWG#2323](http://wg21.link/lwg2323) * [LWG#2324](http://wg21.link/lwg2324) * [LWG#2329](http://wg21.link/lwg2329) * [LWG#2330](http://wg21.link/lwg2330) * [LWG#2332](http://wg21.link/lwg2332) * [LWG#2339](http://wg21.link/lwg2339) * [LWG#2341](http://wg21.link/lwg2341) * [LWG#2344](http://wg21.link/lwg2344) * [LWG#2346](http://wg21.link/lwg2346) * [LWG#2350](http://wg21.link/lwg2350) * [LWG#2356](http://wg21.link/lwg2356) * [LWG#2357](http://wg21.link/lwg2357) * [LWG#2359](http://wg21.link/lwg2359) * [LWG#2360](http://wg21.link/lwg2360) | Compiler support ----------------- Main Article: [C++14 compiler support](compiler_support#C.2B.2B14_features "cpp/compiler support"). 
### C++14 core language features | C++14 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | Tweaked wording for [contextual conversions](language/implicit_conversion#Contextual_conversions "cpp/language/implicit conversion") | [N3323](https://wg21.link/N3323) | 4.9 | 3.4 | 18.0\* | Yes | 4.9 | 16.0 | 13.1.2\* | 5.15 | 10.3 | 8.6 | 16.1 | 9.0 | | [Binary literals](language/integer_literal "cpp/language/integer literal") | [N3472](https://wg21.link/N3472) | 4.3 (GNU)4.9 | 2.9 | 19.0 (2015)\* | Yes | 4.10 | 11.0 | 13.1.2\* | 5.14 | 10.3 | 8.6 | 2015 | 9.0 | | [`decltype(auto)`](language/auto "cpp/language/auto"), Return type deduction for normal functions | [N3638](https://wg21.link/N3638) | 4.8 (partial)\*4.9 | 3.3 (partial)\*3.4 | 19.0 (2015)\* | Yes | 4.9 | 15.0 | 13.1.2\* | 5.15 | 10.3 | 8.6 | 16.1 | 9.0 | | Initialized/Generalized lambda captures (init-capture) | [N3648](https://wg21.link/N3648) | 4.5 (partial)4.9 | 3.4 | 19.0 (2015)\* | Yes | 4.10 | 15.0 | 16.1.1\* | 5.15 | 10.3 | 8.6 | 16.1 | 9.0 | | [Generic lambda expressions](language/lambda#Explanation "cpp/language/lambda") | [N3649](https://wg21.link/N3649) | 4.9 | 3.4 | 19.0 (2015)\* | Yes | 4.10 | 16.0 | 13.1.2\* | 5.15 | 10.3 | 8.6 | 16.1 | 9.0 | | [Variable templates](language/variable_template "cpp/language/variable template") | [N3651](https://wg21.link/N3651) | 5 | 3.4 | 19.0 (Update 2)\* | Yes | 4.11 | 17.0 | 13.1.2\* | 5.15 | 10.3 | 8.6 | 17.4 | 9.0 | | Extended constexpr | [N3652](https://wg21.link/N3652) | 5 | 3.4 | 19.10\* | Yes | 4.11 | 17.0 | 13.1.2\* | 5.15 | 10.3 | 8.6 | 17.4 | 9.0 | | Aggregates with [default member initializers](language/data_members#Member_initialization "cpp/language/data members") | [N3653](https://wg21.link/N3653) | 5 | 3.3 | 19.10\* | Yes | 4.9 | 16.0 | 16.1.1\* | 5.14 | 10.3 | 8.6 | 16.1 | 9.0 | | Omitting/extending [memory allocations](language/new#Allocation "cpp/language/new") | [N3664](https://wg21.link/N3664) | N/A | 3.4 | N/A | Yes | N/A | N/A | N/A | N/A | 10.3 | 8.6 | 17.4 | N/A | | `[[[deprecated](language/attributes/deprecated "cpp/language/attributes/deprecated")]]` attribute | [N3760](https://wg21.link/N3760) | 4.9 | 3.4 | 19.0 (2015)\* | Yes | 4.9 | 15.0\*16.0 | 13.1.2\* | 5.14 | 10.3 | 8.6 | 16.1 | 9.0 | | [Sized deallocation](language/delete "cpp/language/delete") | [N3778](https://wg21.link/N3778) | 5 | 3.4 | 19.0 (2015)\* | Yes | 4.10.1 | 17.0 | 16.1.1\* | 5.14 | 10.3 | 8.6 | 16.1 | | | [Single quote as digit separator](language/integer_literal#Single_quote "cpp/language/integer literal") | [N3781](https://wg21.link/N3781) | 4.9 | 3.4 | 19.0 (2015)\* | Yes | 4.10 | 16.0 | 13.1.2\* | 5.14 | 10.3 | 8.6 | 2015 | 9.0 | | C++14 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | ### C++14 library features | C++14 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++ Standard Library | Embarcadero C++ Builder Standard Library | Cray C++ Standard Library | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | `constexpr` for [`<complex>`](header/complex "cpp/header/complex") | [N3302](https://wg21.link/N3302) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 
8.6 | | Transparent [operator functors](utility/functional#Operator_function_objects "cpp/utility/functional") | [N3421](https://wg21.link/N3421) | 4.9 | 3.4 | 18.0\* | Yes | 5.15 | 10.3 | 8.6 | | `[std::result\_of](types/result_of "cpp/types/result of")` and [SFINAE](language/sfinae "cpp/language/sfinae") | [N3462](https://wg21.link/N3462) | 5 | Yes | 19.0 (Update 2)\* | Yes | 5.15 | 10.3 | 8.6 | | `constexpr` for [`<chrono>`](header/chrono "cpp/header/chrono") | [N3469](https://wg21.link/N3469) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | `constexpr` for [`<array>`](header/array "cpp/header/array") | [N3470](https://wg21.link/N3470) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | `constexpr` for [`<initializer_list>`](header/initializer_list "cpp/header/initializer list"), [`<utility>`](header/utility "cpp/header/utility") and [`<tuple>`](header/tuple "cpp/header/tuple") | [N3471](https://wg21.link/N3471) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | Improved `[std::integral\_constant](types/integral_constant "cpp/types/integral constant")` | [N3545](https://wg21.link/N3545) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | [User-defined literals](language/user_literal#Standard_library "cpp/language/user literal") for [`<chrono>`](header/chrono "cpp/header/chrono") and [`<string>`](header/string "cpp/header/string") | [N3642](https://wg21.link/N3642) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | [Null forward iterators](named_req/forwarditerator#Singular_iterators "cpp/named req/ForwardIterator") | [N3644](https://wg21.link/N3644) | 5 (partial) 10 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | `[std::quoted](io/manip/quoted "cpp/io/manip/quoted")` | [N3654](https://wg21.link/N3654) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | `[std::make\_unique](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique")` | [N3656](https://wg21.link/N3656) | 4.9 | 3.4 | 18.0\* | Yes | 5.15 | 10.3 | 8.6 | | Heterogeneous associative lookup | [N3657](https://wg21.link/N3657) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | `[std::integer\_sequence](utility/integer_sequence "cpp/utility/integer sequence")` | [N3658](https://wg21.link/N3658) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | `[std::shared\_timed\_mutex](thread/shared_timed_mutex "cpp/thread/shared timed mutex")` | [N3659](https://wg21.link/N3659) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | `[std::exchange](utility/exchange "cpp/utility/exchange")` | [N3668](https://wg21.link/N3668) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | fixing `constexpr` member functions without `const` | [N3669](https://wg21.link/N3669) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | [`std::get<T>()`](utility/pair/get "cpp/utility/pair/get") | [N3670](https://wg21.link/N3670) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | Dual-Range `[std::equal](algorithm/equal "cpp/algorithm/equal")`, `[std::is\_permutation](algorithm/is_permutation "cpp/algorithm/is permutation")`, `[std::mismatch](algorithm/mismatch "cpp/algorithm/mismatch")` | [N3671](https://wg21.link/N3671) | 5 | 3.4 | 19.0 (2015)\* | Yes | 5.15 | 10.3 | 8.6 | | C++14 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++ Standard Library | Embarcadero C++ Builder Standard Library | Cray C++ Standard Library | *\** - entries marked with an asterisk carry additional notes; see the [C++14 compiler support](compiler_support#C.2B.2B14_features "cpp/compiler support") tables for details. 
### External links * [Working C++14 examples](https://github.com/makelinux/examples/blob/HEAD/cpp/14.cpp)
cpp Feature testing (since C++20) Feature testing (since C++20) ============================= The standard defines a set of [preprocessor macros](preprocessor/replace "cpp/preprocessor/replace") corresponding to C++ language and library features introduced in C++11 or later. They are intended as a simple and portable way to detect the presence of said features. ### Attributes | | | | | --- | --- | --- | | `__has_cpp_attribute(` attribute-token `)` | | | Checks for the presence of an [attribute](language/attributes "cpp/language/attributes") named by attribute-token (after macro expansion). For standard attributes, it will expand to the year and month in which the attribute was added to the working draft (see table below); the presence of vendor-specific attributes is indicated by a non-zero value. `__has_cpp_attribute` can be expanded in the expression of [`#if`](preprocessor/conditional "cpp/preprocessor/conditional") and [`#elif`](preprocessor/conditional "cpp/preprocessor/conditional"). It is treated as a defined macro by [`#ifdef`](preprocessor/conditional "cpp/preprocessor/conditional"), [`#ifndef`](preprocessor/conditional "cpp/preprocessor/conditional") and [`defined`](preprocessor/conditional "cpp/preprocessor/conditional") but cannot be used anywhere else. | attribute-token | Attribute | Value | Standard | | --- | --- | --- | --- | | `assume` | `[[[assume](https://en.cppreference.com/mwiki/index.php?title=cpp/language/attributes/assume&action=edit&redlink=1 "cpp/language/attributes/assume (page does not exist)")]]` | `202207L` | (C++23) | | `carries_dependency` | `[[[carries\_dependency](language/attributes/carries_dependency "cpp/language/attributes/carries dependency")]]` | `200809L` | (C++11) | | `deprecated` | `[[[deprecated](language/attributes/deprecated "cpp/language/attributes/deprecated")]]` | `201309L` | (C++14) | | `fallthrough` | `[[[fallthrough](language/attributes/fallthrough "cpp/language/attributes/fallthrough")]]` | `201603L` | (C++17) | | `likely` | `[[[likely](language/attributes/likely "cpp/language/attributes/likely")]]` | `201803L` | (C++20) | | `maybe_unused` | `[[[maybe\_unused](language/attributes/maybe_unused "cpp/language/attributes/maybe unused")]]` | `201603L` | (C++17) | | `no_unique_address` | `[[[no\_unique\_address](language/attributes/no_unique_address "cpp/language/attributes/no unique address")]]` | `201803L` | (C++20) | | `nodiscard` | `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` | `201603L` | (C++17) | | `201907L` | (C++20) | | `noreturn` | `[[[noreturn](language/attributes/noreturn "cpp/language/attributes/noreturn")]]` | `200809L` | (C++11) | | `unlikely` | `[[[unlikely](language/attributes/likely "cpp/language/attributes/likely")]]` | `201803L` | (C++20) | ### Language features The following macros are predefined in every translation unit. Each macro expands to an integer literal corresponding to the year and month when the corresponding feature has been included in the working draft. When a feature changes significantly, the macro will be updated accordingly. 
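Before the tables, a minimal sketch of how these macros are commonly consulted in practice; the macro name `MYLIB_NODISCARD`, the declared functions, and the fallback behavior are illustrative only, not part of the standard.

```cpp
#include <version>   // provides every __cpp_lib_* macro (see "Library features" below)

// Attribute detection: __has_cpp_attribute may only appear in #if/#elif, and is
// treated as defined by #ifdef (see the description above).
#ifdef __has_cpp_attribute
#  if __has_cpp_attribute(nodiscard) >= 201603L
#    define MYLIB_NODISCARD [[nodiscard]]      // illustrative macro name
#  endif
#endif
#ifndef MYLIB_NODISCARD
#  define MYLIB_NODISCARD                      // fall back to no attribute
#endif

MYLIB_NODISCARD int parse_flags(const char* s);

// Language-feature macros expand to year/month values, so code can require a
// minimum revision of a feature rather than a whole standard version.
#if defined(__cpp_constexpr) && __cpp_constexpr >= 201304L
constexpr int triangular(int n)                // relies on relaxed (C++14) constexpr
{
    int sum = 0;
    for (int i = 1; i <= n; ++i)
        sum += i;
    return sum;
}
#endif

// Library-feature macros follow the same pattern:
#if defined(__cpp_lib_byteswap) && __cpp_lib_byteswap >= 202110L
#  include <bit>                               // std::byteswap is available
#endif
```

The `defined(...)` guards keep the checks harmless on implementations that predate a given macro.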
| Macro name | Feature | Value | Std | | --- | --- | --- | --- | | `__cpp_aggregate_bases` | [Aggregate classes](language/aggregate_initialization "cpp/language/aggregate initialization") with base classes | `201603L` | (C++17) | | `__cpp_aggregate_nsdmi` | [Aggregate classes](language/aggregate_initialization "cpp/language/aggregate initialization") with [default member initializers](language/data_members#Member_initialization "cpp/language/data members") | `201304L` | (C++14) | | `__cpp_aggregate_paren_init` | [Aggregate initialization](language/aggregate_initialization "cpp/language/aggregate initialization") in the form of [direct initialization](language/direct_initialization "cpp/language/direct initialization") | `201902L` | (C++20) | | `__cpp_alias_templates` | [Alias templates](language/type_alias "cpp/language/type alias") | `200704L` | (C++11) | | `__cpp_aligned_new` | [Dynamic memory allocation for over-aligned data](memory/new/align_val_t "cpp/memory/new/align val t") | `201606L` | (C++17) | | `__cpp_attributes` | [Attributes](language/attributes "cpp/language/attributes") | `200809L` | (C++11) | | `__cpp_binary_literals` | [Binary literals](language/integer_literal "cpp/language/integer literal") | `201304L` | (C++14) | | `__cpp_capture_star_this` | [Lambda capture of \*this by value as [=,\*this]](language/lambda#Lambda_capture "cpp/language/lambda") | `201603L` | (C++17) | | `__cpp_char8_t` | [`char8_t`](keyword/char8_t "cpp/keyword/char8 t") | `201811L` | (C++20) | | `char8_t` compatibility and portability fix (allow [initialization of (unsigned) char arrays](language/aggregate_initialization#Character_arrays "cpp/language/aggregate initialization") from [UTF-8 string literals](language/string_literal "cpp/language/string literal")) | `202207L` | (C++23) | | `__cpp_concepts` | [Concepts](language/constraints "cpp/language/constraints") | `201907L` | (C++20) | | Conditional trivial special member functions | `202002L` | (C++20) | | `__cpp_conditional_explicit` | [`explicit(bool)`](language/explicit "cpp/language/explicit") | `201806L` | (C++20) | | `__cpp_consteval` | [Immediate functions](language/consteval "cpp/language/consteval") | `201811L` | (C++20) | | `__cpp_constexpr` | [`constexpr`](language/constexpr "cpp/language/constexpr") | `200704L` | (C++11) | | [Relaxed `constexpr`](language/constexpr#relaxed-constexpr "cpp/language/constexpr"), [non-`const` `constexpr` methods](language/constexpr#constexpr-method-is-const "cpp/language/constexpr") | `201304L` | (C++14) | | [Constexpr lambda](language/lambda "cpp/language/lambda") | `201603L` | (C++17) | | Trivial [default initialization](language/default_initialization "cpp/language/default initialization") and [asm-declaration](language/asm "cpp/language/asm") in constexpr functions | `201907L` | (C++20) | | Changing the active member of a union in constant evaluation | `202002L` | (C++20) | | Non-[literal](named_req/literaltype "cpp/named req/LiteralType") variables, labels, and [`goto`](language "cpp/language") statements in constexpr functions | `202110L` | (C++23) | | Relaxing some [constexpr](language/constexpr "cpp/language/constexpr") restrictions | `202207L` | (C++23) | | `__cpp_constexpr_dynamic_alloc` | Operations for dynamic storage duration in constexpr functions | `201907L` | (C++20) | | `__cpp_constexpr_in_decltype` | Generation of function and variable definitions when [needed for constant evaluation](language/constant_expression#Functions_and_variables_needed_for_constant_evaluation "cpp/language/constant 
expression") | `201711L` | (C++11)(DR) | | `__cpp_constinit` | [`constinit`](language/constinit "cpp/language/constinit") | `201907L` | (C++20) | | `__cpp_decltype` | [`decltype`](language/decltype "cpp/language/decltype") | `200707L` | (C++11) | | `__cpp_decltype_auto` | [Return type deduction for normal functions](language/auto "cpp/language/auto") | `201304L` | (C++14) | | `__cpp_deduction_guides` | [Template argument deduction for class templates](language/class_template_argument_deduction "cpp/language/class template argument deduction") | `201703L` | (C++17) | | [CTAD](language/class_template_argument_deduction "cpp/language/class template argument deduction") for aggregates and aliases | `201907L` | (C++20) | | `__cpp_delegating_constructors` | [Delegating constructors](language/initializer_list#Delegating_constructor "cpp/language/initializer list") | `200604L` | (C++11) | | `__cpp_designated_initializers` | [Designated initializer](language/aggregate_initialization#Designated_initializers "cpp/language/aggregate initialization") | `201707L` | (C++20) | | `__cpp_enumerator_attributes` | Attributes for [enumerators](language/enum "cpp/language/enum") | `201411L` | (C++17) | | `__cpp_explicit_this_parameter` | Explicit object parameter | `202110L` | (C++23) | | `__cpp_fold_expressions` | [Fold expressions](language/fold "cpp/language/fold") | `201603L` | (C++17) | | `__cpp_generic_lambdas` | [Generic lambda expressions](language/lambda "cpp/language/lambda") | `201304L` | (C++14) | | Explicit template parameter list for [generic lambdas](language/lambda "cpp/language/lambda") | `201707L` | (C++20) | | `__cpp_guaranteed_copy_elision` | Guaranteed copy elision through simplified [value categories](language/value_category "cpp/language/value category") | `201606L` | (C++17) | | `__cpp_hex_float` | [Hexadecimal floating literals](language/floating_literal "cpp/language/floating literal") | `201603L` | (C++17) | | `__cpp_if_consteval` | [`consteval if`](language/if "cpp/language/if") | `202106L` | (C++23) | | `__cpp_if_constexpr` | [`constexpr if`](language/if "cpp/language/if") | `201606L` | (C++17) | | `__cpp_impl_coroutine` | [Coroutines](language/coroutines "cpp/language/coroutines") (compiler support) | `201902L` | (C++20) | | `__cpp_impl_destroying_delete` | [Destroying operator delete](memory/new/operator_delete "cpp/memory/new/operator delete") (compiler support) | `201806L` | (C++20) | | `__cpp_impl_three_way_comparison` | [Three-way comparison](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") (compiler support) | `201907L` | (C++20) | | `__cpp_implicit_move` | Simpler [implicit move](language/return#Automatic_move_from_local_variables_and_parameters "cpp/language/return") | `202207L` | (C++23) | | `__cpp_inheriting_constructors` | [Inheriting constructors](language/using_declaration#Inheriting_constructors "cpp/language/using declaration") | `200802L` | (C++11) | | Rewording inheriting constructors | `201511L` | (C++11)(DR) | | `__cpp_init_captures` | [Lambda init-capture](language/lambda "cpp/language/lambda") | `201304L` | (C++14) | | Allow pack expansion in [lambda](language/lambda "cpp/language/lambda") init-capture | `201803L` | (C++20) | | `__cpp_initializer_lists` | [List-initialization](language/list_initialization "cpp/language/list initialization") and `[std::initializer\_list](utility/initializer_list "cpp/utility/initializer list")` | `200806L` | (C++11) | | `__cpp_inline_variables` | [Inline variables](language/inline 
"cpp/language/inline") | `201606L` | (C++17) | | `__cpp_lambdas` | [Lambda expressions](language/lambda "cpp/language/lambda") | `200907L` | (C++11) | | `__cpp_modules` | [Modules](language/modules "cpp/language/modules") | `201907L` | (C++20) | | `__cpp_multidimensional_subscript` | Multidimensional [subscript operator](language/operator_member_access "cpp/language/operator member access") | `202110L` | (C++23) | | `__cpp_named_character_escapes` | Named [universal character escapes](language/escape "cpp/language/escape") | `202207L` | (C++23) | | `__cpp_namespace_attributes` | Attributes for [namespaces](language/namespace "cpp/language/namespace") | `201411L` | (C++17) | | `__cpp_noexcept_function_type` | Make [exception specifications](language/noexcept_spec "cpp/language/noexcept spec") be part of the type system | `201510L` | (C++17) | | `__cpp_nontype_template_args` | Allow constant evaluation for all [non-type template arguments](language/template_parameters#Template_non-type_arguments "cpp/language/template parameters") | `201411L` | (C++17) | | Class types and floating-point types in [non-type template parameters](language/template_parameters#Non-type_template_parameter "cpp/language/template parameters") | `201911L` | (C++20) | | `__cpp_nontype_template_``parameter_auto` | Declaring [non-type template parameters](language/template_parameters#Non-type_template_parameter "cpp/language/template parameters") with `auto` | `201606L` | (C++17) | | `__cpp_nsdmi` | [Non-static data member initializers](language/data_members#Member_initialization "cpp/language/data members") | `200809L` | (C++11) | | `__cpp_range_based_for` | [Range-based `for` loop](language/range-for "cpp/language/range-for") | `200907L` | (C++11) | | [Range-based `for` loop](language/range-for#Explanation "cpp/language/range-for") with different `begin`/`end` types | `201603L` | (C++17) | | `__cpp_raw_strings` | [Raw string literals](language/string_literal "cpp/language/string literal") | `200710L` | (C++11) | | `__cpp_ref_qualifiers` | [ref-qualifiers](language/member_functions "cpp/language/member functions") | `200710L` | (C++11) | | `__cpp_return_type_deduction` | [Return type deduction for normal functions](language/auto "cpp/language/auto") | `201304L` | (C++14) | | `__cpp_rvalue_references` | [Rvalue reference](language/reference "cpp/language/reference") | `200610L` | (C++11) | | `__cpp_size_t_suffix` | [Literal suffixes for `size_t` and its signed version](language/integer_literal "cpp/language/integer literal") | `202011L` | (C++23) | | `__cpp_sized_deallocation` | [Sized deallocation](memory/new/operator_delete "cpp/memory/new/operator delete") | `201309L` | (C++14) | | `__cpp_static_assert` | [`static_assert`](language/static_assert "cpp/language/static assert") | `200410L` | (C++11) | | [Single-argument `static_assert`](language/static_assert "cpp/language/static assert") | `201411L` | (C++17) | | `__cpp_static_call_operator` | `static` [`operator()`](language/operators "cpp/language/operators") | `202207L` | (C++23) | | `__cpp_structured_bindings` | [Structured bindings](language/structured_binding "cpp/language/structured binding") | `201606L` | (C++17) | | `__cpp_template_template_args` | Matching of [template template-arguments](language/template_parameters#Template_template_arguments "cpp/language/template parameters") | `201611L` | (C++17) | | `__cpp_threadsafe_static_init` | [Dynamic initialization and destruction with concurrency](language/storage_duration#Static_local_variables 
"cpp/language/storage duration") | `200806L` | (C++11) | | `__cpp_unicode_characters` | [New character types](language/types "cpp/language/types") (`char16_t` and `char32_t`) | `200704L` | (C++11) | | `__cpp_unicode_literals` | [Unicode string literals](language/string_literal "cpp/language/string literal") | `200710L` | (C++11) | | `__cpp_user_defined_literals` | [User-defined literals](language/user_literal "cpp/language/user literal") | `200809L` | (C++11) | | `__cpp_using_enum` | [`using enum`](language/enum#Using-enum-declaration "cpp/language/enum") | `201907L` | (C++20) | | `__cpp_variable_templates` | [Variable templates](language/variable_template "cpp/language/variable template") | `201304L` | (C++14) | | `__cpp_variadic_templates` | [Variadic templates](language/parameter_pack "cpp/language/parameter pack") | `200704L` | (C++11) | | `__cpp_variadic_using` | Pack expansions in [`using`-declarations](language/using_declaration "cpp/language/using declaration") | `201611L` | (C++17) | ### Library features The following macros are defined if the header [`<version>`](header/version "cpp/header/version") or any of the corresponding headers in the table below is included. Each macro expands to an integer literal corresponding to the year and month when the corresponding feature has been included in the working draft. When a feature changes significantly, the macro will be updated accordingly. | Macro name | Feature | Value | Header | Std | | --- | --- | --- | --- | --- | | `__cpp_lib_adaptor_iterator_``pair_constructor` | Iterator pair constructors for `[std::stack](container/stack "cpp/container/stack")` and `[std::queue](container/queue "cpp/container/queue")` | `202106L` | [`<stack>`](header/stack "cpp/header/stack") [`<queue>`](header/queue "cpp/header/queue") | (C++23) | | `__cpp_lib_addressof_constexpr` | Constexpr `[std::addressof](memory/addressof "cpp/memory/addressof")` | `201603L` | [`<memory>`](header/memory "cpp/header/memory") | (C++17) | | `__cpp_lib_algorithm_``iterator_requirements` | Ranges iterators as inputs to non-Ranges [algorithms](algorithm "cpp/algorithm") | `202207L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") [`<memory>`](header/memory "cpp/header/memory") [`<numeric>`](header/numeric "cpp/header/numeric") | (C++23) | | `__cpp_lib_allocate_at_least` | `std::allocate_at_least` etc. 
| `202106L` | [`<memory>`](header/memory "cpp/header/memory") | (C++23) | | `__cpp_lib_allocator_``traits_is_always_equal` | [`std::allocator_traits::is_always_equal`](memory/allocator_traits "cpp/memory/allocator traits") | `201411L` | [`<memory>`](header/memory "cpp/header/memory") [`<scoped_allocator>`](header/scoped_allocator "cpp/header/scoped allocator") [`<string>`](header/string "cpp/header/string") [`<deque>`](header/deque "cpp/header/deque") [`<forward_list>`](header/forward_list "cpp/header/forward list") [`<list>`](header/list "cpp/header/list") [`<vector>`](header/vector "cpp/header/vector") [`<map>`](header/map "cpp/header/map") [`<set>`](header/set "cpp/header/set") [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") | (C++17) | | `__cpp_lib_any` | `[std::any](utility/any "cpp/utility/any")` | `201606L` | [`<any>`](header/any "cpp/header/any") | (C++17) | | `__cpp_lib_apply` | `[std::apply](utility/apply "cpp/utility/apply")` | `201603L` | [`<tuple>`](header/tuple "cpp/header/tuple") | (C++17) | | `__cpp_lib_array_constexpr` | Constexpr for `[std::reverse\_iterator](iterator/reverse_iterator "cpp/iterator/reverse iterator")`, `[std::move\_iterator](iterator/move_iterator "cpp/iterator/move iterator")`, `[std::array](container/array "cpp/container/array")` and [range access](iterator#Range_access "cpp/iterator") | `201603L` | [`<iterator>`](header/iterator "cpp/header/iterator") [`<array>`](header/array "cpp/header/array") | (C++17) | | [ConstexprIterator](named_req/constexpriterator "cpp/named req/ConstexprIterator"); constexpr comparison for `[std::array](container/array "cpp/container/array")`; misc constexpr bits (`[std::array::fill](container/array/fill "cpp/container/array/fill")` et al.) 
| `201811L` | [`<iterator>`](header/iterator "cpp/header/iterator") [`<array>`](header/array "cpp/header/array") | (C++20) | | `__cpp_lib_as_const` | `[std::as\_const](utility/as_const "cpp/utility/as const")` | `201510L` | [`<utility>`](header/utility "cpp/header/utility") | (C++17) | | `__cpp_lib_associative_``heterogeneous_erasure` | Heterogeneous erasure in [associative containers](container#Associative_containers "cpp/container") and [unordered associative containers](container#Unordered_associative_containers "cpp/container") | `202110L` | [`<map>`](header/map "cpp/header/map") [`<set>`](header/set "cpp/header/set") [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") | (C++23) | | `__cpp_lib_assume_aligned` | `[std::assume\_aligned](memory/assume_aligned "cpp/memory/assume aligned")` | `201811L` | [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | `__cpp_lib_atomic_flag_test` | `std::atomic_flag::test` | `201907L` | [`<atomic>`](header/atomic "cpp/header/atomic") | (C++20) | | `__cpp_lib_atomic_float` | [Floating-point atomic](atomic/atomic "cpp/atomic/atomic") | `201711L` | [`<atomic>`](header/atomic "cpp/header/atomic") | (C++20) | | `__cpp_lib_atomic_``is_always_lock_free` | [`constexpr atomic<T>::is_always_lock_free`](atomic/atomic/is_always_lock_free "cpp/atomic/atomic/is always lock free") | `201603L` | [`<atomic>`](header/atomic "cpp/header/atomic") | (C++17) | | `__cpp_lib_atomic_``lock_free_type_aliases` | atomic lockfree integral types (`[std::atomic\_signed\_lock\_free](atomic/atomic "cpp/atomic/atomic")`, `[std::atomic\_unsigned\_lock\_free](atomic/atomic "cpp/atomic/atomic")`) | `201907L` | [`<atomic>`](header/atomic "cpp/header/atomic") | (C++20) | | `__cpp_lib_atomic_ref` | [`std::atomic_ref`](atomic/atomic_ref "cpp/atomic/atomic ref") | `201806L` | [`<atomic>`](header/atomic "cpp/header/atomic") | (C++20) | | `__cpp_lib_atomic_shared_ptr` | [`std::atomic<std::shared_ptr>`](memory/shared_ptr/atomic2 "cpp/memory/shared ptr/atomic2") | `201711L` | [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | `__cpp_lib_atomic_``value_initialization` | Fixing atomic initialization (value-initialize `[std::atomic](atomic/atomic "cpp/atomic/atomic")` by default) | `201911L` | [`<atomic>`](header/atomic "cpp/header/atomic") [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | `__cpp_lib_atomic_wait` | Efficient `[std::atomic](atomic/atomic "cpp/atomic/atomic")` waiting | `201907L` | [`<atomic>`](header/atomic "cpp/header/atomic") | (C++20) | | `__cpp_lib_barrier` | [`std::barrier`](thread/barrier "cpp/thread/barrier") | `201907L` | [`<barrier>`](header/barrier "cpp/header/barrier") | (C++20) | | `__cpp_lib_bind_back` | `std::bind_back` | `202202L` | [`<functional>`](header/functional "cpp/header/functional") | (C++23) | | `__cpp_lib_bind_front` | [`std::bind_front`](utility/functional/bind_front "cpp/utility/functional/bind front") | `201907L` | [`<functional>`](header/functional "cpp/header/functional") | (C++20) | | `__cpp_lib_bit_cast` | [`std::bit_cast`](numeric/bit_cast "cpp/numeric/bit cast") | `201806L` | [`<bit>`](header/bit "cpp/header/bit") | (C++20) | | `__cpp_lib_bitops` | [Bit operations](numeric#Bit_manipulation_.28since_C.2B.2B20.29 "cpp/numeric") | `201907L` | [`<bit>`](header/bit "cpp/header/bit") | (C++20) | | `__cpp_lib_bool_constant` | `[std::bool\_constant](types/integral_constant "cpp/types/integral constant")` | `201505L` | 
[`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) | | `__cpp_lib_bounded_array_traits` | [`std::is_bounded_array`](types/is_bounded_array "cpp/types/is bounded array"), [`std::is_unbounded_array`](types/is_unbounded_array "cpp/types/is unbounded array") | `201902L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_boyer_moore_searcher` | [Searchers](utility/functional#Searchers "cpp/utility/functional") | `201603L` | [`<functional>`](header/functional "cpp/header/functional") | (C++17) | | `__cpp_lib_byte` | [`std::byte`](types/byte "cpp/types/byte") | `201603L` | [`<cstddef>`](header/cstddef "cpp/header/cstddef") | (C++17) | | `__cpp_lib_byteswap` | [`std::byteswap`](numeric/byteswap "cpp/numeric/byteswap") | `202110L` | [`<bit>`](header/bit "cpp/header/bit") | (C++23) | | `__cpp_lib_char8_t` | Library support for [`char8_t`](language/types#char8_t "cpp/language/types") | `201907L` | [`<atomic>`](header/atomic "cpp/header/atomic") [`<filesystem>`](header/filesystem "cpp/header/filesystem") [`<istream>`](header/istream "cpp/header/istream") [`<limits>`](header/limits "cpp/header/limits") [`<locale>`](header/locale "cpp/header/locale") [`<ostream>`](header/ostream "cpp/header/ostream") [`<string>`](header/string "cpp/header/string") [`<string_view>`](header/string_view "cpp/header/string view") | (C++20) | | `__cpp_lib_chrono` | Rounding functions for `[std::chrono::duration](chrono/duration "cpp/chrono/duration")` and `[std::chrono::time\_point](chrono/time_point "cpp/chrono/time point")` | `201510L` | [`<chrono>`](header/chrono "cpp/header/chrono") | (C++17) | | Constexpr for all the member functions of `[std::chrono::duration](chrono/duration "cpp/chrono/duration")` and `[std::chrono::time\_point](chrono/time_point "cpp/chrono/time point")` | `201611L` | [`<chrono>`](header/chrono "cpp/header/chrono") | (C++17) | | [Calendars](chrono#Calendar "cpp/chrono") and [Time Zones](chrono#Time_zone "cpp/chrono") | `201907L` | [`<chrono>`](header/chrono "cpp/header/chrono") | (C++20) | | `__cpp_lib_chrono_udls` | [User-defined literals for time types](chrono/duration#Literals "cpp/chrono/duration") | `201304L` | [`<chrono>`](header/chrono "cpp/header/chrono") | (C++14) | | `__cpp_lib_clamp` | `[std::clamp](algorithm/clamp "cpp/algorithm/clamp")` | `201603L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++17) | | `__cpp_lib_complex_udls` | [User-defined Literals for `std::complex`](numeric/complex/operator%22%22i "cpp/numeric/complex/operator\"\"i") | `201309L` | [`<complex>`](header/complex "cpp/header/complex") | (C++14) | | `__cpp_lib_concepts` | [Standard library concepts](concepts "cpp/concepts") | `202002L` | [`<concepts>`](header/concepts "cpp/header/concepts") | (C++20) | | Move-only types for [`equality_comparable_with`](concepts/equality_comparable "cpp/concepts/equality comparable"), [`totally_ordered_with`](concepts/totally_ordered "cpp/concepts/totally ordered"), and [`three_way_comparable_with`](utility/compare/three_way_comparable "cpp/utility/compare/three way comparable") | `202207L` | [`<compare>`](header/compare "cpp/header/compare") [`<concepts>`](header/concepts "cpp/header/concepts") | (C++23) | | `__cpp_lib_constexpr_algorithms` | Constexpr for [algorithms](algorithm "cpp/algorithm") | `201806L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++20) | | `__cpp_lib_constexpr_bitset` | A more constexpr `[std::bitset](utility/bitset "cpp/utility/bitset")` | `202207L` | 
[`<bitset>`](header/bitset "cpp/header/bitset") | (C++23) | | `__cpp_lib_constexpr_charconv` | Add `constexpr` modifiers to functions `std::to_chars` and `std::from_chars` for integral types | `202207L` | [`<charconv>`](header/charconv "cpp/header/charconv") | (C++23) | | `__cpp_lib_constexpr_cmath` | Constexpr for mathematical functions in [`<cmath>`](header/cmath "cpp/header/cmath") and [`<cstdlib>`](header/cstdlib "cpp/header/cstdlib") | `202202L` | [`<cmath>`](header/cmath "cpp/header/cmath") [`<cstdlib>`](header/cstdlib "cpp/header/cstdlib") | (C++23) | | `__cpp_lib_constexpr_complex` | Constexpr for `[std::complex](numeric/complex "cpp/numeric/complex")` | `201711L` | [`<complex>`](header/complex "cpp/header/complex") | (C++20) | | `__cpp_lib_constexpr_``dynamic_alloc` | Constexpr for `[std::allocator](memory/allocator "cpp/memory/allocator")` and related utilities | `201907L` | [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | `__cpp_lib_constexpr_functional` | Misc constexpr bits (`[std::default\_searcher](utility/functional/default_searcher "cpp/utility/functional/default searcher")`); constexpr `INVOKE` | `201907L` | [`<functional>`](header/functional "cpp/header/functional") | (C++20) | | `__cpp_lib_constexpr_iterator` | Misc constexpr bits (`[std::insert\_iterator](iterator/insert_iterator "cpp/iterator/insert iterator")` et al.) | `201811L` | [`<iterator>`](header/iterator "cpp/header/iterator") | (C++20) | | `__cpp_lib_constexpr_memory` | Constexpr in `[std::pointer\_traits](memory/pointer_traits "cpp/memory/pointer traits")` | `201811L` | [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | Constexpr `[std::unique\_ptr](memory/unique_ptr "cpp/memory/unique ptr")` | `202202L` | [`<memory>`](header/memory "cpp/header/memory") | (C++23) | | `__cpp_lib_constexpr_numeric` | Constexpr for [algorithms](algorithm#Numeric_operations "cpp/algorithm") in [`<numeric>`](header/numeric "cpp/header/numeric") | `201911L` | [`<numeric>`](header/numeric "cpp/header/numeric") | (C++20) | | `__cpp_lib_constexpr_string` | Constexpr for `[std::string](string/basic_string "cpp/string/basic string")` | `201907L` | [`<string>`](header/string "cpp/header/string") | (C++20) | | `__cpp_lib_constexpr_``string_view` | Misc constexpr bits (`[std::string\_view::copy](string/basic_string_view/copy "cpp/string/basic string view/copy")`) | `201811L` | [`<string_view>`](header/string_view "cpp/header/string view") | (C++20) | | `__cpp_lib_constexpr_tuple` | Misc constexpr bits (`[std::tuple::operator=](utility/tuple/operator= "cpp/utility/tuple/operator=")` et al.) | `201811L` | [`<tuple>`](header/tuple "cpp/header/tuple") | (C++20) | | `__cpp_lib_constexpr_typeinfo` | Constexpr for `[std::type\_info::operator==](types/type_info/operator_cmp "cpp/types/type info/operator cmp")` | `202106L` | [`<typeinfo>`](header/typeinfo "cpp/header/typeinfo") | (C++23) | | `__cpp_lib_constexpr_utility` | Misc constexpr bits (`[std::pair::operator=](utility/pair/operator= "cpp/utility/pair/operator=")` et al.) 
| `201811L` | [`<utility>`](header/utility "cpp/header/utility") | (C++20) | | `__cpp_lib_constexpr_vector` | Constexpr for `[std::vector](container/vector "cpp/container/vector")` | `201907L` | [`<vector>`](header/vector "cpp/header/vector") | (C++20) | | `__cpp_lib_containers_ranges` | Ranges construction and insertion for containers | `202202L` | [`<vector>`](header/vector "cpp/header/vector") [`<list>`](header/list "cpp/header/list") [`<forward_list>`](header/forward_list "cpp/header/forward list") [`<map>`](header/map "cpp/header/map") [`<set>`](header/set "cpp/header/set") [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") [`<deque>`](header/deque "cpp/header/deque") [`<queue>`](header/queue "cpp/header/queue") [`<stack>`](header/stack "cpp/header/stack") [`<string>`](header/string "cpp/header/string") | (C++23) | | `__cpp_lib_coroutine` | [Coroutines](language/coroutines "cpp/language/coroutines") (library support) | `201902L` | [`<coroutine>`](header/coroutine "cpp/header/coroutine") | (C++20) | | `__cpp_lib_destroying_delete` | [Destroying `operator delete`](memory/new/operator_delete "cpp/memory/new/operator delete") (library support) | `201806L` | [`<new>`](header/new "cpp/header/new") | (C++20) | | `__cpp_lib_enable_``shared_from_this` | More precise specification of `[std::enable\_shared\_from\_this](memory/enable_shared_from_this "cpp/memory/enable shared from this")` | `201603L` | [`<memory>`](header/memory "cpp/header/memory") | (C++17) | | `__cpp_lib_endian` | `std::endian` | `201907L` | [`<bit>`](header/bit "cpp/header/bit") | (C++20) | | `__cpp_lib_erase_if` | Uniform container erasure | `202002L` | [`<string>`](header/string "cpp/header/string") [`<deque>`](header/deque "cpp/header/deque") [`<forward_list>`](header/forward_list "cpp/header/forward list") [`<list>`](header/list "cpp/header/list") [`<vector>`](header/vector "cpp/header/vector") [`<map>`](header/map "cpp/header/map") [`<set>`](header/set "cpp/header/set") [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") | (C++20) | | `__cpp_lib_exchange_function` | `[std::exchange](utility/exchange "cpp/utility/exchange")` | `201304L` | [`<utility>`](header/utility "cpp/header/utility") | (C++14) | | `__cpp_lib_execution` | [Execution policies](algorithm#Execution_policies "cpp/algorithm") | `201603L` | [`<execution>`](header/execution "cpp/header/execution") | (C++17) | | [`std::execution::unsequenced_policy`](algorithm/execution_policy_tag_t "cpp/algorithm/execution policy tag t") | `201902L` | [`<execution>`](header/execution "cpp/header/execution") | (C++20) | | `__cpp_lib_expected` | class template `std::expected` | `202202L` | [`<expected>`](header/expected "cpp/header/expected") | (C++23) | | `__cpp_lib_filesystem` | [Filesystem library](filesystem "cpp/filesystem") | `201703L` | [`<filesystem>`](header/filesystem "cpp/header/filesystem") | (C++17) | | `__cpp_lib_find_last` | [`std::ranges::find_last`](algorithm/ranges/find_last "cpp/algorithm/ranges/find last") | `202207L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++23) | | `__cpp_lib_flat_map` | [`std::flat_map`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_map&action=edit&redlink=1 "cpp/container/flat map (page does not exist)") and 
[`std::flat_multimap`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_multimap&action=edit&redlink=1 "cpp/container/flat multimap (page does not exist)") | `202207L` | [`<flat_map>`](header/flat_map "cpp/header/flat map") | (C++23) | | `__cpp_lib_flat_set` | [`std::flat_set`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_set&action=edit&redlink=1 "cpp/container/flat set (page does not exist)") and [`std::flat_multiset`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_multiset&action=edit&redlink=1 "cpp/container/flat multiset (page does not exist)") | `202207L` | [`<flat_set>`](header/flat_set "cpp/header/flat set") | (C++23) | | `__cpp_lib_format` | [Text formatting](utility/format "cpp/utility/format") | `201907L` | [`<format>`](header/format "cpp/header/format") | (C++20) | | Compile-time format string checks; Reducing parameterization of `[std::vformat\_to](utility/format/vformat_to "cpp/utility/format/vformat to")` | `202106L` | [`<format>`](header/format "cpp/header/format") | (C++20)(DR) | | Fixing locale handling in chrono formatters; Supporting non-const-formattable types | `202110L` | [`<format>`](header/format "cpp/header/format") | (C++20)(DR) | | Exposing [`std::basic_format_string`](utility/format/basic_format_string "cpp/utility/format/basic format string"); clarify handling of encodings in localized formatting of chrono types | `202207L` | [`<format>`](header/format "cpp/header/format") | (C++23) | | `__cpp_lib_format_ranges` | Formatting ranges | `202207L` | [`<format>`](header/format "cpp/header/format") | (C++23) | | `__cpp_lib_forward_like` | [`std::forward_like`](utility/forward_like "cpp/utility/forward like") | `202207L` | [`<utility>`](header/utility "cpp/header/utility") | (C++23) | | `__cpp_lib_gcd_lcm` | `[std::gcd](numeric/gcd "cpp/numeric/gcd")`, `[std::lcm](numeric/lcm "cpp/numeric/lcm")` | `201606L` | [`<numeric>`](header/numeric "cpp/header/numeric") | (C++17) | | `__cpp_lib_generator` | `std::generator`: synchronous coroutine generator for ranges | `202207L` | [`<generator>`](header/generator "cpp/header/generator") | (C++23) | | `__cpp_lib_generic_``associative_lookup` | Heterogeneous comparison lookup in [associative containers](container#Associative_containers "cpp/container") | `201304L` | [`<map>`](header/map "cpp/header/map") [`<set>`](header/set "cpp/header/set") | (C++14) | | `__cpp_lib_generic_``unordered_lookup` | Heterogeneous comparison lookup in [unordered associative containers](container#Unordered_associative_containers "cpp/container") | `201811L` | [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") | (C++20) | | `__cpp_lib_hardware_``interference_size` | [`constexpr std::hardware_{constructive, destructive}_interference_size`](thread/hardware_destructive_interference_size "cpp/thread/hardware destructive interference size") | `201703L` | [`<new>`](header/new "cpp/header/new") | (C++17) | | `__cpp_lib_has_unique_``object_representations` | `[std::has\_unique\_object\_representations](types/has_unique_object_representations "cpp/types/has unique object representations")` | `201606L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) | | `__cpp_lib_hypot` | 3-argument overload of `[std::hypot](numeric/math/hypot "cpp/numeric/math/hypot")` | `201603L` | [`<cmath>`](header/cmath "cpp/header/cmath") | (C++17) | | `__cpp_lib_incomplete_``container_elements` | Minimal 
incomplete type support for [std::forward\_list](container/forward_list#Template_parameters "cpp/container/forward list"), [std::list](container/list#Template_parameters "cpp/container/list"), and [std::vector](container/vector#Template_parameters "cpp/container/vector") | `201505L` | [`<forward_list>`](header/forward_list "cpp/header/forward list") [`<list>`](header/list "cpp/header/list") [`<vector>`](header/vector "cpp/header/vector") | (C++17) | | `__cpp_lib_int_pow2` | [Integral power-of-2 operations](numeric#Bit_manipulation_.28since_C.2B.2B20.29 "cpp/numeric") (`std::has_single_bit`, `std::bit_ceil`, `std::bit_floor`, `std::bit_width`) | `202002L` | [`<bit>`](header/bit "cpp/header/bit") | (C++20) | | `__cpp_lib_integer_``comparison_functions` | [Integer comparison functions](utility#Integer_comparison_functions "cpp/utility") | `202002L` | [`<utility>`](header/utility "cpp/header/utility") | (C++20) | | `__cpp_lib_integer_sequence` | [Compile-time integer sequences](utility/integer_sequence "cpp/utility/integer sequence") | `201304L` | [`<utility>`](header/utility "cpp/header/utility") | (C++14) | | `__cpp_lib_integral_``constant_callable` | `[std::integral\_constant::operator()](types/integral_constant "cpp/types/integral constant")` | `201304L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++14) | | `__cpp_lib_interpolate` | [`std::lerp`](numeric/lerp "cpp/numeric/lerp"), [`std::midpoint`](numeric/midpoint "cpp/numeric/midpoint") | `201902L` | [`<cmath>`](header/cmath "cpp/header/cmath") [`<numeric>`](header/numeric "cpp/header/numeric") | (C++20) | | `__cpp_lib_invoke` | `[std::invoke](utility/functional/invoke "cpp/utility/functional/invoke")` | `201411L` | [`<functional>`](header/functional "cpp/header/functional") | (C++17) | | `__cpp_lib_invoke_r` | [`std::invoke_r`](utility/functional/invoke "cpp/utility/functional/invoke") | `202106L` | [`<functional>`](header/functional "cpp/header/functional") | (C++23) | | `__cpp_lib_ios_noreplace` | Support [exclusive mode](io/ios_base/openmode "cpp/io/ios base/openmode") for fstreams | `202207L` | [`<ios>`](header/ios "cpp/header/ios") | (C++23) | | `__cpp_lib_is_aggregate` | [`std::is_aggregate`](types/is_aggregate "cpp/types/is aggregate") | `201703L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) | | `__cpp_lib_is_constant_``evaluated` | [`std::is_constant_evaluated`](types/is_constant_evaluated "cpp/types/is constant evaluated") | `201811L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_is_final` | `[std::is\_final](types/is_final "cpp/types/is final")` | `201402L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++14) | | `__cpp_lib_is_invocable` | `std::is_invocable`, `[std::invoke\_result](types/result_of "cpp/types/result of")` | `201703L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) | | `__cpp_lib_is_layout_compatible` | [`std::is_layout_compatible`](types/is_layout_compatible "cpp/types/is layout compatible") | `201907L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_is_nothrow_``convertible` | [`std::is_nothrow_convertible`](types/is_convertible "cpp/types/is convertible") | `201806L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_is_null_pointer` | `[std::is\_null\_pointer](types/is_null_pointer "cpp/types/is null pointer")` | `201309L` | [`<type_traits>`](header/type_traits "cpp/header/type 
traits") | (C++14) | | `__cpp_lib_is_pointer_``interconvertible` | Pointer-interconvertibility traits: [`std::is_pointer_interconvertible_with_class`](types/is_pointer_interconvertible_with_class "cpp/types/is pointer interconvertible with class"), [`std::is_pointer_interconvertible_base_of`](types/is_pointer_interconvertible_base_of "cpp/types/is pointer interconvertible base of") | `201907L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_is_scoped_enum` | [`std::is_scoped_enum`](types/is_scoped_enum "cpp/types/is scoped enum") | `202011L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++23) | | `__cpp_lib_is_swappable` | [[`nothrow`-]swappable traits](types/is_swappable "cpp/types/is swappable") | `201603L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) | | `__cpp_lib_jthread` | [Stop token](thread/stop_token "cpp/thread/stop token") and [joining thread](thread/jthread "cpp/thread/jthread") | `201911L` | [`<stop_token>`](header/stop_token "cpp/header/stop token") [`<thread>`](header/thread "cpp/header/thread") | (C++20) | | `__cpp_lib_latch` | [`std::latch`](thread/latch "cpp/thread/latch") | `201907L` | [`<latch>`](header/latch "cpp/header/latch") | (C++20) | | `__cpp_lib_launder` | Core Issue 1776: Replacement of class objects containing reference members (`[std::launder](utility/launder "cpp/utility/launder")`) | `201606L` | [`<new>`](header/new "cpp/header/new") | (C++17) | | `__cpp_lib_list_remove_``return_type` | Change the return type of the `[remove()](container/list/remove "cpp/container/list/remove")`, `[remove\_if()](container/list/remove "cpp/container/list/remove")` and `[unique()](container/list/unique "cpp/container/list/unique")` members of `[std::forward\_list](container/forward_list "cpp/container/forward list")` and `[std::list](container/list "cpp/container/list")` | `201806L` | [`<forward_list>`](header/forward_list "cpp/header/forward list") [`<list>`](header/list "cpp/header/list") | (C++20) | | `__cpp_lib_logical_traits` | [Logical operator type traits](types#Operations_on_traits "cpp/types") | `201510L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) | | `__cpp_lib_make_from_tuple` | `[std::make\_from\_tuple()](utility/make_from_tuple "cpp/utility/make from tuple")` | `201606L` | [`<tuple>`](header/tuple "cpp/header/tuple") | (C++17) | | `__cpp_lib_make_reverse_``iterator` | [`std::make_reverse_iterator`](iterator/make_reverse_iterator "cpp/iterator/make reverse iterator") | `201402L` | [`<iterator>`](header/iterator "cpp/header/iterator") | (C++14) | | `__cpp_lib_make_unique` | `[std::make\_unique](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique")` | `201304L` | [`<memory>`](header/memory "cpp/header/memory") | (C++14) | | `__cpp_lib_map_try_emplace` | `[std::map::try\_emplace](container/map/try_emplace "cpp/container/map/try emplace")`, `[std::map::insert\_or\_assign](container/map/insert_or_assign "cpp/container/map/insert or assign")` | `201411L` | [`<map>`](header/map "cpp/header/map") | (C++17) | | `__cpp_lib_math_constants` | [Mathematical constants](header/numbers "cpp/header/numbers") | `201907L` | [`<numbers>`](header/numbers "cpp/header/numbers") | (C++20) | | `__cpp_lib_math_special_``functions` | [Mathematical special functions for C++17](numeric/special_functions "cpp/numeric/special functions") | `201603L` | [`<cmath>`](header/cmath "cpp/header/cmath") | (C++17) | | `__cpp_lib_mdspan` | 
[`std::mdspan`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan&action=edit&redlink=1 "cpp/container/mdspan (page does not exist)") | `202207L` | [`<mdspan>`](header/mdspan "cpp/header/mdspan") | (C++23) | | `__cpp_lib_memory_resource` | `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` | `201603L` | [`<memory_resource>`](header/memory_resource "cpp/header/memory resource") | (C++17) | | `__cpp_lib_modules` | Standard library modules `std` and `std.compat` | `202207L` | | (C++23) | | `__cpp_lib_move_iterator_``concept` | Make `[std::move\_iterator<T\*>](iterator/move_iterator "cpp/iterator/move iterator")` a random access iterator | `202207L` | [`<iterator>`](header/iterator "cpp/header/iterator") | (C++23) | | `__cpp_lib_move_only_function` | `std::move_only_function` | `202110L` | [`<functional>`](header/functional "cpp/header/functional") | (C++23) | | `__cpp_lib_node_extract` | Splicing maps and sets (`[std::map::extract](container/map/extract "cpp/container/map/extract")`, `[std::map::merge](container/map/merge "cpp/container/map/merge")`, [`std::map::insert(node_type)`](container/map/insert "cpp/container/map/insert"), etc) | `201606L` | [`<map>`](header/map "cpp/header/map") [`<set>`](header/set "cpp/header/set") [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") | (C++17) | | `__cpp_lib_nonmember_``container_access` | `[std::size()](iterator/size "cpp/iterator/size")`, `[std::data()](iterator/data "cpp/iterator/data")` and `[std::empty()](iterator/empty "cpp/iterator/empty")` | `201411L` | [`<iterator>`](header/iterator "cpp/header/iterator") [`<array>`](header/array "cpp/header/array") [`<deque>`](header/deque "cpp/header/deque") [`<forward_list>`](header/forward_list "cpp/header/forward list") [`<list>`](header/list "cpp/header/list") [`<map>`](header/map "cpp/header/map") [`<regex>`](header/regex "cpp/header/regex") [`<set>`](header/set "cpp/header/set") [`<string>`](header/string "cpp/header/string") [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") [`<vector>`](header/vector "cpp/header/vector") | (C++17) | | `__cpp_lib_not_fn` | `[std::not\_fn()](utility/functional/not_fn "cpp/utility/functional/not fn")` | `201603L` | [`<functional>`](header/functional "cpp/header/functional") | (C++17) | | `__cpp_lib_null_iterators` | Null [LegacyForwardIterators](named_req/forwarditerator "cpp/named req/ForwardIterator") | `201304L` | [`<iterator>`](header/iterator "cpp/header/iterator") | (C++14) | | `__cpp_lib_optional` | `[std::optional](utility/optional "cpp/utility/optional")` | `201606L` | [`<optional>`](header/optional "cpp/header/optional") | (C++17) | | Fully `constexpr` `[std::optional](utility/optional "cpp/utility/optional")` | `202106L` | [`<optional>`](header/optional "cpp/header/optional") | (C++20)(DR) | | [Monadic operations](utility/optional#Monadic_operations "cpp/utility/optional") in `[std::optional](utility/optional "cpp/utility/optional")` | `202110L` | [`<optional>`](header/optional "cpp/header/optional") | (C++23) | | `__cpp_lib_out_ptr` | `std::out_ptr`, `std::inout_ptr` | `202106L` | [`<memory>`](header/memory "cpp/header/memory") | (C++23) | | `__cpp_lib_parallel_algorithm` | [Parallel algorithms](algorithm#Execution_policies "cpp/algorithm") | `201603L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") 
[`<numeric>`](header/numeric "cpp/header/numeric") | (C++17) | | `__cpp_lib_polymorphic_``allocator` | `[std::pmr::polymorphic\_allocator<>](memory/polymorphic_allocator "cpp/memory/polymorphic allocator")` as a vocabulary type | `201902L` | [`<memory_resource>`](header/memory_resource "cpp/header/memory resource") | (C++20) | | `__cpp_lib_print` | Formatted output | `202207L` | [`<ostream>`](header/ostream "cpp/header/ostream") [`<print>`](header/print "cpp/header/print") | (C++23) | | `__cpp_lib_quoted_string_io` | `[std::quoted](io/manip/quoted "cpp/io/manip/quoted")` | `201304L` | [`<iomanip>`](header/iomanip "cpp/header/iomanip") | (C++14) | | `__cpp_lib_ranges` | [Ranges library](ranges "cpp/ranges") and [constrained algorithms](algorithm/ranges "cpp/algorithm/ranges") | `201911L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") [`<functional>`](header/functional "cpp/header/functional") [`<iterator>`](header/iterator "cpp/header/iterator") [`<memory>`](header/memory "cpp/header/memory") [`<ranges>`](header/ranges "cpp/header/ranges") | (C++20) | | Non-[default-initializable](concepts/default_initializable "cpp/concepts/default initializable") [views](ranges/view "cpp/ranges/view") | `202106L` | (C++20)(DR) | | [Views](ranges/view "cpp/ranges/view") with [ownership](ranges/owning_view "cpp/ranges/owning view") | `202110L` | (C++20)(DR) | | `std::ranges::range_adaptor_closure` | `202202L` | (C++23) | | Relaxing [range adaptors](ranges "cpp/ranges") to allow for move only types | `202207L` | (C++23) | | `__cpp_lib_ranges_as_const` | `std::const_iterator`, [`std::ranges::as_const_view`](ranges/as_const_view "cpp/ranges/as const view") | `202207L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_as_rvalue` | [`std::ranges::as_rvalue_view`](ranges/as_rvalue_view "cpp/ranges/as rvalue view") | `202207L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_``cartesian_product` | [`std::ranges::cartesian_product_view`](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/cartesian_product_view&action=edit&redlink=1 "cpp/ranges/cartesian product view (page does not exist)") | `202207L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_chunk` | `std::ranges::chunk_view` | `202202L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_chunk_by` | `std::ranges::chunk_by_view` | `202202L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_contains` | [`std::ranges::contains`](algorithm/ranges/contains "cpp/algorithm/ranges/contains") | `202207L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++23) | | `__cpp_lib_ranges_fold` | `std::ranges` `fold` algorithms | `202207L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++23) | | `__cpp_lib_ranges_iota` | `std::ranges::iota` | `202202L` | [`<numeric>`](header/numeric "cpp/header/numeric") | (C++23) | | `__cpp_lib_ranges_join_with` | `std::ranges::join_with_view` | `202202L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_repeat` | [`std::ranges::repeat_view`](ranges/repeat_view "cpp/ranges/repeat view") | `202207L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_slide` | `std::ranges::slide_view` | `202202L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_``starts_ends_with` | `std::ranges::starts_with`, `std::ranges::ends_with` | `202106L` | 
[`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++23) | | `__cpp_lib_ranges_stride` | [`std::ranges::stride_view`](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/stride_view&action=edit&redlink=1 "cpp/ranges/stride view (page does not exist)") | `202207L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_to_container` | `std::ranges::to` | `202202L` | [`<ranges>`](header/ranges "cpp/header/ranges") | (C++23) | | `__cpp_lib_ranges_zip` | `std::ranges::zip_view`, `std::ranges::zip_transform_view`, `std::ranges::adjacent_view`, `std::ranges::adjacent_transform_view` | `202110L` | [`<ranges>`](header/ranges "cpp/header/ranges") [`<tuple>`](header/tuple "cpp/header/tuple") [`<utility>`](header/utility "cpp/header/utility") | (C++23) | | `__cpp_lib_raw_memory_``algorithms` | [Extending memory management tools](memory#Uninitialized_storage "cpp/memory") | `201606L` | [`<memory>`](header/memory "cpp/header/memory") | (C++17) | | `__cpp_lib_reference_``from_temporary` | `std::reference_constructs_from_temporary` and `std::reference_converts_from_temporary` | `202202L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++23) | | `__cpp_lib_remove_cvref` | [`std::remove_cvref`](types/remove_cvref "cpp/types/remove cvref") | `201711L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_result_of_sfinae` | `[std::result\_of](types/result_of "cpp/types/result of")` and [SFINAE](language/sfinae "cpp/language/sfinae") | `201210L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") [`<functional>`](header/functional "cpp/header/functional") | (C++14) | | `__cpp_lib_robust_``nonmodifying_seq_ops` | Making non-modifying sequence operations more robust (two-range overloads for `[std::mismatch](algorithm/mismatch "cpp/algorithm/mismatch")`, `[std::equal](algorithm/equal "cpp/algorithm/equal")` and `[std::is\_permutation](algorithm/is_permutation "cpp/algorithm/is permutation")`) | `201304L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++14) | | `__cpp_lib_sample` | `[std::sample](algorithm/sample "cpp/algorithm/sample")` | `201603L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++17) | | `__cpp_lib_scoped_lock` | [`std::scoped_lock`](thread/scoped_lock "cpp/thread/scoped lock") | `201703L` | [`<mutex>`](header/mutex "cpp/header/mutex") | (C++17) | | `__cpp_lib_semaphore` | [`std::counting_semaphore`, `std::binary_semaphore`](thread/counting_semaphore "cpp/thread/counting semaphore") | `201907L` | [`<semaphore>`](header/semaphore "cpp/header/semaphore") | (C++20) | | `__cpp_lib_shared_mutex` | `[std::shared\_mutex](thread/shared_mutex "cpp/thread/shared mutex")` (untimed) | `201505L` | [`<shared_mutex>`](header/shared_mutex "cpp/header/shared mutex") | (C++17) | | `__cpp_lib_shared_ptr_arrays` | `[std::shared\_ptr<T[]>](memory/shared_ptr "cpp/memory/shared ptr")` | `201611L` | [`<memory>`](header/memory "cpp/header/memory") | (C++17) | | Array support of `[std::make\_shared](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared")` | `201707L` | [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | `__cpp_lib_shared_ptr_weak_type` | [`shared_ptr::weak_type`](memory/shared_ptr "cpp/memory/shared ptr") | `201606L` | [`<memory>`](header/memory "cpp/header/memory") | (C++17) | | `__cpp_lib_shared_timed_mutex` | `[std::shared\_timed\_mutex](thread/shared_timed_mutex "cpp/thread/shared timed mutex")` | `201402L` | 
[`<shared_mutex>`](header/shared_mutex "cpp/header/shared mutex") | (C++14) | | `__cpp_lib_shift` | [`std::shift_left`](algorithm/shift "cpp/algorithm/shift") and [`std::shift_right`](algorithm/shift "cpp/algorithm/shift") | `201806L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++20) | | `std::ranges::shift_left` and `std::ranges::shift_right` | `202202L` | [`<algorithm>`](header/algorithm "cpp/header/algorithm") | (C++23) | | `__cpp_lib_smart_``ptr_for_overwrite` | Smart pointer creation with default initialization (`[std::allocate\_shared\_for\_overwrite](memory/shared_ptr/allocate_shared "cpp/memory/shared ptr/allocate shared")`, `[std::make\_shared\_for\_overwrite](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared")`, `[std::make\_unique\_for\_overwrite](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique")`) | `202002L` | [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | `__cpp_lib_source_location` | Source-code information capture (`[std::source\_location](utility/source_location "cpp/utility/source location")`) | `201907L` | [`<source_location>`](header/source_location "cpp/header/source location") | (C++20) | | `__cpp_lib_span` | [`std::span`](container/span "cpp/container/span") | `202002L` | [`<span>`](header/span "cpp/header/span") | (C++20) | | `__cpp_lib_spanstream` | [`std::spanbuf`](io/basic_spanbuf "cpp/io/basic spanbuf"), [`std::spanstream`](io/basic_spanstream "cpp/io/basic spanstream") | `202106L` | [`<spanstream>`](header/spanstream "cpp/header/spanstream") | (C++23) | | `__cpp_lib_ssize` | [`std::ssize`](iterator/size "cpp/iterator/size") and unsigned [`std::span::size`](container/span/size "cpp/container/span/size") | `201902L` | [`<iterator>`](header/iterator "cpp/header/iterator") | (C++20) | | `__cpp_lib_stacktrace` | [Stacktrace library](error#Stacktrace "cpp/error") | `202011L` | [`<stacktrace>`](header/stacktrace "cpp/header/stacktrace") | (C++23) | | `__cpp_lib_start_lifetime_as` | Explicit lifetime management (`std::start_lifetime_as`) | `202207L` | [`<memory>`](header/memory "cpp/header/memory") | (C++23) | | `__cpp_lib_starts_ends_with` | String prefix and suffix checking (`starts_with()` and `ends_with()` for `[std::string](string/basic_string "cpp/string/basic string")` and `[std::string\_view](string/basic_string_view "cpp/string/basic string view")`) | `201711L` | [`<string>`](header/string "cpp/header/string") [`<string_view>`](header/string_view "cpp/header/string view") | (C++20) | | `__cpp_lib_stdatomic_h` | Compatibility header for C atomic operations | `202011L` | [`<stdatomic.h>`](header/stdatomic.h "cpp/header/stdatomic.h") | (C++23) | | `__cpp_lib_string_contains` | `contains` functions of [`std::basic_string`](string/basic_string/contains "cpp/string/basic string/contains") and [`std::basic_string_view`](string/basic_string_view/contains "cpp/string/basic string view/contains") | `202011L` | [`<string>`](header/string "cpp/header/string") [`<string_view>`](header/string_view "cpp/header/string view") | (C++23) | | `__cpp_lib_string_``resize_and_overwrite` | `std::basic_string::resize_and_overwrite` | `202110L` | [`<string>`](header/string "cpp/header/string") | (C++23) | | `__cpp_lib_string_udls` | [User-defined literals for string types](string/basic_string/operator%22%22s "cpp/string/basic string/operator\"\"s") | `201304L` | [`<string>`](header/string "cpp/header/string") | (C++14) | | `__cpp_lib_string_view` | `[std::string\_view](string/basic_string_view "cpp/string/basic string 
view")` | `201606L` | [`<string>`](header/string "cpp/header/string") [`<string_view>`](header/string_view "cpp/header/string view") | (C++17) | | [ConstexprIterator](named_req/constexpriterator "cpp/named req/ConstexprIterator") | `201803L` | [`<string>`](header/string "cpp/header/string") [`<string_view>`](header/string_view "cpp/header/string view") | (C++20) | | `__cpp_lib_syncbuf` | Synchronized buffered ostream (`std::syncbuf`, `std::osyncstream`) and manipulators | `201803L` | [`<syncstream>`](header/syncstream "cpp/header/syncstream") | (C++20) | | `__cpp_lib_three_way_comparison` | [Three-way comparison](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") (library support); adding three-way comparison to the library | `201907L` | [`<compare>`](header/compare "cpp/header/compare") | (C++20) | | `__cpp_lib_to_address` | Utility to convert a pointer to a raw pointer (`std::to_address`) | `201711L` | [`<memory>`](header/memory "cpp/header/memory") | (C++20) | | `__cpp_lib_to_array` | [`std::to_array`](container/array/to_array "cpp/container/array/to array") | `201907L` | [`<array>`](header/array "cpp/header/array") | (C++20) | | `__cpp_lib_to_chars` | Elementary string conversions ([`std::to_chars`](utility/to_chars "cpp/utility/to chars"), [`std::from_chars`](utility/from_chars "cpp/utility/from chars")) | `201611L` | [`<charconv>`](header/charconv "cpp/header/charconv") | (C++17) | | `__cpp_lib_to_underlying` | [`std::to_underlying`](utility/to_underlying "cpp/utility/to underlying") | `202102L` | [`<utility>`](header/utility "cpp/header/utility") | (C++23) | | `__cpp_lib_transformation_``trait_aliases` | Alias templates for TransformationTraits | `201304L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++14) | | `__cpp_lib_transparent_``operators` | Transparent operator functors (`std::less<>` et al) | `201210L` | [`<functional>`](header/functional "cpp/header/functional") | (C++14) | | Transparent `[std::owner\_less](memory/owner_less "cpp/memory/owner less")` ([`std::owner_less<void>`](memory/owner_less_void "cpp/memory/owner less void")) | `201510L` | [`<memory>`](header/memory "cpp/header/memory") [`<functional>`](header/functional "cpp/header/functional") | (C++17) | | `__cpp_lib_tuple_element_t` | [`std::tuple_element_t`](utility/tuple_element#Helper_types "cpp/utility/tuple element") | `201402L` | [`<tuple>`](header/tuple "cpp/header/tuple") | (C++14) | | `__cpp_lib_tuple_like` | Compatibility between `[std::tuple](utility/tuple "cpp/utility/tuple")` and tuple-like objects (`[std::pair](utility/pair "cpp/utility/pair")`, `[std::array](container/array "cpp/container/array")`, `std::subrange`) | `202207L` | [`<map>`](header/map "cpp/header/map") [`<tuple>`](header/tuple "cpp/header/tuple") [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") [`<utility>`](header/utility "cpp/header/utility") | (C++23) | | `__cpp_lib_tuples_by_type` | [Addressing tuples by type](utility/tuple/get "cpp/utility/tuple/get") | `201304L` | [`<tuple>`](header/tuple "cpp/header/tuple") [`<utility>`](header/utility "cpp/header/utility") | (C++14) | | `__cpp_lib_type_identity` | `std::type_identity` | `201806L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_type_trait_``variable_templates` | Type traits variable templates (`[std::is\_void\_v](types/is_void "cpp/types/is void")`, etc) | `201510L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) | | 
`__cpp_lib_uncaught_exceptions` | [`std::uncaught_exceptions`](error/uncaught_exception "cpp/error/uncaught exception") | `201411L` | [`<exception>`](header/exception "cpp/header/exception") | (C++17) | | `__cpp_lib_unordered_``map_try_emplace` | `[std::unordered\_map::try\_emplace](container/unordered_map/try_emplace "cpp/container/unordered map/try emplace")`, `[std::unordered\_map::insert\_or\_assign](container/unordered_map/insert_or_assign "cpp/container/unordered map/insert or assign")` | `201411L` | [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") | (C++17) | | `__cpp_lib_unreachable` | `std::unreachable` | `202202L` | [`<utility>`](header/utility "cpp/header/utility") | (C++23) | | `__cpp_lib_unwrap_ref` | `[std::unwrap\_ref\_decay](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference")` and `[std::unwrap\_reference](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference")` | `201811L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++20) | | `__cpp_lib_variant` | `[std::variant](utility/variant "cpp/utility/variant")`: a type-safe union for C++17 | `201606L` | [`<variant>`](header/variant "cpp/header/variant") | (C++17) | | `[std::visit](utility/variant/visit "cpp/utility/variant/visit")` for classes derived from `[std::variant](utility/variant "cpp/utility/variant")` | `202102L` | [`<variant>`](header/variant "cpp/header/variant") | (C++17)(DR) | | Fully `constexpr` `[std::variant](utility/variant "cpp/utility/variant")` | `202106L` | [`<variant>`](header/variant "cpp/header/variant") | (C++20)(DR) | | `__cpp_lib_void_t` | `[std::void\_t](types/void_t "cpp/types/void t")` | `201411L` | [`<type_traits>`](header/type_traits "cpp/header/type traits") | (C++17) |

### Example

#### Normal usage

```
#ifdef __has_include                           // Check if __has_include is present
# if __has_include(<optional>)                 // Check for a standard library
#  include <optional>
# elif __has_include(<experimental/optional>)  // Check for an experimental version
#  include <experimental/optional>
# elif __has_include(<boost/optional.hpp>)     // Try with an external library
#  include <boost/optional.hpp>
# else                                         // Not found at all
#  error "Missing <optional>"
# endif
#endif

#ifdef __has_cpp_attribute                     // Check if __has_cpp_attribute is present
# if __has_cpp_attribute(deprecated)           // Check for an attribute
#  define DEPRECATED(msg) [[deprecated(msg)]]
# endif
#endif
#ifndef DEPRECATED
# define DEPRECATED(msg)
#endif

DEPRECATED("foo() has been deprecated") void foo();

#if __cpp_constexpr >= 201304                  // Check for a specific version of a feature
# define CONSTEXPR constexpr
#else
# define CONSTEXPR inline
#endif

CONSTEXPR int bar(unsigned i)
{
#if __cpp_binary_literals                      // Check for the presence of a feature
    unsigned mask1 = 0b11000000;
    unsigned mask2 = 0b00000111;
#else
    unsigned mask1 = 0xC0;
    unsigned mask2 = 0x07;
#endif
    if ( i & mask1 ) return 1;
    if ( i & mask2 ) return 2;
    return 0;
}

int main() { }
```

#### Compiler Features Dump

The following program dumps C++ compiler features and attributes.

``` // Change these options to print out only necessary info. 
static struct PrintOptions { constexpr static bool titles = 1; constexpr static bool counters = 1; constexpr static bool attributes = 1; constexpr static bool general_features = 1; constexpr static bool core_features = 1; constexpr static bool lib_features = 1; constexpr static bool supported_features = 1; constexpr static bool unsupported_features = 1; constexpr static bool sort_by_date = 0; constexpr static bool separate_year_month = 0; constexpr static bool cxx11 = 1; constexpr static bool cxx14 = 1; constexpr static bool cxx17 = 1; constexpr static bool cxx20 = 1; constexpr static bool cxx23 = 1; constexpr static bool cxx26 = 0; } print; #if __cplusplus < 201100 # error "C++11 or better is required" #endif #include <algorithm> #include <cstring> #include <iomanip> #include <iostream> #include <string> #ifdef __has_include # if __has_include(<version>) # include <version> # endif #endif #define COMPILER_FEATURE_VALUE(value) #value #define COMPILER_FEATURE_ENTRY(name) { #name, COMPILER_FEATURE_VALUE(name) }, #ifdef __has_cpp_attribute # define COMPILER_ATTRIBUTE_VALUE_AS_STRING(s) #s # define COMPILER_ATTRIBUTE_AS_NUMBER(x) COMPILER_ATTRIBUTE_VALUE_AS_STRING(x) # define COMPILER_ATTRIBUTE_ENTRY(attr) \ { #attr, COMPILER_ATTRIBUTE_AS_NUMBER(__has_cpp_attribute(attr)) }, #else # define COMPILER_ATTRIBUTE_ENTRY(attr) { #attr, "_" }, #endif struct CompilerFeature { CompilerFeature(const char* name = nullptr, const char* value = nullptr) : name(name), value(value) {} const char* name; const char* value; }; static CompilerFeature cxx_core[] = { COMPILER_FEATURE_ENTRY(__cplusplus) COMPILER_FEATURE_ENTRY(__cpp_exceptions) COMPILER_FEATURE_ENTRY(__cpp_rtti) #if 0 COMPILER_FEATURE_ENTRY(__GNUC__) COMPILER_FEATURE_ENTRY(__GNUC_MINOR__) COMPILER_FEATURE_ENTRY(__GNUC_PATCHLEVEL__) COMPILER_FEATURE_ENTRY(__GNUG__) COMPILER_FEATURE_ENTRY(__clang__) COMPILER_FEATURE_ENTRY(__clang_major__) COMPILER_FEATURE_ENTRY(__clang_minor__) COMPILER_FEATURE_ENTRY(__clang_patchlevel__) #endif }; static CompilerFeature cxx11_core[] = { COMPILER_FEATURE_ENTRY(__cpp_alias_templates) COMPILER_FEATURE_ENTRY(__cpp_attributes) COMPILER_FEATURE_ENTRY(__cpp_constexpr) COMPILER_FEATURE_ENTRY(__cpp_decltype) COMPILER_FEATURE_ENTRY(__cpp_delegating_constructors) COMPILER_FEATURE_ENTRY(__cpp_inheriting_constructors) COMPILER_FEATURE_ENTRY(__cpp_initializer_lists) COMPILER_FEATURE_ENTRY(__cpp_lambdas) COMPILER_FEATURE_ENTRY(__cpp_nsdmi) COMPILER_FEATURE_ENTRY(__cpp_range_based_for) COMPILER_FEATURE_ENTRY(__cpp_raw_strings) COMPILER_FEATURE_ENTRY(__cpp_ref_qualifiers) COMPILER_FEATURE_ENTRY(__cpp_rvalue_references) COMPILER_FEATURE_ENTRY(__cpp_static_assert) COMPILER_FEATURE_ENTRY(__cpp_threadsafe_static_init) COMPILER_FEATURE_ENTRY(__cpp_unicode_characters) COMPILER_FEATURE_ENTRY(__cpp_unicode_literals) COMPILER_FEATURE_ENTRY(__cpp_user_defined_literals) COMPILER_FEATURE_ENTRY(__cpp_variadic_templates) }; static CompilerFeature cxx14_core[] = { COMPILER_FEATURE_ENTRY(__cpp_aggregate_nsdmi) COMPILER_FEATURE_ENTRY(__cpp_binary_literals) COMPILER_FEATURE_ENTRY(__cpp_constexpr) COMPILER_FEATURE_ENTRY(__cpp_decltype_auto) COMPILER_FEATURE_ENTRY(__cpp_generic_lambdas) COMPILER_FEATURE_ENTRY(__cpp_init_captures) COMPILER_FEATURE_ENTRY(__cpp_return_type_deduction) COMPILER_FEATURE_ENTRY(__cpp_sized_deallocation) COMPILER_FEATURE_ENTRY(__cpp_variable_templates) }; static CompilerFeature cxx14_lib[] = { COMPILER_FEATURE_ENTRY(__cpp_lib_chrono_udls) COMPILER_FEATURE_ENTRY(__cpp_lib_complex_udls) 
COMPILER_FEATURE_ENTRY(__cpp_lib_exchange_function) COMPILER_FEATURE_ENTRY(__cpp_lib_generic_associative_lookup) COMPILER_FEATURE_ENTRY(__cpp_lib_integer_sequence) COMPILER_FEATURE_ENTRY(__cpp_lib_integral_constant_callable) COMPILER_FEATURE_ENTRY(__cpp_lib_is_final) COMPILER_FEATURE_ENTRY(__cpp_lib_is_null_pointer) COMPILER_FEATURE_ENTRY(__cpp_lib_make_reverse_iterator) COMPILER_FEATURE_ENTRY(__cpp_lib_make_unique) COMPILER_FEATURE_ENTRY(__cpp_lib_null_iterators) COMPILER_FEATURE_ENTRY(__cpp_lib_quoted_string_io) COMPILER_FEATURE_ENTRY(__cpp_lib_result_of_sfinae) COMPILER_FEATURE_ENTRY(__cpp_lib_robust_nonmodifying_seq_ops) COMPILER_FEATURE_ENTRY(__cpp_lib_shared_timed_mutex) COMPILER_FEATURE_ENTRY(__cpp_lib_string_udls) COMPILER_FEATURE_ENTRY(__cpp_lib_transformation_trait_aliases) COMPILER_FEATURE_ENTRY(__cpp_lib_transparent_operators) COMPILER_FEATURE_ENTRY(__cpp_lib_tuple_element_t) COMPILER_FEATURE_ENTRY(__cpp_lib_tuples_by_type) }; static CompilerFeature cxx17_core[] = { COMPILER_FEATURE_ENTRY(__cpp_aggregate_bases) COMPILER_FEATURE_ENTRY(__cpp_aligned_new) COMPILER_FEATURE_ENTRY(__cpp_capture_star_this) COMPILER_FEATURE_ENTRY(__cpp_constexpr) COMPILER_FEATURE_ENTRY(__cpp_deduction_guides) COMPILER_FEATURE_ENTRY(__cpp_enumerator_attributes) COMPILER_FEATURE_ENTRY(__cpp_fold_expressions) COMPILER_FEATURE_ENTRY(__cpp_guaranteed_copy_elision) COMPILER_FEATURE_ENTRY(__cpp_hex_float) COMPILER_FEATURE_ENTRY(__cpp_if_constexpr) COMPILER_FEATURE_ENTRY(__cpp_inheriting_constructors) COMPILER_FEATURE_ENTRY(__cpp_inline_variables) COMPILER_FEATURE_ENTRY(__cpp_namespace_attributes) COMPILER_FEATURE_ENTRY(__cpp_noexcept_function_type) COMPILER_FEATURE_ENTRY(__cpp_nontype_template_args) COMPILER_FEATURE_ENTRY(__cpp_nontype_template_parameter_auto) COMPILER_FEATURE_ENTRY(__cpp_range_based_for) COMPILER_FEATURE_ENTRY(__cpp_static_assert) COMPILER_FEATURE_ENTRY(__cpp_structured_bindings) COMPILER_FEATURE_ENTRY(__cpp_template_template_args) COMPILER_FEATURE_ENTRY(__cpp_variadic_using) }; static CompilerFeature cxx17_lib[] = { COMPILER_FEATURE_ENTRY(__cpp_lib_addressof_constexpr) COMPILER_FEATURE_ENTRY(__cpp_lib_allocator_traits_is_always_equal) COMPILER_FEATURE_ENTRY(__cpp_lib_any) COMPILER_FEATURE_ENTRY(__cpp_lib_apply) COMPILER_FEATURE_ENTRY(__cpp_lib_array_constexpr) COMPILER_FEATURE_ENTRY(__cpp_lib_as_const) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_is_always_lock_free) COMPILER_FEATURE_ENTRY(__cpp_lib_bool_constant) COMPILER_FEATURE_ENTRY(__cpp_lib_boyer_moore_searcher) COMPILER_FEATURE_ENTRY(__cpp_lib_byte) COMPILER_FEATURE_ENTRY(__cpp_lib_chrono) COMPILER_FEATURE_ENTRY(__cpp_lib_clamp) COMPILER_FEATURE_ENTRY(__cpp_lib_enable_shared_from_this) COMPILER_FEATURE_ENTRY(__cpp_lib_execution) COMPILER_FEATURE_ENTRY(__cpp_lib_filesystem) COMPILER_FEATURE_ENTRY(__cpp_lib_gcd_lcm) COMPILER_FEATURE_ENTRY(__cpp_lib_hardware_interference_size) COMPILER_FEATURE_ENTRY(__cpp_lib_has_unique_object_representations) COMPILER_FEATURE_ENTRY(__cpp_lib_hypot) COMPILER_FEATURE_ENTRY(__cpp_lib_incomplete_container_elements) COMPILER_FEATURE_ENTRY(__cpp_lib_invoke) COMPILER_FEATURE_ENTRY(__cpp_lib_is_aggregate) COMPILER_FEATURE_ENTRY(__cpp_lib_is_invocable) COMPILER_FEATURE_ENTRY(__cpp_lib_is_swappable) COMPILER_FEATURE_ENTRY(__cpp_lib_launder) COMPILER_FEATURE_ENTRY(__cpp_lib_logical_traits) COMPILER_FEATURE_ENTRY(__cpp_lib_make_from_tuple) COMPILER_FEATURE_ENTRY(__cpp_lib_map_try_emplace) COMPILER_FEATURE_ENTRY(__cpp_lib_math_special_functions) COMPILER_FEATURE_ENTRY(__cpp_lib_memory_resource) 
COMPILER_FEATURE_ENTRY(__cpp_lib_node_extract) COMPILER_FEATURE_ENTRY(__cpp_lib_nonmember_container_access) COMPILER_FEATURE_ENTRY(__cpp_lib_not_fn) COMPILER_FEATURE_ENTRY(__cpp_lib_optional) COMPILER_FEATURE_ENTRY(__cpp_lib_parallel_algorithm) COMPILER_FEATURE_ENTRY(__cpp_lib_raw_memory_algorithms) COMPILER_FEATURE_ENTRY(__cpp_lib_sample) COMPILER_FEATURE_ENTRY(__cpp_lib_scoped_lock) COMPILER_FEATURE_ENTRY(__cpp_lib_shared_mutex) COMPILER_FEATURE_ENTRY(__cpp_lib_shared_ptr_arrays) COMPILER_FEATURE_ENTRY(__cpp_lib_shared_ptr_weak_type) COMPILER_FEATURE_ENTRY(__cpp_lib_string_view) COMPILER_FEATURE_ENTRY(__cpp_lib_to_chars) COMPILER_FEATURE_ENTRY(__cpp_lib_transparent_operators) COMPILER_FEATURE_ENTRY(__cpp_lib_type_trait_variable_templates) COMPILER_FEATURE_ENTRY(__cpp_lib_uncaught_exceptions) COMPILER_FEATURE_ENTRY(__cpp_lib_unordered_map_try_emplace) COMPILER_FEATURE_ENTRY(__cpp_lib_variant) COMPILER_FEATURE_ENTRY(__cpp_lib_void_t) }; static CompilerFeature cxx20_core[] = { COMPILER_FEATURE_ENTRY(__cpp_aggregate_paren_init) COMPILER_FEATURE_ENTRY(__cpp_char8_t) COMPILER_FEATURE_ENTRY(__cpp_concepts) COMPILER_FEATURE_ENTRY(__cpp_conditional_explicit) COMPILER_FEATURE_ENTRY(__cpp_consteval) COMPILER_FEATURE_ENTRY(__cpp_constexpr) COMPILER_FEATURE_ENTRY(__cpp_constexpr_dynamic_alloc) COMPILER_FEATURE_ENTRY(__cpp_constexpr_in_decltype) COMPILER_FEATURE_ENTRY(__cpp_constinit) COMPILER_FEATURE_ENTRY(__cpp_deduction_guides) COMPILER_FEATURE_ENTRY(__cpp_designated_initializers) COMPILER_FEATURE_ENTRY(__cpp_generic_lambdas) COMPILER_FEATURE_ENTRY(__cpp_impl_coroutine) COMPILER_FEATURE_ENTRY(__cpp_impl_destroying_delete) COMPILER_FEATURE_ENTRY(__cpp_impl_three_way_comparison) COMPILER_FEATURE_ENTRY(__cpp_init_captures) COMPILER_FEATURE_ENTRY(__cpp_modules) COMPILER_FEATURE_ENTRY(__cpp_nontype_template_args) COMPILER_FEATURE_ENTRY(__cpp_using_enum) }; static CompilerFeature cxx20_lib[] = { COMPILER_FEATURE_ENTRY(__cpp_lib_array_constexpr) COMPILER_FEATURE_ENTRY(__cpp_lib_assume_aligned) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_flag_test) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_float) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_lock_free_type_aliases) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_ref) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_shared_ptr) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_value_initialization) COMPILER_FEATURE_ENTRY(__cpp_lib_atomic_wait) COMPILER_FEATURE_ENTRY(__cpp_lib_barrier) COMPILER_FEATURE_ENTRY(__cpp_lib_bind_front) COMPILER_FEATURE_ENTRY(__cpp_lib_bit_cast) COMPILER_FEATURE_ENTRY(__cpp_lib_bitops) COMPILER_FEATURE_ENTRY(__cpp_lib_bounded_array_traits) COMPILER_FEATURE_ENTRY(__cpp_lib_char8_t) COMPILER_FEATURE_ENTRY(__cpp_lib_chrono) COMPILER_FEATURE_ENTRY(__cpp_lib_concepts) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_algorithms) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_complex) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_dynamic_alloc) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_functional) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_iterator) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_memory) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_numeric) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_string) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_string_view) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_tuple) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_utility) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_vector) COMPILER_FEATURE_ENTRY(__cpp_lib_coroutine) COMPILER_FEATURE_ENTRY(__cpp_lib_destroying_delete) COMPILER_FEATURE_ENTRY(__cpp_lib_endian) COMPILER_FEATURE_ENTRY(__cpp_lib_erase_if) 
COMPILER_FEATURE_ENTRY(__cpp_lib_execution) COMPILER_FEATURE_ENTRY(__cpp_lib_format) COMPILER_FEATURE_ENTRY(__cpp_lib_generic_unordered_lookup) COMPILER_FEATURE_ENTRY(__cpp_lib_int_pow2) COMPILER_FEATURE_ENTRY(__cpp_lib_integer_comparison_functions) COMPILER_FEATURE_ENTRY(__cpp_lib_interpolate) COMPILER_FEATURE_ENTRY(__cpp_lib_is_constant_evaluated) COMPILER_FEATURE_ENTRY(__cpp_lib_is_layout_compatible) COMPILER_FEATURE_ENTRY(__cpp_lib_is_nothrow_convertible) COMPILER_FEATURE_ENTRY(__cpp_lib_is_pointer_interconvertible) COMPILER_FEATURE_ENTRY(__cpp_lib_jthread) COMPILER_FEATURE_ENTRY(__cpp_lib_latch) COMPILER_FEATURE_ENTRY(__cpp_lib_list_remove_return_type) COMPILER_FEATURE_ENTRY(__cpp_lib_math_constants) COMPILER_FEATURE_ENTRY(__cpp_lib_polymorphic_allocator) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges) COMPILER_FEATURE_ENTRY(__cpp_lib_remove_cvref) COMPILER_FEATURE_ENTRY(__cpp_lib_semaphore) COMPILER_FEATURE_ENTRY(__cpp_lib_shared_ptr_arrays) COMPILER_FEATURE_ENTRY(__cpp_lib_shift) COMPILER_FEATURE_ENTRY(__cpp_lib_smart_ptr_for_overwrite) COMPILER_FEATURE_ENTRY(__cpp_lib_source_location) COMPILER_FEATURE_ENTRY(__cpp_lib_span) COMPILER_FEATURE_ENTRY(__cpp_lib_ssize) COMPILER_FEATURE_ENTRY(__cpp_lib_starts_ends_with) COMPILER_FEATURE_ENTRY(__cpp_lib_string_view) COMPILER_FEATURE_ENTRY(__cpp_lib_syncbuf) COMPILER_FEATURE_ENTRY(__cpp_lib_three_way_comparison) COMPILER_FEATURE_ENTRY(__cpp_lib_to_address) COMPILER_FEATURE_ENTRY(__cpp_lib_to_array) COMPILER_FEATURE_ENTRY(__cpp_lib_type_identity) COMPILER_FEATURE_ENTRY(__cpp_lib_unwrap_ref) }; static CompilerFeature cxx23_core[] = { //< Continue to Populate COMPILER_FEATURE_ENTRY(__cpp_char8_t) COMPILER_FEATURE_ENTRY(__cpp_concepts) COMPILER_FEATURE_ENTRY(__cpp_constexpr) COMPILER_FEATURE_ENTRY(__cpp_explicit_this_parameter) COMPILER_FEATURE_ENTRY(__cpp_if_consteval) COMPILER_FEATURE_ENTRY(__cpp_implicit_move) COMPILER_FEATURE_ENTRY(__cpp_multidimensional_subscript) COMPILER_FEATURE_ENTRY(__cpp_named_character_escapes) COMPILER_FEATURE_ENTRY(__cpp_size_t_suffix) COMPILER_FEATURE_ENTRY(__cpp_static_call_operator) }; static CompilerFeature cxx23_lib[] = { //< Continue to Populate COMPILER_FEATURE_ENTRY(__cpp_lib_adaptor_iterator_pair_constructor) COMPILER_FEATURE_ENTRY(__cpp_lib_algorithm_iterator_requirements) COMPILER_FEATURE_ENTRY(__cpp_lib_allocate_at_least) COMPILER_FEATURE_ENTRY(__cpp_lib_associative_heterogeneous_erasure) COMPILER_FEATURE_ENTRY(__cpp_lib_bind_back) COMPILER_FEATURE_ENTRY(__cpp_lib_byteswap) COMPILER_FEATURE_ENTRY(__cpp_lib_concepts) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_bitset) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_charconv) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_cmath) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_memory) COMPILER_FEATURE_ENTRY(__cpp_lib_constexpr_typeinfo) COMPILER_FEATURE_ENTRY(__cpp_lib_containers_ranges) COMPILER_FEATURE_ENTRY(__cpp_lib_expected) COMPILER_FEATURE_ENTRY(__cpp_lib_find_last) COMPILER_FEATURE_ENTRY(__cpp_lib_flat_map) COMPILER_FEATURE_ENTRY(__cpp_lib_flat_set) COMPILER_FEATURE_ENTRY(__cpp_lib_format) COMPILER_FEATURE_ENTRY(__cpp_lib_format_ranges) COMPILER_FEATURE_ENTRY(__cpp_lib_forward_like) COMPILER_FEATURE_ENTRY(__cpp_lib_generator) COMPILER_FEATURE_ENTRY(__cpp_lib_invoke_r) COMPILER_FEATURE_ENTRY(__cpp_lib_ios_noreplace) COMPILER_FEATURE_ENTRY(__cpp_lib_is_scoped_enum) COMPILER_FEATURE_ENTRY(__cpp_lib_mdspan) COMPILER_FEATURE_ENTRY(__cpp_lib_modules) COMPILER_FEATURE_ENTRY(__cpp_lib_move_iterator_concept) COMPILER_FEATURE_ENTRY(__cpp_lib_move_only_function) 
COMPILER_FEATURE_ENTRY(__cpp_lib_optional) COMPILER_FEATURE_ENTRY(__cpp_lib_out_ptr) COMPILER_FEATURE_ENTRY(__cpp_lib_print) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_as_const) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_as_rvalue) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_cartesian_product) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_chunk) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_chunk_by) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_contains) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_fold) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_iota) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_join_with) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_repeat) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_slide) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_starts_ends_with) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_stride) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_to_container) COMPILER_FEATURE_ENTRY(__cpp_lib_ranges_zip) COMPILER_FEATURE_ENTRY(__cpp_lib_reference_from_temporary) COMPILER_FEATURE_ENTRY(__cpp_lib_shift) COMPILER_FEATURE_ENTRY(__cpp_lib_spanstream) COMPILER_FEATURE_ENTRY(__cpp_lib_stacktrace) COMPILER_FEATURE_ENTRY(__cpp_lib_start_lifetime_as) COMPILER_FEATURE_ENTRY(__cpp_lib_stdatomic_h) COMPILER_FEATURE_ENTRY(__cpp_lib_string_contains) COMPILER_FEATURE_ENTRY(__cpp_lib_string_resize_and_overwrite) COMPILER_FEATURE_ENTRY(__cpp_lib_to_underlying) COMPILER_FEATURE_ENTRY(__cpp_lib_tuple_like) COMPILER_FEATURE_ENTRY(__cpp_lib_unreachable) COMPILER_FEATURE_ENTRY(__cpp_lib_variant) }; static CompilerFeature cxx26_core[] = { //< Populate COMPILER_FEATURE_ENTRY(__cpp_core_TODO) }; static CompilerFeature cxx26_lib[] = { //< Populate COMPILER_FEATURE_ENTRY(__cpp_lib_TODO) }; static CompilerFeature attributes[] = { COMPILER_ATTRIBUTE_ENTRY(assume) COMPILER_ATTRIBUTE_ENTRY(carries_dependency) COMPILER_ATTRIBUTE_ENTRY(deprecated) COMPILER_ATTRIBUTE_ENTRY(fallthrough) COMPILER_ATTRIBUTE_ENTRY(likely) COMPILER_ATTRIBUTE_ENTRY(maybe_unused) COMPILER_ATTRIBUTE_ENTRY(nodiscard) COMPILER_ATTRIBUTE_ENTRY(noreturn) COMPILER_ATTRIBUTE_ENTRY(no_unique_address) COMPILER_ATTRIBUTE_ENTRY(unlikely) }; constexpr bool is_feature_supported(const CompilerFeature& x) { return x.value[0] != '_' && x.value[0] != '0'; } inline void print_compiler_feature(const CompilerFeature& x) { constexpr static int max_name_length = 44; //< Update if necessary std::string value{ is_feature_supported(x) ? 
x.value : "------" }; if (value.back() == 'L') value.pop_back(); //~ 201603L -> 201603 if (print.separate_year_month) value.insert(4, 1, '-'); //~ 201603 -> 2016-03 if ( (print.supported_features && is_feature_supported(x)) or (print.unsupported_features && !is_feature_supported(x))) { std::cout << std::left << std::setw(max_name_length) << x.name << " " << value << '\n'; } } template<size_t N> inline void show(char const* title, CompilerFeature (&features)[N]) { if (print.titles) { std::cout << std::left << title << " ("; if (print.counters) std::cout << std::count_if( std::begin(features), std::end(features), is_feature_supported) << '/'; std::cout << N << ")\n"; } if (print.sort_by_date) { std::sort(std::begin(features), std::end(features), [](CompilerFeature const& lhs, CompilerFeature const& rhs) { return std::strcmp(lhs.value, rhs.value) < 0; }); } for (const CompilerFeature& x : features) { print_compiler_feature(x); } std::cout << '\n'; } int main() { if (print.general_features) show("C++ GENERAL", cxx_core); if (print.cxx11 && print.core_features) show("C++11 CORE", cxx11_core); if (print.cxx14 && print.core_features) show("C++14 CORE", cxx14_core); if (print.cxx14 && print.lib_features ) show("C++14 LIB" , cxx14_lib); if (print.cxx17 && print.core_features) show("C++17 CORE", cxx17_core); if (print.cxx17 && print.lib_features ) show("C++17 LIB" , cxx17_lib); if (print.cxx20 && print.core_features) show("C++20 CORE", cxx20_core); if (print.cxx20 && print.lib_features ) show("C++20 LIB" , cxx20_lib); if (print.cxx23 && print.core_features) show("C++23 CORE", cxx23_core); if (print.cxx23 && print.lib_features ) show("C++23 LIB" , cxx23_lib); if (print.cxx26 && print.core_features) show("C++26 CORE", cxx26_core); if (print.cxx26 && print.lib_features ) show("C++26 LIB" , cxx26_lib); if (print.attributes) show("ATTRIBUTES", attributes); } ``` Possible output: ``` C++ GENERAL (3/3) __cplusplus 202002 __cpp_exceptions 199711 __cpp_rtti 199711 C++11 CORE (19/19) __cpp_alias_templates 200704 __cpp_attributes 200809 __cpp_constexpr 201907 __cpp_decltype 200707 __cpp_delegating_constructors 200604 __cpp_inheriting_constructors 201511 __cpp_initializer_lists 200806 __cpp_lambdas 200907 __cpp_nsdmi 200809 __cpp_range_based_for 201603 __cpp_raw_strings 200710 __cpp_ref_qualifiers 200710 __cpp_rvalue_references 200610 __cpp_static_assert 201411 __cpp_threadsafe_static_init 200806 __cpp_unicode_characters 200704 __cpp_unicode_literals 200710 __cpp_user_defined_literals 200809 __cpp_variadic_templates 200704 C++14 CORE (9/9) __cpp_aggregate_nsdmi 201304 __cpp_binary_literals 201304 __cpp_constexpr 201907 __cpp_decltype_auto 201304 __cpp_generic_lambdas 201707 __cpp_init_captures 201803 __cpp_return_type_deduction 201304 __cpp_sized_deallocation ------ __cpp_variable_templates 201304 ... truncated ... ``` ### See also | | | | --- | --- | | [**Library feature-test macros**](utility/feature_test "cpp/utility/feature test") (C++20) | defined in the header [`<version>`](header/version "cpp/header/version") | | [C++ documentation](symbol_index/macro "cpp/symbol index/macro") for Macro Symbol Index | ### External links | | | --- | | [The official document on Feature-Test Recommendations](http://isocpp.org/std/standing-documents/sd-6-sg10-feature-test-recommendations) | | [Source code to dump compiler features](https://github.com/makelinux/examples/blob/develop/cpp/features.cpp) |
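In addition to the core-language checks shown in the examples above, the library feature-test macros listed in the table are typically consulted through [`<version>`](header/version "cpp/header/version") (or the owning header). The following is a minimal sketch of that pattern using `__cpp_lib_string_contains` from the table; the helper name `has_substring` is purely illustrative, not part of any standard API.

```
#include <string_view>
#if defined(__has_include)
# if __has_include(<version>)
#  include <version>  // single header that defines the library feature-test macros
# endif
#endif

// Prefer std::string_view::contains() when the library advertises it via
// __cpp_lib_string_contains (202011L, C++23); otherwise fall back to find().
constexpr bool has_substring(std::string_view haystack, std::string_view needle)
{
#if defined(__cpp_lib_string_contains) && __cpp_lib_string_contains >= 202011L
    return haystack.contains(needle);
#else
    return haystack.find(needle) != std::string_view::npos;
#endif
}

int main()
{
    return has_substring("feature-test macros", "macros") ? 0 : 1;
}
```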
cpp Freestanding and hosted implementations Freestanding and hosted implementations ======================================= There are two kinds of implementations defined by the C++ standard: ***hosted*** and ***freestanding*** implementations. For *hosted* implementations the set of standard library headers required by the C++ standard is much larger than for *freestanding* ones. In a *freestanding* implementation execution may happen without an operating system. The kind of the implementation is implementation-defined. The macro `__STDC_HOSTED__` is predefined to `1` for hosted implementations and `0` for freestanding implementations. (since C++11). | | | | | | | | --- | --- | --- | --- | --- | --- | | Requirements on [multi-threaded executions and data races](language/memory_model "cpp/language/memory model") | *freestanding* | *hosted* | | --- | --- | | Under a *freestanding* implementation, it is implementation-defined whether a program can have more than one [thread of execution](thread "cpp/thread"). | Under a *hosted* implementation, a C++ program can have more than one [thread](thread "cpp/thread") running concurrently. | | (since C++11) | ### Requirements on the [`main`](language/main_function "cpp/language/main function") function | *freestanding* | *hosted* | | --- | --- | | In a *freestanding* implementation, it is implementation-defined whether a program is required to define a [`main`](language/main_function "cpp/language/main function") function. Start-up and termination is implementation-defined; start-up contains the execution of [constructors](language/constructor "cpp/language/constructor") for objects of [namespace scope](language/scope#Namespace_scope "cpp/language/scope") with static storage duration; termination contains the execution of [destructors](language/destructor "cpp/language/destructor") for objects with static [storage duration](language/storage_duration "cpp/language/storage duration"). | In a *hosted* implementation, a program must contain a global function called [`main`](language/main_function "cpp/language/main function"). Executing a program starts a main [thread of execution](thread "cpp/thread") in which the `main` function is invoked, and in which variables of static [storage duration](language/storage_duration "cpp/language/storage duration") might be initialized and destroyed. | ### Requirements on [standard library headers](index "cpp/header") A *freestanding* implementation has an implementation-defined set of headers. 
This set includes at least the headers in the following table: Headers required for a *freestanding* implementation | Types | [`<cstddef>`](header/cstddef "cpp/header/cstddef") | | Implementation properties | [`<limits>`](header/limits "cpp/header/limits")[`<cfloat>`](header/cfloat "cpp/header/cfloat") [`<climits>`](header/climits "cpp/header/climits") (since C++11)[`<version>`](header/version "cpp/header/version") (since C++20) | | Integer types | [`<cstdint>`](header/cstdint "cpp/header/cstdint") (since C++11) | | Start and termination | [`<cstdlib>`](header/cstdlib "cpp/header/cstdlib") (partial)[[1]](#cite_note-1) | | Dynamic memory management | [`<new>`](header/new "cpp/header/new") | | Type identification | [`<typeinfo>`](header/typeinfo "cpp/header/typeinfo") | | Source location | [`<source_location>`](header/source_location "cpp/header/source location") (since C++20) | | Exception handling | [`<exception>`](header/exception "cpp/header/exception") | | Initializer lists | [`<initializer_list>`](header/initializer_list "cpp/header/initializer list") (since C++11) | | Comparisons | [`<compare>`](header/compare "cpp/header/compare") (since C++20) | | Coroutines support | [`<coroutine>`](header/coroutine "cpp/header/coroutine") (since C++20) | | Other runtime support | [`<cstdarg>`](header/cstdarg "cpp/header/cstdarg") | | Fundamental library concepts | [`<concepts>`](header/concepts "cpp/header/concepts") (since C++20) | | Type traits | [`<type_traits>`](header/type_traits "cpp/header/type traits") (since C++11) | | Bit manipulation | [`<bit>`](header/bit "cpp/header/bit") (since C++20) | | Atomics | [`<atomic>`](header/atomic "cpp/header/atomic") (since C++11)[[2]](#cite_note-2) | | Deprecated headers | [`<ciso646>`](header/ciso646 "cpp/header/ciso646") [`<cstdalign>`](header/cstdalign "cpp/header/cstdalign") [`<cstdbool>`](header/cstdbool "cpp/header/cstdbool") (since C++11)(until C++20) | 1. The supplied version of the header `<cstdlib>` shall declare at least the functions `[std::abort](utility/program/abort "cpp/utility/program/abort")`, `[std::atexit](utility/program/atexit "cpp/utility/program/atexit")`, `[std::exit](utility/program/exit "cpp/utility/program/exit")`, `[std::at\_quick\_exit](utility/program/at_quick_exit "cpp/utility/program/at quick exit")` and `[std::quick\_exit](utility/program/quick_exit "cpp/utility/program/quick exit")` (since C++11). 2. Support for always lock-free integral atomic types and presence of type aliases `[std::atomic\_signed\_lock\_free](atomic/atomic "cpp/atomic/atomic")` and `[std::atomic\_unsigned\_lock\_free](atomic/atomic "cpp/atomic/atomic")` are implementation-defined in a freestanding implementation. (since C++20) ### Notes Compiler vendors may not correctly support freestanding implementation, for either implementation issues (see: [GCC bug 100057](https://gcc.gnu.org/bugzilla/show_bug.cgi?id=100057)) or just ignoring freestanding as a whole (LLVM libcxx and msvc stl). 
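To make the `__STDC_HOSTED__` distinction described above concrete, here is a minimal sketch of guarding hosted-only code at compile time (on a genuinely freestanding target the hosted-only headers themselves must be excluded, which is why the include is guarded as well):

```
// Select code paths on the implementation kind (since C++11, __STDC_HOSTED__ is predefined).
#if __STDC_HOSTED__
  #include <iostream>
#endif

int main()
{
#if __STDC_HOSTED__
    std::cout << "hosted implementation\n";
#endif
    // On a freestanding implementation it is implementation-defined whether a
    // program must define main() at all; this translation unit is only
    // meaningful where a main function exists.
}
```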
### References

* C++20 standard (ISO/IEC 14882:2020):
	+ 4.1 Implementation compliance [intro.compliance] (p: 7)
	+ 6.9.2 Multi-threaded executions and data races [intro.multithread] (p: 77)
	+ 6.9.3.1 main function [basic.start.main] (p: 82)
	+ 16.5.1.3 Freestanding implementations [compliance] (p: 470)
* C++17 standard (ISO/IEC 14882:2017):
	+ 4.1 Implementation compliance [intro.compliance] (p: 5)
	+ 4.7 Multi-threaded executions and data races [intro.multithread] (p: 15)
	+ 6.6.1 main function [basic.start.main] (p: 66)
	+ 20.5.1.3 Freestanding implementations [compliance] (p: 458)
* C++14 standard (ISO/IEC 14882:2014):
	+ 1.4 Implementation compliance [intro.compliance] (p: 5)
	+ 1.10 Multi-threaded executions and data races [intro.multithread] (p: 11)
	+ 3.6.1 Main function [basic.start.main] (p: 62)
	+ 17.6.1.3 Freestanding implementations [compliance] (p: 441)
* C++11 standard (ISO/IEC 14882:2011):
	+ 1.4 Implementation compliance [intro.compliance] (p: 5)
	+ 1.10 Multi-threaded executions and data races [intro.multithread] (p: 11)
	+ 3.6.1 Main function [basic.start.main] (p: 58)
	+ 17.6.1.3 Freestanding implementations [compliance] (p: 408)
* C++03 standard (ISO/IEC 14882:2003):
	+ 1.4 Implementation compliance [intro.compliance] (p: 3)
	+ 3.6.1 Main function [basic.start.main] (p: 43)
	+ 17.4.1.3 Freestanding implementations [lib.compliance] (p: 326)

### Defect reports

The following behavior-changing defect reports were applied retroactively to previously published C++ standards.

| DR | Applied to | Behavior as published | Correct behavior |
| --- | --- | --- | --- |
| [CWG 1938](https://cplusplus.github.io/CWG/issues/1938.html) | C++98 | an implementation did not need to document whether it is hosted | made the implementation kind implementation-defined (thus requiring documentation) |
| [LWG 3653](https://cplusplus.github.io/LWG/issue3653) | C++20 | `<coroutine>` is freestanding, but uses std::hash which is not | Not yet fixed |

### See also

| | | --- | | [C documentation](https://en.cppreference.com/w/c/language/conformance "c/language/conformance") for Conformance |

cpp Named Requirements Named Requirements ================== The *named requirements* listed on this page are the named requirements used in the normative text of the C++ standard to define the expectations of the standard library. Some of these requirements are being formalized in C++20 using the [concepts](language/constraints "cpp/language/constraints") language feature. Until then, the burden is on the programmer to ensure that library templates are instantiated with template arguments that satisfy these requirements. Failure to do so may result in very complex compiler diagnostics.
| | | --- | | Basic | | [DefaultConstructible](named_req/defaultconstructible "cpp/named req/DefaultConstructible") | specifies that an object of the type can be default constructed (named requirement) | | [MoveConstructible](named_req/moveconstructible "cpp/named req/MoveConstructible") (C++11) | specifies that an object of the type can be constructed from rvalue (named requirement) | | [CopyConstructible](named_req/copyconstructible "cpp/named req/CopyConstructible") | specifies that an object of the type can be constructed from lvalue (named requirement) | | [MoveAssignable](named_req/moveassignable "cpp/named req/MoveAssignable") (C++11) | specifies that an object of the type can be assigned from rvalue (named requirement) | | [CopyAssignable](named_req/copyassignable "cpp/named req/CopyAssignable") | specifies that an object of the type can be assigned from lvalue (named requirement) | | [Destructible](named_req/destructible "cpp/named req/Destructible") | specifies that an object of the type can be destroyed (named requirement) | | Type properties | | Note: the standard does not define named requirements with names specified in this subcategory. These are type categories defined by the core language. They are included here as named requirements only for consistency. | | [ScalarType](named_req/scalartype "cpp/named req/ScalarType") | object types that are not array types or class types (named requirement) | | [PODType](named_req/podtype "cpp/named req/PODType") | POD (Plain Old Data) types, compatible with C `struct` (named requirement) | | [TriviallyCopyable](named_req/triviallycopyable "cpp/named req/TriviallyCopyable") (C++11) | objects of these types can maintain their values after copying their underlying bytes (named requirement) | | [TrivialType](named_req/trivialtype "cpp/named req/TrivialType") (C++11) | objects of these types can be trivially constructed and copied (named requirement) | | [StandardLayoutType](named_req/standardlayouttype "cpp/named req/StandardLayoutType") (C++11) | these types are useful for communicating with code written in other programming languages (named requirement) | | [ImplicitLifetimeType](named_req/implicitlifetimetype "cpp/named req/ImplicitLifetimeType") (C++20) | objects of these types can be implicitly created, and their lifetimes can be implictly started (named requirement) | | Library-wide | | [EqualityComparable](named_req/equalitycomparable "cpp/named req/EqualityComparable") | `operator==` is an equivalence relation (named requirement) | | [LessThanComparable](named_req/lessthancomparable "cpp/named req/LessThanComparable") | `operator<` is a strict weak ordering relation (named requirement) | | [Swappable](named_req/swappable "cpp/named req/Swappable") (C++11) | can be swapped with an unqualified non-member function call `swap()` (named requirement) | | [ValueSwappable](named_req/valueswappable "cpp/named req/ValueSwappable") (C++11) | an [LegacyIterator](named_req/iterator "cpp/named req/Iterator") that dereferences to a [Swappable](named_req/swappable "cpp/named req/Swappable") type (named requirement) | | [NullablePointer](named_req/nullablepointer "cpp/named req/NullablePointer") (C++11) | a pointer-like type supporting a null value (named requirement) | | [Hash](named_req/hash "cpp/named req/Hash") (C++11) | a [FunctionObject](named_req/functionobject "cpp/named req/FunctionObject") that for inputs with different values has a low probability of giving the same output (named requirement) | | [Allocator](named_req/allocator 
"cpp/named req/Allocator") | a class type that contains allocation information (named requirement) | | [FunctionObject](named_req/functionobject "cpp/named req/FunctionObject") | an object that can be called with the function call syntax (named requirement) | | [Callable](named_req/callable "cpp/named req/Callable") | a type for which the invoke operation is defined (named requirement) | | [Predicate](named_req/predicate "cpp/named req/Predicate") | a [FunctionObject](named_req/functionobject "cpp/named req/FunctionObject") that returns a value convertible to `bool` for one argument without modifying it (named requirement) | | [BinaryPredicate](named_req/binarypredicate "cpp/named req/BinaryPredicate") | a [FunctionObject](named_req/functionobject "cpp/named req/FunctionObject") that returns a value convertible to `bool` for two arguments without modifying them (named requirement) | | [Compare](named_req/compare "cpp/named req/Compare") | a [BinaryPredicate](named_req/binarypredicate "cpp/named req/BinaryPredicate") that establishes an ordering relation (named requirement) | | | | --- | | Container | | [Container](named_req/container "cpp/named req/Container") | data structure that allows element access using iterators (named requirement) | | [ReversibleContainer](named_req/reversiblecontainer "cpp/named req/ReversibleContainer") | container using bidirectional iterators (named requirement) | | [AllocatorAwareContainer](named_req/allocatorawarecontainer "cpp/named req/AllocatorAwareContainer") (C++11) | container using an allocator (named requirement) | | [SequenceContainer](named_req/sequencecontainer "cpp/named req/SequenceContainer") | container with elements stored linearly (named requirement) | | [ContiguousContainer](named_req/contiguouscontainer "cpp/named req/ContiguousContainer") (C++17) | container with elements stored at adjacent memory addresses (named requirement) | | [AssociativeContainer](named_req/associativecontainer "cpp/named req/AssociativeContainer") | container that stores elements by associating them to keys (named requirement) | | [UnorderedAssociativeContainer](named_req/unorderedassociativecontainer "cpp/named req/UnorderedAssociativeContainer") (C++11) | container that stores elements stored in buckets by associating them to keys (named requirement) | | Container element | | [DefaultInsertable](named_req/defaultinsertable "cpp/named req/DefaultInsertable") (C++11) | element can be default-constructed in uninitialized storage (named requirement) | | [CopyInsertable](named_req/copyinsertable "cpp/named req/CopyInsertable") (C++11) | element can be copy-constructed in uninitialized storage (named requirement) | | [MoveInsertable](named_req/moveinsertable "cpp/named req/MoveInsertable") (C++11) | element can be move-constructed in uninitialized storage (named requirement) | | [EmplaceConstructible](named_req/emplaceconstructible "cpp/named req/EmplaceConstructible") (C++11) | element can be constructed in uninitialized storage (named requirement) | | [Erasable](named_req/erasable "cpp/named req/Erasable") (C++11) | element can be destroyed using an allocator (named requirement) | | Iterator | | [LegacyIterator](named_req/iterator "cpp/named req/Iterator") | general concept to access data within some data structure (named requirement) | | [LegacyInputIterator](named_req/inputiterator "cpp/named req/InputIterator") | iterator that can be used to read data (named requirement) | | [LegacyOutputIterator](named_req/outputiterator "cpp/named req/OutputIterator") | iterator 
that can be used to write data (named requirement) | | [LegacyForwardIterator](named_req/forwarditerator "cpp/named req/ForwardIterator") | iterator that can be used to read data multiple times (named requirement) | | [LegacyBidirectionalIterator](named_req/bidirectionaliterator "cpp/named req/BidirectionalIterator") | iterator that can be both incremented and decremented (named requirement) | | [LegacyRandomAccessIterator](named_req/randomaccessiterator "cpp/named req/RandomAccessIterator") | iterator that can be advanced in constant time (named requirement) | | [LegacyContiguousIterator](named_req/contiguousiterator "cpp/named req/ContiguousIterator") (C++17) | iterator to contiguously-allocated elements (named requirement) | | [ConstexprIterator](named_req/constexpriterator "cpp/named req/ConstexprIterator") (C++20) | iterator that can be used during constant expression evaluation (named requirement) | | Stream I/O functions | | [UnformattedInputFunction](named_req/unformattedinputfunction "cpp/named req/UnformattedInputFunction") | a stream input function that does not skip leading whitespace and counts the processed characters (named requirement) | | [FormattedInputFunction](named_req/formattedinputfunction "cpp/named req/FormattedInputFunction") | a stream input function that skips leading whitespace (named requirement) | | [UnformattedOutputFunction](named_req/unformattedoutputfunction "cpp/named req/UnformattedOutputFunction") | a basic stream output function (named requirement) | | [FormattedOutputFunction](named_req/formattedoutputfunction "cpp/named req/FormattedOutputFunction") | a stream output function that sets failbit on errors and returns a reference to the stream (named requirement) | | Random Number Generation | | [SeedSequence](named_req/seedsequence "cpp/named req/SeedSequence") (C++11) | consumes a sequence of integers and produces a sequence of 32-bit unsigned values (named requirement) | | [UniformRandomBitGenerator](named_req/uniformrandombitgenerator "cpp/named req/UniformRandomBitGenerator") (C++11) | returns uniformly distributed random unsigned integers (named requirement) | | [RandomNumberEngine](named_req/randomnumberengine "cpp/named req/RandomNumberEngine") (C++11) | a deterministic [UniformRandomBitGenerator](named_req/uniformrandombitgenerator "cpp/named req/UniformRandomBitGenerator"), defined by the seed (named requirement) | | [RandomNumberEngineAdaptor](named_req/randomnumberengineadaptor "cpp/named req/RandomNumberEngineAdaptor") (C++11) | a [RandomNumberEngine](named_req/randomnumberengine "cpp/named req/RandomNumberEngine") that transforms the output of another [RandomNumberEngine](named_req/randomnumberengine "cpp/named req/RandomNumberEngine") (named requirement) | | [RandomNumberDistribution](named_req/randomnumberdistribution "cpp/named req/RandomNumberDistribution") (C++11) | returns random numbers distributed according to a given mathematical probability density function (named requirement) | | Concurrency | | [BasicLockable](named_req/basiclockable "cpp/named req/BasicLockable") (C++11) | provides exclusive ownership semantics for execution agents (i.e. 
threads) (named requirement) | | [Lockable](named_req/lockable "cpp/named req/Lockable") (C++11) | a [BasicLockable](named_req/basiclockable "cpp/named req/BasicLockable") that supports attempted lock acquisition (named requirement) | | [TimedLockable](named_req/timedlockable "cpp/named req/TimedLockable") (C++11) | a [Lockable](named_req/lockable "cpp/named req/Lockable") that supports timed lock acquisition (named requirement) | | [SharedLockable](named_req/sharedlockable "cpp/named req/SharedLockable") (C++14) | provides shared ownership semantics for execution agents (i.e. threads) (named requirement) | | [SharedTimedLockable](named_req/sharedtimedlockable "cpp/named req/SharedTimedLockable") (C++14) | a [SharedLockable](named_req/sharedlockable "cpp/named req/SharedLockable") that supports timed lock acquisition (named requirement) | | [Mutex](named_req/mutex "cpp/named req/Mutex") (C++11) | a [Lockable](named_req/lockable "cpp/named req/Lockable") that protects against data races and sequentially consistent synchronization (named requirement) | | [TimedMutex](named_req/timedmutex "cpp/named req/TimedMutex") (C++11) | a [TimedLockable](named_req/timedlockable "cpp/named req/TimedLockable") that protects against data races and sequentially consistent synchronization (named requirement) | | [SharedMutex](named_req/sharedmutex "cpp/named req/SharedMutex") (C++17) | a [Mutex](named_req/mutex "cpp/named req/Mutex") that supports shared ownership semantics (named requirement) | | [SharedTimedMutex](named_req/sharedtimedmutex "cpp/named req/SharedTimedMutex") (C++14) | a [TimedMutex](named_req/timedmutex "cpp/named req/TimedMutex") that supports shared ownership semantics (named requirement) | | Other | | [UnaryTypeTrait](named_req/unarytypetrait "cpp/named req/UnaryTypeTrait") (C++11) | describes a property of a type (named requirement) | | [BinaryTypeTrait](named_req/binarytypetrait "cpp/named req/BinaryTypeTrait") (C++11) | describes a relationship between two types (named requirement) | | [TransformationTrait](named_req/transformationtrait "cpp/named req/TransformationTrait") (C++11) | modifies a property of a type (named requirement) | | [Clock](named_req/clock "cpp/named req/Clock") (C++11) | aggregates a duration, a time point, and a function to get the current time point (named requirement) | | [TrivialClock](named_req/trivialclock "cpp/named req/TrivialClock") (C++11) | a [Clock](named_req/clock "cpp/named req/Clock") that does not throw exceptions (named requirement) | | [CharTraits](named_req/chartraits "cpp/named req/CharTraits") | defines types and functions for a character type (named requirement) | | [BitmaskType](named_req/bitmasktype "cpp/named req/BitmaskType") | bitset, integer, or enumeration (named requirement) | | [NumericType](named_req/numerictype "cpp/named req/NumericType") | a type for which initialization is effectively equal to assignment (named requirement) | | [RegexTraits](named_req/regextraits "cpp/named req/RegexTraits") (C++11) | defines types and functions used by the [regular expressions library](regex "cpp/regex") (named requirement) | | [LiteralType](named_req/literaltype "cpp/named req/LiteralType") (C++11)(deprecated in C++20) | a type with constexpr constructor (named requirement) | | [Formatter](named_req/formatter "cpp/named req/Formatter") (C++20) | defines functions used by the [formatting library](utility/format "cpp/utility/format") (named requirement) |
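As the introduction above notes, before C++20 the programmer has to verify these requirements by hand, and since C++20 several of them have counterparts in the concepts library. A minimal sketch of both styles; the trait and concept checks are approximations of the named requirements, not exact restatements:

```
#include <type_traits>
#include <concepts>

struct Widget
{
    Widget() = default;
    Widget(const Widget&) = default;
};

// Pre-C++20 style: approximate DefaultConstructible / CopyConstructible with type traits.
static_assert(std::is_default_constructible_v<Widget>,
              "Widget should be default-constructible");
static_assert(std::is_copy_constructible_v<Widget>,
              "Widget should be copy-constructible");

// C++20 style: the <concepts> header formalizes closely related requirements.
static_assert(std::default_initializable<Widget>);
static_assert(std::copy_constructible<Widget>);

int main() {}
```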
cpp Input/output library Input/output library ==================== C++ includes two input/output libraries: an [OOP-style](https://en.wikipedia.org/wiki/Object-oriented_programming "enwiki:Object-oriented programming") stream-based I/O library and the standard set of C-style I/O functions. ### Stream-based I/O The stream-based input/output library is organized around abstract input/output devices. These abstract devices allow the same code to handle input/output to files, memory streams, or custom adaptor devices that perform arbitrary operations (e.g. compression) on the fly. Most of the classes are templated, so they can be adapted to any basic character type. Separate typedefs are provided for the most common basic character types (`char` and `wchar_t`). The classes are organized into the following hierarchy: ![std-io-complete-inheritance.svg]() Inheritance diagram. | | | --- | | Abstraction | | Defined in header `[<ios>](header/ios "cpp/header/ios")` | | [ios\_base](io/ios_base "cpp/io/ios base") | manages formatting flags and input/output exceptions (class) | | [basic\_ios](io/basic_ios "cpp/io/basic ios") | manages an arbitrary stream buffer (class template) | | Defined in header `[<streambuf>](header/streambuf "cpp/header/streambuf")` | | [basic\_streambuf](io/basic_streambuf "cpp/io/basic streambuf") | abstracts a raw device (class template) | | Defined in header `[<ostream>](header/ostream "cpp/header/ostream")` | | [basic\_ostream](io/basic_ostream "cpp/io/basic ostream") | wraps a given abstract device (`[std::basic\_streambuf](io/basic_streambuf "cpp/io/basic streambuf")`) and provides high-level output interface (class template) | | Defined in header `[<istream>](header/istream "cpp/header/istream")` | | [basic\_istream](io/basic_istream "cpp/io/basic istream") | wraps a given abstract device (`[std::basic\_streambuf](io/basic_streambuf "cpp/io/basic streambuf")`) and provides high-level input interface (class template) | | [basic\_iostream](io/basic_iostream "cpp/io/basic iostream") | wraps a given abstract device (`[std::basic\_streambuf](io/basic_streambuf "cpp/io/basic streambuf")`) and provides high-level input/output interface (class template) | | File I/O implementation | | Defined in header `[<fstream>](header/fstream "cpp/header/fstream")` | | [basic\_filebuf](io/basic_filebuf "cpp/io/basic filebuf") | implements raw file device (class template) | | [basic\_ifstream](io/basic_ifstream "cpp/io/basic ifstream") | implements high-level file stream input operations (class template) | | [basic\_ofstream](io/basic_ofstream "cpp/io/basic ofstream") | implements high-level file stream output operations (class template) | | [basic\_fstream](io/basic_fstream "cpp/io/basic fstream") | implements high-level file stream input/output operations (class template) | | String I/O implementation | | Defined in header `[<sstream>](header/sstream "cpp/header/sstream")` | | [basic\_stringbuf](io/basic_stringbuf "cpp/io/basic stringbuf") | implements raw string device (class template) | | [basic\_istringstream](io/basic_istringstream "cpp/io/basic istringstream") | implements high-level string stream input operations (class template) | | [basic\_ostringstream](io/basic_ostringstream "cpp/io/basic ostringstream") | implements high-level string stream output operations (class template) | | [basic\_stringstream](io/basic_stringstream "cpp/io/basic stringstream") | implements high-level string stream input/output operations (class template) | | Array I/O implementations | | Defined in header 
`[<spanstream>](header/spanstream "cpp/header/spanstream")` | | [basic\_spanbuf](io/basic_spanbuf "cpp/io/basic spanbuf") (C++23) | implements raw fixed character buffer device (class template) | | [basic\_ispanstream](io/basic_ispanstream "cpp/io/basic ispanstream") (C++23) | implements fixed character buffer input operations (class template) | | [basic\_ospanstream](io/basic_ospanstream "cpp/io/basic ospanstream") (C++23) | implements fixed character buffer output operations (class template) | | [basic\_spanstream](io/basic_spanstream "cpp/io/basic spanstream") (C++23) | implements fixed character buffer input/output operations (class template) | | Defined in header `[<strstream>](header/strstream "cpp/header/strstream")` | | [strstreambuf](io/strstreambuf "cpp/io/strstreambuf") (deprecated in C++98) | implements raw character array device (class) | | [istrstream](io/istrstream "cpp/io/istrstream") (deprecated in C++98) | implements character array input operations (class) | | [ostrstream](io/ostrstream "cpp/io/ostrstream") (deprecated in C++98) | implements character array output operations (class) | | [strstream](io/strstream "cpp/io/strstream") (deprecated in C++98) | implements character array input/output operations (class) | | Synchronized output | | Defined in header `[<syncstream>](header/syncstream "cpp/header/syncstream")` | | [basic\_syncbuf](io/basic_syncbuf "cpp/io/basic syncbuf") (C++20) | synchronized output device wrapper (class template) | | [basic\_osyncstream](io/basic_osyncstream "cpp/io/basic osyncstream") (C++20) | synchronized output stream wrapper (class template) | #### Typedefs The following typedefs for common character types are provided: ``` typedef basic_ios<char> ios; typedef basic_ios<wchar_t> wios; typedef basic_streambuf<char> streambuf; typedef basic_streambuf<wchar_t> wstreambuf; typedef basic_filebuf<char> filebuf; typedef basic_filebuf<wchar_t> wfilebuf; typedef basic_stringbuf<char> stringbuf; typedef basic_stringbuf<wchar_t> wstringbuf; typedef basic_syncbuf<char> syncbuf; // C++20 typedef basic_syncbuf<wchar_t> wsyncbuf; // C++20 typedef basic_spanbuf<char> spanbuf; // C++23 typedef basic_spanbuf<wchar_t> wspanbuf; // C++23 typedef basic_istream<char> istream; typedef basic_istream<wchar_t> wistream; typedef basic_ostream<char> ostream; typedef basic_ostream<wchar_t> wostream; typedef basic_iostream<char> iostream; typedef basic_iostream<wchar_t> wiostream; typedef basic_ifstream<char> ifstream; typedef basic_ifstream<wchar_t> wifstream; typedef basic_ofstream<char> ofstream; typedef basic_ofstream<wchar_t> wofstream; typedef basic_fstream<char> fstream; typedef basic_fstream<wchar_t> wfstream; typedef basic_istringstream<char> istringstream; typedef basic_istringstream<wchar_t> wistringstream; typedef basic_ostringstream<char> ostringstream; typedef basic_ostringstream<wchar_t> wostringstream; typedef basic_stringstream<char> stringstream; typedef basic_stringstream<wchar_t> wstringstream; typedef basic_osyncstream<char> osyncstream; // C++20 typedef basic_osyncstream<wchar_t> wosyncstream; // C++20 typedef basic_ispanstream<char> ispanstream; // C++23 typedef basic_ispanstream<wchar_t> wispanstream; // C++23 typedef basic_ospanstream<char> ospanstream; // C++23 typedef basic_ospanstream<wchar_t> wospanstream; // C++23 typedef basic_spanstream<char> spanstream; // C++23 typedef basic_spanstream<wchar_t> wspanstream; // C++23 ``` #### Predefined standard stream objects | Defined in header `[<iostream>](header/iostream "cpp/header/iostream")` | | 
--- | | [cinwcin](io/cin "cpp/io/cin") | reads from the standard C input stream `[stdin](io/c/std_streams "cpp/io/c/std streams")` (global object) | | [coutwcout](io/cout "cpp/io/cout") | writes to the standard C output stream `[stdout](io/c/std_streams "cpp/io/c/std streams")`(global object) | | [cerrwcerr](io/cerr "cpp/io/cerr") | writes to the standard C error stream `[stderr](io/c/std_streams "cpp/io/c/std streams")`, unbuffered(global object) | | [clogwclog](io/clog "cpp/io/clog") | writes to the standard C error stream `[stderr](io/c/std_streams "cpp/io/c/std streams")`(global object) | #### [I/O Manipulators](io/manip "cpp/io/manip") The stream-based I/O library uses [I/O manipulators](io/manip "cpp/io/manip") (e.g. `[std::boolalpha](io/manip/boolalpha "cpp/io/manip/boolalpha")`, `[std::hex](io/manip/hex "cpp/io/manip/hex")`, etc.) to control how streams behave. #### Types The following auxiliary types are defined: | Defined in header `[<ios>](header/ios "cpp/header/ios")` | | --- | | [streamoff](io/streamoff "cpp/io/streamoff") | represents relative file/stream position (offset from fpos), sufficient to represent any file size (typedef) | | [streamsize](io/streamsize "cpp/io/streamsize") | represents the number of characters transferred in an I/O operation or the size of an I/O buffer (typedef) | | [fpos](io/fpos "cpp/io/fpos") | represents absolute position in a stream or a file (class template) | The following typedef names for `[std::fpos](http://en.cppreference.com/w/cpp/io/fpos)<[std::mbstate\_t](http://en.cppreference.com/w/cpp/string/multibyte/mbstate_t)>` are provided: | Defined in header `[<iosfwd>](header/iosfwd "cpp/header/iosfwd")` | | --- | | Type | Definition | | `streampos` | `[std::fpos](http://en.cppreference.com/w/cpp/io/fpos)<[std::char\_traits](http://en.cppreference.com/w/cpp/string/char_traits)<char>::state\_type>` | | `wstreampos` | `[std::fpos](http://en.cppreference.com/w/cpp/io/fpos)<[std::char\_traits](http://en.cppreference.com/w/cpp/string/char_traits)<wchar\_t>::state\_type>` | | `u8streampos` (C++20) | `[std::fpos](http://en.cppreference.com/w/cpp/io/fpos)<[std::char\_traits](http://en.cppreference.com/w/cpp/string/char_traits)<char8_t>::state\_type>` | | `u16streampos` (C++11) | `[std::fpos](http://en.cppreference.com/w/cpp/io/fpos)<[std::char\_traits](http://en.cppreference.com/w/cpp/string/char_traits)<char16\_t>::state\_type>` | | `u32streampos` (C++11) | `[std::fpos](http://en.cppreference.com/w/cpp/io/fpos)<[std::char\_traits](http://en.cppreference.com/w/cpp/string/char_traits)<char32\_t>::state\_type>` | #### Error category interface | Defined in header `[<ios>](header/ios "cpp/header/ios")` | | --- | | [io\_errc](io/io_errc "cpp/io/io errc") (C++11) | the IO stream error codes (enum) | | [iostream\_category](io/iostream_category "cpp/io/iostream category") (C++11) | identifies the iostream error category (function) | ### [C-style I/O](io/c "cpp/io/c") C++ also includes the [input/output functions defined by C](io/c "cpp/io/c"), such as `[std::fopen](io/c/fopen "cpp/io/c/fopen")`, `[std::getc](io/c/fgetc "cpp/io/c/fgetc")`, etc. cpp Coroutine support (C++20) Coroutine support (C++20) ========================= The coroutine support library defines several types that provide compile and run-time support for [coroutines](language/coroutines "cpp/language/coroutines"). 
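A minimal sketch of how the components listed below fit together: a user-defined return type whose nested `promise_type` drives the coroutine, using `std::coroutine_handle` and the trivial awaitable `std::suspend_always`. The `task` type is illustrative only, not part of the standard library:

```
#include <coroutine>
#include <iostream>

// Illustrative coroutine return type; deliberately minimal (not safe to copy).
struct task
{
    struct promise_type
    {
        task get_return_object()
        {
            return task{std::coroutine_handle<promise_type>::from_promise(*this)};
        }
        std::suspend_always initial_suspend() noexcept { return {}; } // start suspended
        std::suspend_always final_suspend() noexcept { return {}; }   // suspend at the end, too
        void return_void() {}
        void unhandled_exception() { std::terminate(); }
    };

    std::coroutine_handle<promise_type> handle;

    void resume() const { handle.resume(); }
    ~task() { if (handle) handle.destroy(); } // release the coroutine frame
};

task hello()
{
    std::cout << "hello from a coroutine\n";
    co_return;
}

int main()
{
    task t = hello(); // created suspended at initial_suspend()
    t.resume();       // runs the body, then suspends at final_suspend()
}                     // ~task() destroys the coroutine frame
```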
### Coroutine traits | Defined in header `[<coroutine>](header/coroutine "cpp/header/coroutine")` | | --- | | [coroutine\_traits](coroutine/coroutine_traits "cpp/coroutine/coroutine traits") (C++20) | trait type for discovering coroutine promise types (class template) | ### Coroutine handle | Defined in header `[<coroutine>](header/coroutine "cpp/header/coroutine")` | | --- | | [coroutine\_handle](coroutine/coroutine_handle "cpp/coroutine/coroutine handle") (C++20) | used to refer to a suspended or executing coroutine (class template) | ### No-op coroutines | Defined in header `[<coroutine>](header/coroutine "cpp/header/coroutine")` | | --- | | [noop\_coroutine](coroutine/noop_coroutine "cpp/coroutine/noop coroutine") (C++20) | creates a coroutine handle that has no observable effects when resumed or destroyed (function) | | [noop\_coroutine\_promise](coroutine/noop_coroutine_promise "cpp/coroutine/noop coroutine promise") (C++20) | used for coroutines with no observable effects (class) | | [noop\_coroutine\_handle](coroutine/coroutine_handle "cpp/coroutine/coroutine handle") (C++20) | `[std::coroutine\_handle](http://en.cppreference.com/w/cpp/coroutine/coroutine_handle)<[std::noop\_coroutine\_promise](http://en.cppreference.com/w/cpp/coroutine/noop_coroutine_promise)>`, intended to refer to a no-op coroutine (typedef) | ### Trivial awaitables | Defined in header `[<coroutine>](header/coroutine "cpp/header/coroutine")` | | --- | | [suspend\_never](coroutine/suspend_never "cpp/coroutine/suspend never") (C++20) | indicates that an await-expression should never suspend (class) | | [suspend\_always](coroutine/suspend_always "cpp/coroutine/suspend always") (C++20) | indicates that an await-expression should always suspend (class) | cpp Containers library Containers library ================== The Containers library is a generic collection of class templates and algorithms that allow programmers to easily implement common data structures like queues, lists and stacks. There are two (until C++11)three (since C++11) classes of containers: * sequence containers, * associative containers, and | | | | --- | --- | | * unordered associative containers, | (since C++11) | each of which is designed to support a different set of operations. The container manages the storage space that is allocated for its elements and provides member functions to access them, either directly or through iterators (objects with properties similar to pointers). Most containers have at least several member functions in common, and share functionalities. Which container is the best for the particular application depends not only on the offered functionality, but also on its efficiency for different workloads. ### Sequence containers Sequence containers implement data structures which can be accessed sequentially. | | | | --- | --- | | [array](container/array "cpp/container/array") (C++11) | static contiguous array (class template) | | [vector](container/vector "cpp/container/vector") | dynamic contiguous array (class template) | | [deque](container/deque "cpp/container/deque") | double-ended queue (class template) | | [forward\_list](container/forward_list "cpp/container/forward list") (C++11) | singly-linked list (class template) | | [list](container/list "cpp/container/list") | doubly-linked list (class template) | ### Associative containers Associative containers implement sorted data structures that can be quickly searched (O(log n) complexity). 
| | | | --- | --- | | [set](container/set "cpp/container/set") | collection of unique keys, sorted by keys (class template) | | [map](container/map "cpp/container/map") | collection of key-value pairs, sorted by keys, keys are unique (class template) | | [multiset](container/multiset "cpp/container/multiset") | collection of keys, sorted by keys (class template) | | [multimap](container/multimap "cpp/container/multimap") | collection of key-value pairs, sorted by keys (class template) | ### Unordered associative containers (since C++11) Unordered associative containers implement unsorted (hashed) data structures that can be quickly searched (O(1) amortized, O(n) worst-case complexity). | | | | --- | --- | | [unordered\_set](container/unordered_set "cpp/container/unordered set") (C++11) | collection of unique keys, hashed by keys (class template) | | [unordered\_map](container/unordered_map "cpp/container/unordered map") (C++11) | collection of key-value pairs, hashed by keys, keys are unique (class template) | | [unordered\_multiset](container/unordered_multiset "cpp/container/unordered multiset") (C++11) | collection of keys, hashed by keys (class template) | | [unordered\_multimap](container/unordered_multimap "cpp/container/unordered multimap") (C++11) | collection of key-value pairs, hashed by keys (class template) | ### Container adaptors Container adaptors provide a different interface for sequential containers. | | | | --- | --- | | [stack](container/stack "cpp/container/stack") | adapts a container to provide stack (LIFO data structure) (class template) | | [queue](container/queue "cpp/container/queue") | adapts a container to provide queue (FIFO data structure) (class template) | | [priority\_queue](container/priority_queue "cpp/container/priority queue") | adapts a container to provide priority queue (class template) | | [flat\_set](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_set&action=edit&redlink=1 "cpp/container/flat set (page does not exist)") (C++23) | adapts a container to provide a collection of unique keys, sorted by keys (class template) | | [flat\_map](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_map&action=edit&redlink=1 "cpp/container/flat map (page does not exist)") (C++23) | adapts a container to provide a collection of key-value pairs, sorted by keys, keys are unique (class template) | | [flat\_multiset](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_multiset&action=edit&redlink=1 "cpp/container/flat multiset (page does not exist)") (C++23) | adapts a container to provide a collection of keys, sorted by keys (class template) | | [flat\_multimap](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_multimap&action=edit&redlink=1 "cpp/container/flat multimap (page does not exist)") (C++23) | adapts a container to provide a collection of key-value pairs, sorted by keys (class template) | ### Views `span`(since C++20) and `mdspan`(since C++23) are non-owning views over a contiguous sequence of objects, the storage of which is owned by some other object. 
| | | | --- | --- | | [span](container/span "cpp/container/span") (C++20) | a non-owning view over a contiguous sequence of objects (class template) | | [mdspan](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan&action=edit&redlink=1 "cpp/container/mdspan (page does not exist)") (C++23) | a multi-dimensional non-owning array view (class template) | ### Iterator invalidation Read-only methods never invalidate iterators or references. Methods which modify the contents of a container may invalidate iterators and/or references, as summarized in this table. | | | | | | | --- | --- | --- | --- | --- | | Category | Container | After **insertion**, are... | After **erasure**, are... | Conditionally | | **iterators** valid? | **references** valid? | **iterators** valid? | **references** valid? | | Sequence containers | [`array`](container/array "cpp/container/array") | N/A | N/A | | | [`vector`](container/vector "cpp/container/vector") | No | N/A | Insertion changed capacity | | Yes | Yes | Before modified element(s)(for insertion only if capacity didn't change) | | No | No | At or after modified element(s) | | [`deque`](container/deque "cpp/container/deque") | No | Yes | Yes, except erased element(s) | Modified first or last element | | No | No | Modified middle only | | [`list`](container/list "cpp/container/list") | Yes | Yes, except erased element(s) | | | [`forward_list`](container/forward_list "cpp/container/forward list") | Yes | Yes, except erased element(s) | | | Associative containers | [`set`](container/set "cpp/container/set")[`multiset`](container/multiset "cpp/container/multiset")[`map`](container/map "cpp/container/map")[`multimap`](container/multimap "cpp/container/multimap") | Yes | Yes, except erased element(s) | | | Unordered associative containers | [`unordered_set`](container/unordered_set "cpp/container/unordered set")[`unordered_multiset`](container/unordered_multiset "cpp/container/unordered multiset")[`unordered_map`](container/unordered_map "cpp/container/unordered map")[`unordered_multimap`](container/unordered_multimap "cpp/container/unordered multimap") | No | Yes | N/A | Insertion caused rehash | | Yes | Yes, except erased element(s) | No rehash | Here, **insertion** refers to any method which adds one or more elements to the container and **erasure** refers to any method which removes one or more elements from the container. * Examples of insertion methods are `[std::set::insert](container/set/insert "cpp/container/set/insert")`, `[std::map::emplace](container/map/emplace "cpp/container/map/emplace")`, `[std::vector::push\_back](container/vector/push_back "cpp/container/vector/push back")`, and `[std::deque::push\_front](container/deque/push_front "cpp/container/deque/push front")`. | | | | --- | --- | | * Note that `[std::unordered\_map::operator[]](container/unordered_map/operator_at "cpp/container/unordered map/operator at")` also counts, as it may insert an element into the map. | (since C++11) | * Examples of erasure methods are `[std::set::erase](container/set/erase "cpp/container/set/erase")`, `[std::vector::pop\_back](container/vector/pop_back "cpp/container/vector/pop back")`, `[std::deque::pop\_front](container/deque/pop_front "cpp/container/deque/pop front")`, and `[std::map::clear](container/map/clear "cpp/container/map/clear")`. + `clear` invalidates all iterators and references. Because it erases all elements, this technically complies with the rules above. 
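A short illustration of the `vector` rules in the table above: an insertion that does not change the capacity leaves iterators before the insertion point valid, while an insertion that forces a reallocation invalidates all of them (a minimal sketch):

```
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v{1, 2, 3};
    v.reserve(8);                 // capacity is now at least 8

    auto it = v.begin();          // iterator to the first element
    v.push_back(4);               // capacity unchanged: iterators before the
                                  // insertion point remain valid
    std::cout << *it << '\n';     // OK, prints 1

    std::vector<int> w{1, 2, 3};
    auto jt = w.begin();
    while (w.size() < w.capacity())
        w.push_back(0);           // filling up to capacity does not reallocate
    std::cout << *jt << '\n';     // still OK, prints 1
    w.push_back(42);              // exceeds the capacity: reallocation, jt is invalidated
    // std::cout << *jt << '\n';  // would be undefined behavior now
}
```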
Unless otherwise specified (either explicitly or by defining a function in terms of other functions), passing a container as an argument to a library function never invalidates iterators to, or changes the values of, objects within that container.

The past-the-end iterator deserves particular mention. In general this iterator is invalidated as though it were a normal iterator to a non-erased element. So `[std::set::end](container/set/end "cpp/container/set/end")` is never invalidated, `[std::unordered\_set::end](container/unordered_set/end "cpp/container/unordered set/end")` is invalidated only on rehash (since C++11), `[std::vector::end](container/vector/end "cpp/container/vector/end")` is always invalidated (since it is always after the modified elements), and so on. There is one exception: an erasure which deletes the last element of a `[std::deque](container/deque "cpp/container/deque")` *does* invalidate the past-the-end iterator, even though it is not an erased element of the container (or an element at all). Combined with the general rules for `[std::deque](container/deque "cpp/container/deque")` iterators, the net result is that the only modifying operation which does *not* invalidate `[std::deque::end](container/deque/end "cpp/container/deque/end")` is an erasure which deletes the first element, but not the last.

| | | | --- | --- | | Thread safety
1. All container functions can be called concurrently by different threads on different containers. More generally, the C++ standard library functions do not read objects accessible by other threads unless those objects are directly or indirectly accessible via the function arguments, including the this pointer.
2. All `const` member functions can be called concurrently by different threads on the same container. In addition, the member functions `begin()`, `end()`, `rbegin()`, `rend()`, `front()`, `back()`, `data()`, `find()`, `lower_bound()`, `upper_bound()`, `equal_range()`, `at()`, and, except in associative containers, `operator[]`, behave as `const` for the purposes of thread safety (that is, they can also be called concurrently by different threads on the same container). More generally, the C++ standard library functions do not modify objects unless those objects are accessible, directly or indirectly, via the function's non-const arguments, including the this pointer.
3. Different elements in the same container can be modified concurrently by different threads, except for the elements of `std::vector<bool>` (for example, a vector of `[std::future](thread/future "cpp/thread/future")` objects can be receiving values from multiple threads).
4. Iterator operations (e.g. incrementing an iterator) read, but do not modify the underlying container, and may be executed concurrently with operations on other iterators on the same container, with the const member functions, or reads from the elements. Container operations that invalidate any iterators modify the container and cannot be executed concurrently with any operations on existing iterators even if those iterators are not invalidated.
5. Elements of the same container can be modified concurrently with those member functions that are not specified to access these elements. More generally, the C++ standard library functions do not read objects indirectly accessible through their arguments (including other elements of a container) except when required by its specification.
6.
In any case, container operations (as well as algorithms, or any other C++ standard library functions) may be parallelized internally as long as this does not change the user-visible results (e.g. `[std::transform](algorithm/transform "cpp/algorithm/transform")` may be parallelized, but not `[std::for\_each](algorithm/for_each "cpp/algorithm/for each")` which is specified to visit each element of a sequence in order) | (since C++11) | ### Member function table | | | | --- | --- | | | - functions present in C++03 | | | - functions present since C++11 | | | - functions present since C++17 | | | - functions present since C++20 | | | | | | | | --- | --- | --- | --- | --- | | | Sequence containers | Associative containers | Unordered associative containers | Container adaptors | | Header | `[<array>](header/array "cpp/header/array")` | `[<vector>](header/vector "cpp/header/vector")` | `[<deque>](header/deque "cpp/header/deque")` | `[<forward\_list>](header/forward_list "cpp/header/forward list")` | `[<list>](header/list "cpp/header/list")` | `[<set>](header/set "cpp/header/set")` | `[<map>](header/map "cpp/header/map")` | `[<unordered\_set>](header/unordered_set "cpp/header/unordered set")` | `[<unordered\_map>](header/unordered_map "cpp/header/unordered map")` | `[<stack>](header/stack "cpp/header/stack")` | `[<queue>](header/queue "cpp/header/queue")` | | Container | [| | | --- | | `array` |](container/array "cpp/container/array") | [| | | --- | | `vector` |](container/vector "cpp/container/vector") | [| | | --- | | `deque` |](container/deque "cpp/container/deque") | [| | | --- | | `forward_list` |](container/forward_list "cpp/container/forward list") | [| | | --- | | `list` |](container/list "cpp/container/list") | [| | | --- | | `set` |](container/set "cpp/container/set") | [| | | --- | | `multiset` |](container/multiset "cpp/container/multiset") | [| | | --- | | `map` |](container/map "cpp/container/map") | [| | | --- | | `multimap` |](container/multimap "cpp/container/multimap") | [| | | --- | | `unordered_set` |](container/unordered_set "cpp/container/unordered set") | [| | | --- | | `unordered_multiset` |](container/unordered_multiset "cpp/container/unordered multiset") | [| | | --- | | `unordered_map` |](container/unordered_map "cpp/container/unordered map") | [| | | --- | | `unordered_multimap` |](container/unordered_multimap "cpp/container/unordered multimap") | [| | | --- | | `stack` |](container/stack "cpp/container/stack") | [| | | --- | | `queue` |](container/queue "cpp/container/queue") | [| | | --- | | `priority_queue` |](container/priority_queue "cpp/container/priority queue") | | | | | | --- | | `(constructor)` | | (implicit) | [| | | --- | | `vector` |](container/vector/vector "cpp/container/vector/vector") | [| | | --- | | `deque` |](container/deque/deque "cpp/container/deque/deque") | [| | | --- | | `forward_list` |](container/forward_list/forward_list "cpp/container/forward list/forward list") | [| | | --- | | `list` |](container/list/list "cpp/container/list/list") | [| | | --- | | `set` |](container/set/set "cpp/container/set/set") | [| | | --- | | `multiset` |](container/multiset/multiset "cpp/container/multiset/multiset") | [| | | --- | | `map` |](container/map/map "cpp/container/map/map") | [| | | --- | | `multimap` |](container/multimap/multimap "cpp/container/multimap/multimap") | [| | | --- | | `unordered_set` |](container/unordered_set/unordered_set "cpp/container/unordered set/unordered set") | [| | | --- | | `unordered_multiset` 
|](container/unordered_multiset/unordered_multiset "cpp/container/unordered multiset/unordered multiset") | [| | | --- | | `unordered_map` |](container/unordered_map/unordered_map "cpp/container/unordered map/unordered map") | [| | | --- | | `unordered_multimap` |](container/unordered_multimap/unordered_multimap "cpp/container/unordered multimap/unordered multimap") | [| | | --- | | `stack` |](container/stack/stack "cpp/container/stack/stack") | [| | | --- | | `queue` |](container/queue/queue "cpp/container/queue/queue") | [| | | --- | | `priority_queue` |](container/priority_queue/priority_queue "cpp/container/priority queue/priority queue") | | | | | --- | | `(destructor)` | | (implicit) | [| | | --- | | `~vector` |](container/vector/~vector "cpp/container/vector/~vector") | [| | | --- | | `~deque` |](container/deque/~deque "cpp/container/deque/~deque") | [| | | --- | | `~forward_list` |](container/forward_list/~forward_list "cpp/container/forward list/~forward list") | [| | | --- | | `~list` |](container/list/~list "cpp/container/list/~list") | [| | | --- | | `~set` |](container/set/~set "cpp/container/set/~set") | [| | | --- | | `~multiset` |](container/multiset/~multiset "cpp/container/multiset/~multiset") | [| | | --- | | `~map` |](container/map/~map "cpp/container/map/~map") | [| | | --- | | `~multimap` |](container/multimap/~multimap "cpp/container/multimap/~multimap") | [| | | --- | | `~unordered_set` |](container/unordered_set/~unordered_set "cpp/container/unordered set/~unordered set") | [| | | --- | | `~unordered_multiset` |](container/unordered_multiset/~unordered_multiset "cpp/container/unordered multiset/~unordered multiset") | [| | | --- | | `~unordered_map` |](container/unordered_map/~unordered_map "cpp/container/unordered map/~unordered map") | [| | | --- | | `~unordered_multimap` |](container/unordered_multimap/~unordered_multimap "cpp/container/unordered multimap/~unordered multimap") | [| | | --- | | `~stack` |](container/stack/~stack "cpp/container/stack/~stack") | [| | | --- | | `~queue` |](container/queue/~queue "cpp/container/queue/~queue") | [| | | --- | | `~priority_queue` |](container/priority_queue/~priority_queue "cpp/container/priority queue/~priority queue") | | | | | --- | | `operator=` | | (implicit) | [| | | --- | | `operator=` |](container/vector/operator= "cpp/container/vector/operator=") | [| | | --- | | `operator=` |](container/deque/operator= "cpp/container/deque/operator=") | [| | | --- | | `operator=` |](container/forward_list/operator= "cpp/container/forward list/operator=") | [| | | --- | | `operator=` |](container/list/operator= "cpp/container/list/operator=") | [| | | --- | | `operator=` |](container/set/operator= "cpp/container/set/operator=") | [| | | --- | | `operator=` |](container/multiset/operator= "cpp/container/multiset/operator=") | [| | | --- | | `operator=` |](container/map/operator= "cpp/container/map/operator=") | [| | | --- | | `operator=` |](container/multimap/operator= "cpp/container/multimap/operator=") | [| | | --- | | `operator=` |](container/unordered_set/operator= "cpp/container/unordered set/operator=") | [| | | --- | | `operator=` |](container/unordered_multiset/operator= "cpp/container/unordered multiset/operator=") | [| | | --- | | `operator=` |](container/unordered_map/operator= "cpp/container/unordered map/operator=") | [| | | --- | | `operator=` |](container/unordered_multimap/operator= "cpp/container/unordered multimap/operator=") | [| | | --- | | `operator=` |](container/stack/operator= 
"cpp/container/stack/operator=") | [| | | --- | | `operator=` |](container/queue/operator= "cpp/container/queue/operator=") | [| | | --- | | `operator=` |](container/priority_queue/operator= "cpp/container/priority queue/operator=") | | | | | --- | | `assign` | | | [| | | --- | | `assign` |](container/vector/assign "cpp/container/vector/assign") | [| | | --- | | `assign` |](container/deque/assign "cpp/container/deque/assign") | [| | | --- | | `assign` |](container/forward_list/assign "cpp/container/forward list/assign") | [| | | --- | | `assign` |](container/list/assign "cpp/container/list/assign") | | | | | | | | | | | | | Iterators | | | | --- | | `begin` | | `cbegin` | | [| | | --- | | `begin` | | `cbegin` |](container/array/begin "cpp/container/array/begin") | [| | | --- | | `begin` | | `cbegin` |](container/vector/begin "cpp/container/vector/begin") | [| | | --- | | `begin` | | `cbegin` |](container/deque/begin "cpp/container/deque/begin") | [| | | --- | | `begin` | | `cbegin` |](container/forward_list/begin "cpp/container/forward list/begin") | [| | | --- | | `begin` | | `cbegin` |](container/list/begin "cpp/container/list/begin") | [| | | --- | | `begin` | | `cbegin` |](container/set/begin "cpp/container/set/begin") | [| | | --- | | `begin` | | `cbegin` |](container/multiset/begin "cpp/container/multiset/begin") | [| | | --- | | `begin` | | `cbegin` |](container/map/begin "cpp/container/map/begin") | [| | | --- | | `begin` | | `cbegin` |](container/multimap/begin "cpp/container/multimap/begin") | [| | | --- | | `begin` | | `cbegin` |](container/unordered_set/begin "cpp/container/unordered set/begin") | [| | | --- | | `begin` | | `cbegin` |](container/unordered_multiset/begin "cpp/container/unordered multiset/begin") | [| | | --- | | `begin` | | `cbegin` |](container/unordered_map/begin "cpp/container/unordered map/begin") | [| | | --- | | `begin` | | `cbegin` |](container/unordered_multimap/begin "cpp/container/unordered multimap/begin") | | | | | | | | --- | | `end` | | `cend` | | [| | | --- | | `end` | | `cend` |](container/array/end "cpp/container/array/end") | [| | | --- | | `end` | | `cend` |](container/vector/end "cpp/container/vector/end") | [| | | --- | | `end` | | `cend` |](container/deque/end "cpp/container/deque/end") | [| | | --- | | `end` | | `cend` |](container/forward_list/end "cpp/container/forward list/end") | [| | | --- | | `end` | | `cend` |](container/list/end "cpp/container/list/end") | [| | | --- | | `end` | | `cend` |](container/set/end "cpp/container/set/end") | [| | | --- | | `end` | | `cend` |](container/multiset/end "cpp/container/multiset/end") | [| | | --- | | `end` | | `cend` |](container/map/end "cpp/container/map/end") | [| | | --- | | `end` | | `cend` |](container/multimap/end "cpp/container/multimap/end") | [| | | --- | | `end` | | `cend` |](container/unordered_set/end "cpp/container/unordered set/end") | [| | | --- | | `end` | | `cend` |](container/unordered_multiset/end "cpp/container/unordered multiset/end") | [| | | --- | | `end` | | `cend` |](container/unordered_map/end "cpp/container/unordered map/end") | [| | | --- | | `end` | | `cend` |](container/unordered_multimap/end "cpp/container/unordered multimap/end") | | | | | | | | --- | | `rbegin` | | `crbegin` | | [| | | --- | | `rbegin` | | `crbegin` |](container/array/rbegin "cpp/container/array/rbegin") | [| | | --- | | `rbegin` | | `crbegin` |](container/vector/rbegin "cpp/container/vector/rbegin") | [| | | --- | | `rbegin` | | `crbegin` |](container/deque/rbegin "cpp/container/deque/rbegin") 
| | [| | | --- | | `rbegin` | | `crbegin` |](container/list/rbegin "cpp/container/list/rbegin") | [| | | --- | | `rbegin` | | `crbegin` |](container/set/rbegin "cpp/container/set/rbegin") | [| | | --- | | `rbegin` | | `crbegin` |](container/multiset/rbegin "cpp/container/multiset/rbegin") | [| | | --- | | `rbegin` | | `crbegin` |](container/map/rbegin "cpp/container/map/rbegin") | [| | | --- | | `rbegin` | | `crbegin` |](container/multimap/rbegin "cpp/container/multimap/rbegin") | | | | | | | | | | | | --- | | `rend` | | `crend` | | [| | | --- | | `rend` | | `crend` |](container/array/rend "cpp/container/array/rend") | [| | | --- | | `rend` | | `crend` |](container/vector/rend "cpp/container/vector/rend") | [| | | --- | | `rend` | | `crend` |](container/deque/rend "cpp/container/deque/rend") | | [| | | --- | | `rend` | | `crend` |](container/list/rend "cpp/container/list/rend") | [| | | --- | | `rend` | | `crend` |](container/set/rend "cpp/container/set/rend") | [| | | --- | | `rend` | | `crend` |](container/multiset/rend "cpp/container/multiset/rend") | [| | | --- | | `rend` | | `crend` |](container/map/rend "cpp/container/map/rend") | [| | | --- | | `rend` | | `crend` |](container/multimap/rend "cpp/container/multimap/rend") | | | | | | | | | Element access | | | | --- | | `at` | | [| | | --- | | `at` |](container/array/at "cpp/container/array/at") | [| | | --- | | `at` |](container/vector/at "cpp/container/vector/at") | [| | | --- | | `at` |](container/deque/at "cpp/container/deque/at") | | | | | [| | | --- | | `at` |](container/map/at "cpp/container/map/at") | | | | [| | | --- | | `at` |](container/unordered_map/at "cpp/container/unordered map/at") | | | | | | | | | --- | | `operator[]` | | [| | | --- | | `operator[]` |](container/array/operator_at "cpp/container/array/operator at") | [| | | --- | | `operator[]` |](container/vector/operator_at "cpp/container/vector/operator at") | [| | | --- | | `operator[]` |](container/deque/operator_at "cpp/container/deque/operator at") | | | | | [| | | --- | | `operator[]` |](container/map/operator_at "cpp/container/map/operator at") | | | | [| | | --- | | `operator[]` |](container/unordered_map/operator_at "cpp/container/unordered map/operator at") | | | | | | | | | --- | | `data` | | [| | | --- | | `data` |](container/array/data "cpp/container/array/data") | [| | | --- | | `data` |](container/vector/data "cpp/container/vector/data") | | | | | | | | | | | | | | | | | | | --- | | `front` | | [| | | --- | | `front` |](container/array/front "cpp/container/array/front") | [| | | --- | | `front` |](container/vector/front "cpp/container/vector/front") | [| | | --- | | `front` |](container/deque/front "cpp/container/deque/front") | [| | | --- | | `front` |](container/forward_list/front "cpp/container/forward list/front") | [| | | --- | | `front` |](container/list/front "cpp/container/list/front") | | | | | | | | | | [| | | --- | | `front` |](container/queue/front "cpp/container/queue/front") | [| | | --- | | `top` |](container/priority_queue/top "cpp/container/priority queue/top") | | | | | --- | | `back` | | [| | | --- | | `back` |](container/array/back "cpp/container/array/back") | [| | | --- | | `back` |](container/vector/back "cpp/container/vector/back") | [| | | --- | | `back` |](container/deque/back "cpp/container/deque/back") | | [| | | --- | | `back` |](container/list/back "cpp/container/list/back") | | | | | | | | | [| | | --- | | `top` |](container/stack/top "cpp/container/stack/top") | [| | | --- | | `back` |](container/queue/back 
"cpp/container/queue/back") | | | Capacity | | | | --- | | `empty` | | [| | | --- | | `empty` |](container/array/empty "cpp/container/array/empty") | [| | | --- | | `empty` |](container/vector/empty "cpp/container/vector/empty") | [| | | --- | | `empty` |](container/deque/empty "cpp/container/deque/empty") | [| | | --- | | `empty` |](container/forward_list/empty "cpp/container/forward list/empty") | [| | | --- | | `empty` |](container/list/empty "cpp/container/list/empty") | [| | | --- | | `empty` |](container/set/empty "cpp/container/set/empty") | [| | | --- | | `empty` |](container/multiset/empty "cpp/container/multiset/empty") | [| | | --- | | `empty` |](container/map/empty "cpp/container/map/empty") | [| | | --- | | `empty` |](container/multimap/empty "cpp/container/multimap/empty") | [| | | --- | | `empty` |](container/unordered_set/empty "cpp/container/unordered set/empty") | [| | | --- | | `empty` |](container/unordered_multiset/empty "cpp/container/unordered multiset/empty") | [| | | --- | | `empty` |](container/unordered_map/empty "cpp/container/unordered map/empty") | [| | | --- | | `empty` |](container/unordered_multimap/empty "cpp/container/unordered multimap/empty") | [| | | --- | | `empty` |](container/stack/empty "cpp/container/stack/empty") | [| | | --- | | `empty` |](container/queue/empty "cpp/container/queue/empty") | [| | | --- | | `empty` |](container/priority_queue/empty "cpp/container/priority queue/empty") | | | | | --- | | `size` | | [| | | --- | | `size` |](container/array/size "cpp/container/array/size") | [| | | --- | | `size` |](container/vector/size "cpp/container/vector/size") | [| | | --- | | `size` |](container/deque/size "cpp/container/deque/size") | | [| | | --- | | `size` |](container/list/size "cpp/container/list/size") | [| | | --- | | `size` |](container/set/size "cpp/container/set/size") | [| | | --- | | `size` |](container/multiset/size "cpp/container/multiset/size") | [| | | --- | | `size` |](container/map/size "cpp/container/map/size") | [| | | --- | | `size` |](container/multimap/size "cpp/container/multimap/size") | [| | | --- | | `size` |](container/unordered_set/size "cpp/container/unordered set/size") | [| | | --- | | `size` |](container/unordered_multiset/size "cpp/container/unordered multiset/size") | [| | | --- | | `size` |](container/unordered_map/size "cpp/container/unordered map/size") | [| | | --- | | `size` |](container/unordered_multimap/size "cpp/container/unordered multimap/size") | [| | | --- | | `size` |](container/stack/size "cpp/container/stack/size") | [| | | --- | | `size` |](container/queue/size "cpp/container/queue/size") | [| | | --- | | `size` |](container/priority_queue/size "cpp/container/priority queue/size") | | | | | --- | | `max_size` | | [| | | --- | | `max_size` |](container/array/max_size "cpp/container/array/max size") | [| | | --- | | `max_size` |](container/vector/max_size "cpp/container/vector/max size") | [| | | --- | | `max_size` |](container/deque/max_size "cpp/container/deque/max size") | [| | | --- | | `max_size` |](container/forward_list/max_size "cpp/container/forward list/max size") | [| | | --- | | `max_size` |](container/list/max_size "cpp/container/list/max size") | [| | | --- | | `max_size` |](container/set/max_size "cpp/container/set/max size") | [| | | --- | | `max_size` |](container/multiset/max_size "cpp/container/multiset/max size") | [| | | --- | | `max_size` |](container/map/max_size "cpp/container/map/max size") | [| | | --- | | `max_size` |](container/multimap/max_size 
"cpp/container/multimap/max size") | [| | | --- | | `max_size` |](container/unordered_set/max_size "cpp/container/unordered set/max size") | [| | | --- | | `max_size` |](container/unordered_multiset/max_size "cpp/container/unordered multiset/max size") | [| | | --- | | `max_size` |](container/unordered_map/max_size "cpp/container/unordered map/max size") | [| | | --- | | `max_size` |](container/unordered_multimap/max_size "cpp/container/unordered multimap/max size") | | | | | | | | --- | | `resize` | | | [| | | --- | | `resize` |](container/vector/resize "cpp/container/vector/resize") | [| | | --- | | `resize` |](container/deque/resize "cpp/container/deque/resize") | [| | | --- | | `resize` |](container/forward_list/resize "cpp/container/forward list/resize") | [| | | --- | | `resize` |](container/list/resize "cpp/container/list/resize") | | | | | | | | | | | | | | | | --- | | `capacity` | | | [| | | --- | | `capacity` |](container/vector/capacity "cpp/container/vector/capacity") | | | | | | | | [| | | --- | | `bucket_count` |](container/unordered_set/bucket_count "cpp/container/unordered set/bucket count") | [| | | --- | | `bucket_count` |](container/unordered_multiset/bucket_count "cpp/container/unordered multiset/bucket count") | [| | | --- | | `bucket_count` |](container/unordered_map/bucket_count "cpp/container/unordered map/bucket count") | [| | | --- | | `bucket_count` |](container/unordered_multimap/bucket_count "cpp/container/unordered multimap/bucket count") | | | | | | | | --- | | `reserve` | | | [| | | --- | | `reserve` |](container/vector/reserve "cpp/container/vector/reserve") | | | | | | | | [| | | --- | | `reserve` |](container/unordered_set/reserve "cpp/container/unordered set/reserve") | [| | | --- | | `reserve` |](container/unordered_multiset/reserve "cpp/container/unordered multiset/reserve") | [| | | --- | | `reserve` |](container/unordered_map/reserve "cpp/container/unordered map/reserve") | [| | | --- | | `reserve` |](container/unordered_multimap/reserve "cpp/container/unordered multimap/reserve") | | | | | | | | --- | | `shrink_to_fit` | | | [| | | --- | | `shrink_to_fit` |](container/vector/shrink_to_fit "cpp/container/vector/shrink to fit") | [| | | --- | | `shrink_to_fit` |](container/deque/shrink_to_fit "cpp/container/deque/shrink to fit") | | | | | | | | | | | | | | | Modifiers | | | | --- | | `clear` | | | [| | | --- | | `clear` |](container/vector/clear "cpp/container/vector/clear") | [| | | --- | | `clear` |](container/deque/clear "cpp/container/deque/clear") | [| | | --- | | `clear` |](container/forward_list/clear "cpp/container/forward list/clear") | [| | | --- | | `clear` |](container/list/clear "cpp/container/list/clear") | [| | | --- | | `clear` |](container/set/clear "cpp/container/set/clear") | [| | | --- | | `clear` |](container/multiset/clear "cpp/container/multiset/clear") | [| | | --- | | `clear` |](container/map/clear "cpp/container/map/clear") | [| | | --- | | `clear` |](container/multimap/clear "cpp/container/multimap/clear") | [| | | --- | | `clear` |](container/unordered_set/clear "cpp/container/unordered set/clear") | [| | | --- | | `clear` |](container/unordered_multiset/clear "cpp/container/unordered multiset/clear") | [| | | --- | | `clear` |](container/unordered_map/clear "cpp/container/unordered map/clear") | [| | | --- | | `clear` |](container/unordered_multimap/clear "cpp/container/unordered multimap/clear") | | | | | | | | --- | | `insert` | | | [| | | --- | | `insert` |](container/vector/insert "cpp/container/vector/insert") | [| | 
| --- | | `insert` |](container/deque/insert "cpp/container/deque/insert") | [| | | --- | | `insert_after` |](container/forward_list/insert_after "cpp/container/forward list/insert after") | [| | | --- | | `insert` |](container/list/insert "cpp/container/list/insert") | [| | | --- | | `insert` |](container/set/insert "cpp/container/set/insert") | [| | | --- | | `insert` |](container/multiset/insert "cpp/container/multiset/insert") | [| | | --- | | `insert` |](container/map/insert "cpp/container/map/insert") | [| | | --- | | `insert` |](container/multimap/insert "cpp/container/multimap/insert") | [| | | --- | | `insert` |](container/unordered_set/insert "cpp/container/unordered set/insert") | [| | | --- | | `insert` |](container/unordered_multiset/insert "cpp/container/unordered multiset/insert") | [| | | --- | | `insert` |](container/unordered_map/insert "cpp/container/unordered map/insert") | [| | | --- | | `insert` |](container/unordered_multimap/insert "cpp/container/unordered multimap/insert") | | | | | | | | --- | | `insert_or_assign` | | | | | | | | | [| | | --- | | `insert_or_assign` |](container/map/insert_or_assign "cpp/container/map/insert or assign") | | | | [| | | --- | | `insert_or_assign` |](container/unordered_map/insert_or_assign "cpp/container/unordered map/insert or assign") | | | | | | | | | --- | | `emplace` | | | [| | | --- | | `emplace` |](container/vector/emplace "cpp/container/vector/emplace") | [| | | --- | | `emplace` |](container/deque/emplace "cpp/container/deque/emplace") | [| | | --- | | `emplace_after` |](container/forward_list/emplace_after "cpp/container/forward list/emplace after") | [| | | --- | | `emplace` |](container/list/emplace "cpp/container/list/emplace") | [| | | --- | | `emplace` |](container/set/emplace "cpp/container/set/emplace") | [| | | --- | | `emplace` |](container/multiset/emplace "cpp/container/multiset/emplace") | [| | | --- | | `emplace` |](container/map/emplace "cpp/container/map/emplace") | [| | | --- | | `emplace` |](container/multimap/emplace "cpp/container/multimap/emplace") | [| | | --- | | `emplace` |](container/unordered_set/emplace "cpp/container/unordered set/emplace") | [| | | --- | | `emplace` |](container/unordered_multiset/emplace "cpp/container/unordered multiset/emplace") | [| | | --- | | `emplace` |](container/unordered_map/emplace "cpp/container/unordered map/emplace") | [| | | --- | | `emplace` |](container/unordered_multimap/emplace "cpp/container/unordered multimap/emplace") | | | | | | | | --- | | `emplace_hint` | | | | | | | [| | | --- | | `emplace_hint` |](container/set/emplace_hint "cpp/container/set/emplace hint") | [| | | --- | | `emplace_hint` |](container/multiset/emplace_hint "cpp/container/multiset/emplace hint") | [| | | --- | | `emplace_hint` |](container/map/emplace_hint "cpp/container/map/emplace hint") | [| | | --- | | `emplace_hint` |](container/multimap/emplace_hint "cpp/container/multimap/emplace hint") | [| | | --- | | `emplace_hint` |](container/unordered_set/emplace_hint "cpp/container/unordered set/emplace hint") | [| | | --- | | `emplace_hint` |](container/unordered_multiset/emplace_hint "cpp/container/unordered multiset/emplace hint") | [| | | --- | | `emplace_hint` |](container/unordered_map/emplace_hint "cpp/container/unordered map/emplace hint") | [| | | --- | | `emplace_hint` |](container/unordered_multimap/emplace_hint "cpp/container/unordered multimap/emplace hint") | | | | | | | | --- | | `try_emplace` | | | | | | | | | [| | | --- | | `try_emplace` |](container/map/try_emplace 
"cpp/container/map/try emplace") | | | | [| | | --- | | `try_emplace` |](container/unordered_map/try_emplace "cpp/container/unordered map/try emplace") | | | | | | | | | --- | | `erase` | | | [| | | --- | | `erase` |](container/vector/erase "cpp/container/vector/erase") | [| | | --- | | `erase` |](container/deque/erase "cpp/container/deque/erase") | [| | | --- | | `erase_after` |](container/forward_list/erase_after "cpp/container/forward list/erase after") | [| | | --- | | `erase` |](container/list/erase "cpp/container/list/erase") | [| | | --- | | `erase` |](container/set/erase "cpp/container/set/erase") | [| | | --- | | `erase` |](container/multiset/erase "cpp/container/multiset/erase") | [| | | --- | | `erase` |](container/map/erase "cpp/container/map/erase") | [| | | --- | | `erase` |](container/multimap/erase "cpp/container/multimap/erase") | [| | | --- | | `erase` |](container/unordered_set/erase "cpp/container/unordered set/erase") | [| | | --- | | `erase` |](container/unordered_multiset/erase "cpp/container/unordered multiset/erase") | [| | | --- | | `erase` |](container/unordered_map/erase "cpp/container/unordered map/erase") | [| | | --- | | `erase` |](container/unordered_multimap/erase "cpp/container/unordered multimap/erase") | | | | | | | | --- | | `push_front` | | | | [| | | --- | | `push_front` |](container/deque/push_front "cpp/container/deque/push front") | [| | | --- | | `push_front` |](container/forward_list/push_front "cpp/container/forward list/push front") | [| | | --- | | `push_front` |](container/list/push_front "cpp/container/list/push front") | | | | | | | | | | | | | | | | --- | | `emplace_front` | | | | [| | | --- | | `emplace_front` |](container/deque/emplace_front "cpp/container/deque/emplace front") | [| | | --- | | `emplace_front` |](container/forward_list/emplace_front "cpp/container/forward list/emplace front") | [| | | --- | | `emplace_front` |](container/list/emplace_front "cpp/container/list/emplace front") | | | | | | | | | | | | | | | | --- | | `pop_front` | | | | [| | | --- | | `pop_front` |](container/deque/pop_front "cpp/container/deque/pop front") | [| | | --- | | `pop_front` |](container/forward_list/pop_front "cpp/container/forward list/pop front") | [| | | --- | | `pop_front` |](container/list/pop_front "cpp/container/list/pop front") | | | | | | | | | | [| | | --- | | `pop` |](container/queue/pop "cpp/container/queue/pop") | [| | | --- | | `pop` |](container/priority_queue/pop "cpp/container/priority queue/pop") | | | | | --- | | `push_back` | | | [| | | --- | | `push_back` |](container/vector/push_back "cpp/container/vector/push back") | [| | | --- | | `push_back` |](container/deque/push_back "cpp/container/deque/push back") | | [| | | --- | | `push_back` |](container/list/push_back "cpp/container/list/push back") | | | | | | | | | [| | | --- | | `push` |](container/stack/push "cpp/container/stack/push") | [| | | --- | | `push` |](container/queue/push "cpp/container/queue/push") | [| | | --- | | `push` |](container/priority_queue/push "cpp/container/priority queue/push") | | | | | --- | | `emplace_back` | | | [| | | --- | | `emplace_back` |](container/vector/emplace_back "cpp/container/vector/emplace back") | [| | | --- | | `emplace_back` |](container/deque/emplace_back "cpp/container/deque/emplace back") | | [| | | --- | | `emplace_back` |](container/list/emplace_back "cpp/container/list/emplace back") | | | | | | | | | [| | | --- | | `emplace` |](container/stack/emplace "cpp/container/stack/emplace") | [| | | --- | | `emplace` 
|](container/queue/emplace "cpp/container/queue/emplace") | [| | | --- | | `emplace` |](container/priority_queue/emplace "cpp/container/priority queue/emplace") | | | | | --- | | `pop_back` | | | [| | | --- | | `pop_back` |](container/vector/pop_back "cpp/container/vector/pop back") | [| | | --- | | `pop_back` |](container/deque/pop_back "cpp/container/deque/pop back") | | [| | | --- | | `pop_back` |](container/list/pop_back "cpp/container/list/pop back") | | | | | | | | | [| | | --- | | `pop` |](container/stack/pop "cpp/container/stack/pop") | | | | | | | --- | | `swap` | | [| | | --- | | `swap` |](container/array/swap "cpp/container/array/swap") | [| | | --- | | `swap` |](container/vector/swap "cpp/container/vector/swap") | [| | | --- | | `swap` |](container/deque/swap "cpp/container/deque/swap") | [| | | --- | | `swap` |](container/forward_list/swap "cpp/container/forward list/swap") | [| | | --- | | `swap` |](container/list/swap "cpp/container/list/swap") | [| | | --- | | `swap` |](container/set/swap "cpp/container/set/swap") | [| | | --- | | `swap` |](container/multiset/swap "cpp/container/multiset/swap") | [| | | --- | | `swap` |](container/map/swap "cpp/container/map/swap") | [| | | --- | | `swap` |](container/multimap/swap "cpp/container/multimap/swap") | [| | | --- | | `swap` |](container/unordered_set/swap "cpp/container/unordered set/swap") | [| | | --- | | `swap` |](container/unordered_multiset/swap "cpp/container/unordered multiset/swap") | [| | | --- | | `swap` |](container/unordered_map/swap "cpp/container/unordered map/swap") | [| | | --- | | `swap` |](container/unordered_multimap/swap "cpp/container/unordered multimap/swap") | [| | | --- | | `swap` |](container/stack/swap "cpp/container/stack/swap") | [| | | --- | | `swap` |](container/queue/swap "cpp/container/queue/swap") | [| | | --- | | `swap` |](container/priority_queue/swap "cpp/container/priority queue/swap") | | | | | --- | | `merge` | | | | | [| | | --- | | `merge` |](container/forward_list/merge "cpp/container/forward list/merge") | [| | | --- | | `merge` |](container/list/merge "cpp/container/list/merge") | [| | | --- | | `merge` |](container/set/merge "cpp/container/set/merge") | [| | | --- | | `merge` |](container/multiset/merge "cpp/container/multiset/merge") | [| | | --- | | `merge` |](container/map/merge "cpp/container/map/merge") | [| | | --- | | `merge` |](container/multimap/merge "cpp/container/multimap/merge") | [| | | --- | | `merge` |](container/unordered_set/merge "cpp/container/unordered set/merge") | [| | | --- | | `merge` |](container/unordered_multiset/merge "cpp/container/unordered multiset/merge") | [| | | --- | | `merge` |](container/unordered_map/merge "cpp/container/unordered map/merge") | [| | | --- | | `merge` |](container/unordered_multimap/merge "cpp/container/unordered multimap/merge") | | | | | | | | --- | | `extract` | | | | | | | [| | | --- | | `extract` |](container/set/extract "cpp/container/set/extract") | [| | | --- | | `extract` |](container/multiset/extract "cpp/container/multiset/extract") | [| | | --- | | `extract` |](container/map/extract "cpp/container/map/extract") | [| | | --- | | `extract` |](container/multimap/extract "cpp/container/multimap/extract") | [| | | --- | | `extract` |](container/unordered_set/extract "cpp/container/unordered set/extract") | [| | | --- | | `extract` |](container/unordered_multiset/extract "cpp/container/unordered multiset/extract") | [| | | --- | | `extract` |](container/unordered_map/extract "cpp/container/unordered map/extract") | [| | | 
--- | | `extract` |](container/unordered_multimap/extract "cpp/container/unordered multimap/extract") | | | | | List operations | | | | --- | | `splice` | | | | | [| | | --- | | `splice_after` |](container/forward_list/splice_after "cpp/container/forward list/splice after") | [| | | --- | | `splice` |](container/list/splice "cpp/container/list/splice") | | | | | | | | | | | | | | | | --- | | `remove` | | | | | [| | | --- | | `remove` |](container/forward_list/remove "cpp/container/forward list/remove") | [| | | --- | | `remove` |](container/list/remove "cpp/container/list/remove") | | | | | | | | | | | | | | | | --- | | `remove_if` | | | | | [| | | --- | | `remove_if` |](container/forward_list/remove "cpp/container/forward list/remove") | [| | | --- | | `remove_if` |](container/list/remove "cpp/container/list/remove") | | | | | | | | | | | | | | | | --- | | `reverse` | | | | | [| | | --- | | `reverse` |](container/forward_list/reverse "cpp/container/forward list/reverse") | [| | | --- | | `reverse` |](container/list/reverse "cpp/container/list/reverse") | | | | | | | | | | | | | | | | --- | | `unique` | | | | | [| | | --- | | `unique` |](container/forward_list/unique "cpp/container/forward list/unique") | [| | | --- | | `unique` |](container/list/unique "cpp/container/list/unique") | | | | | | | | | | | | | | | | --- | | `sort` | | | | | [| | | --- | | `sort` |](container/forward_list/sort "cpp/container/forward list/sort") | [| | | --- | | `sort` |](container/list/sort "cpp/container/list/sort") | | | | | | | | | | | | | Lookup | | | | --- | | `count` | | | | | | | [| | | --- | | `count` |](container/set/count "cpp/container/set/count") | [| | | --- | | `count` |](container/multiset/count "cpp/container/multiset/count") | [| | | --- | | `count` |](container/map/count "cpp/container/map/count") | [| | | --- | | `count` |](container/multimap/count "cpp/container/multimap/count") | [| | | --- | | `count` |](container/unordered_set/count "cpp/container/unordered set/count") | [| | | --- | | `count` |](container/unordered_multiset/count "cpp/container/unordered multiset/count") | [| | | --- | | `count` |](container/unordered_map/count "cpp/container/unordered map/count") | [| | | --- | | `count` |](container/unordered_multimap/count "cpp/container/unordered multimap/count") | | | | | | | | --- | | `find` | | | | | | | [| | | --- | | `find` |](container/set/find "cpp/container/set/find") | [| | | --- | | `find` |](container/multiset/find "cpp/container/multiset/find") | [| | | --- | | `find` |](container/map/find "cpp/container/map/find") | [| | | --- | | `find` |](container/multimap/find "cpp/container/multimap/find") | [| | | --- | | `find` |](container/unordered_set/find "cpp/container/unordered set/find") | [| | | --- | | `find` |](container/unordered_multiset/find "cpp/container/unordered multiset/find") | [| | | --- | | `find` |](container/unordered_map/find "cpp/container/unordered map/find") | [| | | --- | | `find` |](container/unordered_multimap/find "cpp/container/unordered multimap/find") | | | | | | | | --- | | `contains` | | | | | | | [| | | --- | | `contains` |](container/set/contains "cpp/container/set/contains") | [| | | --- | | `contains` |](container/multiset/contains "cpp/container/multiset/contains") | [| | | --- | | `contains` |](container/map/contains "cpp/container/map/contains") | [| | | --- | | `contains` |](container/multimap/contains "cpp/container/multimap/contains") | [| | | --- | | `contains` |](container/unordered_set/contains "cpp/container/unordered 
set/contains") | [| | | --- | | `contains` |](container/unordered_multiset/contains "cpp/container/unordered multiset/contains") | [| | | --- | | `contains` |](container/unordered_map/contains "cpp/container/unordered map/contains") | [| | | --- | | `contains` |](container/unordered_multimap/contains "cpp/container/unordered multimap/contains") | | | | | | | | --- | | `lower_bound` | | | | | | | [| | | --- | | `lower_bound` |](container/set/lower_bound "cpp/container/set/lower bound") | [| | | --- | | `lower_bound` |](container/multiset/lower_bound "cpp/container/multiset/lower bound") | [| | | --- | | `lower_bound` |](container/map/lower_bound "cpp/container/map/lower bound") | [| | | --- | | `lower_bound` |](container/multimap/lower_bound "cpp/container/multimap/lower bound") | | | | | | | | | | | | --- | | `upper_bound` | | | | | | | [| | | --- | | `upper_bound` |](container/set/upper_bound "cpp/container/set/upper bound") | [| | | --- | | `upper_bound` |](container/multiset/upper_bound "cpp/container/multiset/upper bound") | [| | | --- | | `upper_bound` |](container/map/upper_bound "cpp/container/map/upper bound") | [| | | --- | | `upper_bound` |](container/multimap/upper_bound "cpp/container/multimap/upper bound") | | | | | | | | | | | | --- | | `equal_range` | | | | | | | [| | | --- | | `equal_range` |](container/set/equal_range "cpp/container/set/equal range") | [| | | --- | | `equal_range` |](container/multiset/equal_range "cpp/container/multiset/equal range") | [| | | --- | | `equal_range` |](container/map/equal_range "cpp/container/map/equal range") | [| | | --- | | `equal_range` |](container/multimap/equal_range "cpp/container/multimap/equal range") | [| | | --- | | `equal_range` |](container/unordered_set/equal_range "cpp/container/unordered set/equal range") | [| | | --- | | `equal_range` |](container/unordered_multiset/equal_range "cpp/container/unordered multiset/equal range") | [| | | --- | | `equal_range` |](container/unordered_map/equal_range "cpp/container/unordered map/equal range") | [| | | --- | | `equal_range` |](container/unordered_multimap/equal_range "cpp/container/unordered multimap/equal range") | | | | | Observers | | | | --- | | `key_comp` | | | | | | | [| | | --- | | `key_comp` |](container/set/key_comp "cpp/container/set/key comp") | [| | | --- | | `key_comp` |](container/multiset/key_comp "cpp/container/multiset/key comp") | [| | | --- | | `key_comp` |](container/map/key_comp "cpp/container/map/key comp") | [| | | --- | | `key_comp` |](container/multimap/key_comp "cpp/container/multimap/key comp") | | | | | | | | | | | | --- | | `value_comp` | | | | | | | [| | | --- | | `value_comp` |](container/set/value_comp "cpp/container/set/value comp") | [| | | --- | | `value_comp` |](container/multiset/value_comp "cpp/container/multiset/value comp") | [| | | --- | | `value_comp` |](container/map/value_comp "cpp/container/map/value comp") | [| | | --- | | `value_comp` |](container/multimap/value_comp "cpp/container/multimap/value comp") | | | | | | | | | | | | --- | | `hash_function` | | | | | | | | | | | [| | | --- | | `hash_function` |](container/unordered_set/hash_function "cpp/container/unordered set/hash function") | [| | | --- | | `hash_function` |](container/unordered_multiset/hash_function "cpp/container/unordered multiset/hash function") | [| | | --- | | `hash_function` |](container/unordered_map/hash_function "cpp/container/unordered map/hash function") | [| | | --- | | `hash_function` |](container/unordered_multimap/hash_function "cpp/container/unordered 
multimap/hash function") | | | | | | | | --- | | `key_eq` | | | | | | | | | | | [| | | --- | | `key_eq` |](container/unordered_set/key_eq "cpp/container/unordered set/key eq") | [| | | --- | | `key_eq` |](container/unordered_multiset/key_eq "cpp/container/unordered multiset/key eq") | [| | | --- | | `key_eq` |](container/unordered_map/key_eq "cpp/container/unordered map/key eq") | [| | | --- | | `key_eq` |](container/unordered_multimap/key_eq "cpp/container/unordered multimap/key eq") | | | | | Allocator | | | | --- | | `get_allocator` | | | [| | | --- | | `get_allocator` |](container/vector/get_allocator "cpp/container/vector/get allocator") | [| | | --- | | `get_allocator` |](container/deque/get_allocator "cpp/container/deque/get allocator") | [| | | --- | | `get_allocator` |](container/forward_list/get_allocator "cpp/container/forward list/get allocator") | [| | | --- | | `get_allocator` |](container/list/get_allocator "cpp/container/list/get allocator") | [| | | --- | | `get_allocator` |](container/set/get_allocator "cpp/container/set/get allocator") | [| | | --- | | `get_allocator` |](container/multiset/get_allocator "cpp/container/multiset/get allocator") | [| | | --- | | `get_allocator` |](container/map/get_allocator "cpp/container/map/get allocator") | [| | | --- | | `get_allocator` |](container/multimap/get_allocator "cpp/container/multimap/get allocator") | [| | | --- | | `get_allocator` |](container/unordered_set/get_allocator "cpp/container/unordered set/get allocator") | [| | | --- | | `get_allocator` |](container/unordered_multiset/get_allocator "cpp/container/unordered multiset/get allocator") | [| | | --- | | `get_allocator` |](container/unordered_map/get_allocator "cpp/container/unordered map/get allocator") | [| | | --- | | `get_allocator` |](container/unordered_multimap/get_allocator "cpp/container/unordered multimap/get allocator") | | | | | Container | [| | | --- | | `array` |](container/array "cpp/container/array") | [| | | --- | | `vector` |](container/vector "cpp/container/vector") | [| | | --- | | `deque` |](container/deque "cpp/container/deque") | [| | | --- | | `forward_list` |](container/forward_list "cpp/container/forward list") | [| | | --- | | `list` |](container/list "cpp/container/list") | [| | | --- | | `set` |](container/set "cpp/container/set") | [| | | --- | | `multiset` |](container/multiset "cpp/container/multiset") | [| | | --- | | `map` |](container/map "cpp/container/map") | [| | | --- | | `multimap` |](container/multimap "cpp/container/multimap") | [| | | --- | | `unordered_set` |](container/unordered_set "cpp/container/unordered set") | [| | | --- | | `unordered_multiset` |](container/unordered_multiset "cpp/container/unordered multiset") | [| | | --- | | `unordered_map` |](container/unordered_map "cpp/container/unordered map") | [| | | --- | | `unordered_multimap` |](container/unordered_multimap "cpp/container/unordered multimap") | [| | | --- | | `stack` |](container/stack "cpp/container/stack") | [| | | --- | | `queue` |](container/queue "cpp/container/queue") | [| | | --- | | `priority_queue` |](container/priority_queue "cpp/container/priority queue") | | | Sequence containers | Associative containers | Unordered associative containers | Container adaptors | ### Non-member function table ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior |
| --- | --- | --- | --- |
| [LWG 51](https://cplusplus.github.io/LWG/issue51) | C++98 | container iterators might be invalidated by arbitrary library operation | they are only invalidated when specified |

### See also

C++ named requirements:

* [Container](named_req/container "cpp/named req/Container")
* [SequenceContainer](named_req/sequencecontainer "cpp/named req/SequenceContainer")
* [ContiguousContainer](named_req/contiguouscontainer "cpp/named req/ContiguousContainer")
* [ReversibleContainer](named_req/reversiblecontainer "cpp/named req/ReversibleContainer")
* [AssociativeContainer](named_req/associativecontainer "cpp/named req/AssociativeContainer")
* [AllocatorAwareContainer](named_req/allocatorawarecontainer "cpp/named req/AllocatorAwareContainer")
* [UnorderedAssociativeContainer](named_req/unorderedassociativecontainer "cpp/named req/UnorderedAssociativeContainer")

| | |
| --- | --- |
| [valarray](numeric/valarray "cpp/numeric/valarray") | numeric arrays, array masks and array slices (class template) |
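A minimal sketch exercising a few of the member functions catalogued above (iterators, element access, capacity, modifiers, lookup) on a `vector`, a `set`, and a `map`; all calls shown are standard library members, the specific values are only illustrative:

```
#include <iostream>
#include <map>
#include <set>
#include <string>
#include <vector>

int main()
{
    // Sequence container: push_back (Modifiers), erase, size (Capacity), operator[] (Element access)
    std::vector<int> v{1, 2, 3};
    v.push_back(4);                     // v is now {1, 2, 3, 4}
    v.erase(v.begin());                 // v is now {2, 3, 4}
    std::cout << v.size() << ' ' << v[0] << '\n';           // prints "3 2"

    // Associative container: insert, find (Lookup), erase, begin/end (Iterators)
    std::set<int> s{3, 1, 4};
    s.insert(1);                        // duplicate key: the set is unchanged
    if (s.find(4) != s.end())
        s.erase(4);                     // s is now {1, 3}

    // map: operator[] inserts a default value for a missing key; at() does not
    std::map<std::string, int> m;
    m["answer"] = 42;
    std::cout << s.size() << ' ' << m.at("answer") << '\n'; // prints "2 42"
}
```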
cpp Strings library

Strings library
===============

The C++ strings library includes support for three general types of strings:

* `[std::basic\_string](string/basic_string "cpp/string/basic string")` - a templated class designed to manipulate strings of any character type.
* `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` (C++17) - a lightweight non-owning read-only view into a subsequence of a string.
* Null-terminated strings - arrays of characters terminated by a special *null* character.

### `[std::basic\_string](string/basic_string "cpp/string/basic string")`

The templated class `[std::basic\_string](string/basic_string "cpp/string/basic string")` generalizes how sequences of characters are manipulated and stored. String creation, manipulation, and destruction are all handled by a convenient set of class methods and related functions.

Several specializations of `[std::basic\_string](string/basic_string "cpp/string/basic string")` are provided for commonly-used types:

| Defined in header `[<string>](header/string "cpp/header/string")` | |
| --- | --- |
| Type | Definition |
| `[std::string](string/basic_string "cpp/string/basic string")` | `[std::basic\_string](http://en.cppreference.com/w/cpp/string/basic_string)<char>` |
| `[std::wstring](string/basic_string "cpp/string/basic string")` | `[std::basic\_string](http://en.cppreference.com/w/cpp/string/basic_string)<wchar\_t>` |
| `[std::u8string](string/basic_string "cpp/string/basic string")` (since C++20) | `[std::basic\_string](http://en.cppreference.com/w/cpp/string/basic_string)<char8_t>` |
| `[std::u16string](string/basic_string "cpp/string/basic string")` (since C++11) | `[std::basic\_string](http://en.cppreference.com/w/cpp/string/basic_string)<char16\_t>` |
| `[std::u32string](string/basic_string "cpp/string/basic string")` (since C++11) | `[std::basic\_string](http://en.cppreference.com/w/cpp/string/basic_string)<char32\_t>` |

### `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` (since C++17)

The templated class `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` provides a lightweight object that offers read-only access to a string or a part of a string using an interface similar to the interface of `[std::basic\_string](string/basic_string "cpp/string/basic string")`.
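A minimal usage sketch of the two class templates described above (requires C++17 for `std::string_view`); the `count_spaces` function is a hypothetical helper used only for illustration, not part of the library:

```
#include <cstddef>
#include <iostream>
#include <string>
#include <string_view>

// A std::string_view parameter accepts std::string objects, string literals,
// and other character ranges without copying the characters they refer to.
std::size_t count_spaces(std::string_view sv)
{
    std::size_t n = 0;
    for (char c : sv)
        if (c == ' ')
            ++n;
    return n;
}

int main()
{
    std::string s = "Hello, ";   // std::basic_string<char>
    s += "world";                // a std::string owns its storage and can grow
    std::cout << count_spaces(s) << ' '
              << count_spaces("a b c") << '\n';  // prints "1 2"
}
```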
Several specializations of `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` are provided for commonly-used types:

| Defined in header `[<string\_view>](header/string_view "cpp/header/string view")` | |
| --- | --- |
| Type | Definition |
| `[std::string\_view](string/basic_string_view "cpp/string/basic string view")` (since C++17) | `[std::basic\_string\_view](http://en.cppreference.com/w/cpp/string/basic_string_view)<char>` |
| `[std::wstring\_view](string/basic_string_view "cpp/string/basic string view")` (since C++17) | `[std::basic\_string\_view](http://en.cppreference.com/w/cpp/string/basic_string_view)<wchar\_t>` |
| `[std::u8string\_view](string/basic_string_view "cpp/string/basic string view")` (since C++20) | `[std::basic\_string\_view](http://en.cppreference.com/w/cpp/string/basic_string_view)<char8_t>` |
| `[std::u16string\_view](string/basic_string_view "cpp/string/basic string view")` (since C++17) | `[std::basic\_string\_view](http://en.cppreference.com/w/cpp/string/basic_string_view)<char16\_t>` |
| `[std::u32string\_view](string/basic_string_view "cpp/string/basic string view")` (since C++17) | `[std::basic\_string\_view](http://en.cppreference.com/w/cpp/string/basic_string_view)<char32\_t>` |

### Null-terminated strings

Null-terminated strings are arrays of characters that are terminated by a special *null* character. C++ provides functions to create, inspect, and modify null-terminated strings. There are three types of null-terminated strings:

* [null-terminated byte strings](string/byte "cpp/string/byte")
* [null-terminated multibyte strings](string/multibyte "cpp/string/multibyte")
* [null-terminated wide strings](string/wide "cpp/string/wide")

### Additional support

#### `[std::char\_traits](string/char_traits "cpp/string/char traits")`

The string library also provides class template `[std::char\_traits](string/char_traits "cpp/string/char traits")` that defines types and functions for `[std::basic\_string](string/basic_string "cpp/string/basic string")` and `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` (since C++17). The following specializations are defined:

| Defined in header `[<string>](header/string "cpp/header/string")` | | |
| --- | --- | --- |
| ``` template<> class char_traits<char>; ``` | | |
| ``` template<> class char_traits<wchar_t>; ``` | | |
| ``` template<> class char_traits<char8_t>; ``` | | (since C++20) |
| ``` template<> class char_traits<char16_t>; ``` | | (since C++11) |
| ``` template<> class char_traits<char32_t>; ``` | | (since C++11) |

#### Conversions and classification

The [localizations library](locale "cpp/locale") provides support for string conversions (e.g. `[std::wstring\_convert](locale/wstring_convert "cpp/locale/wstring convert")` or `std::toupper`) as well as functions that classify characters (e.g. `std::isspace` or `std::isdigit`).

### See also

| |
| --- |
| [C++ documentation](locale "cpp/locale") for Localizations library |
| [C documentation](https://en.cppreference.com/w/c/string "c/string") for Strings library |

cpp C++ compiler support

C++ compiler support
====================

This page is maintained as best-effort and may lag behind most recent compiler releases. If you see something is out-of-date, please help us by updating it!

The following tables present compiler support for new C++ features.
These include accepted revisions to the standard, as well as various technical specifications: * [C++23/2b core language features](#cpp23) (see below) * [C++23/2b library features](#C.2B.2B23_library_features) (see below) * [C++20 core language features](#cpp20) (see below) * [C++20 library features](#C.2B.2B20_library_features) (see below) * [C++17 core language features](compiler_support/17 "cpp/compiler support/17") (on a separate page) * [C++17 library features](compiler_support/17#C.2B.2B17_library_features "cpp/compiler support/17") (on a separate page) * [C++14 core language features](compiler_support/14 "cpp/compiler support/14") (on a separate page) * [C++14 library features](compiler_support/14#C.2B.2B14_library_features "cpp/compiler support/14") (on a separate page) * [C++11 core language features](compiler_support/11 "cpp/compiler support/11") (on a separate page) * [C++11 library features](compiler_support/11#C.2B.2B11_library_features "cpp/compiler support/11") (on a separate page) * [References](#References) (see below) #### Note *\** - hover over a cell with the version number to see notes. C++23 features --------------- Note that this list may change, as the draft C++23/2b standard evolves. ### C++23 core language features | C++23 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | [Literal suffix](language/integer_literal "cpp/language/integer literal") for (signed) [`size_t`](types/size_t "cpp/types/size t") | [P0330R8](https://wg21.link/P0330R8) | 11 | 13 | | 13.1.6\* | | | | | | | | | | Make `()` more optional for [lambdas](language/lambda "cpp/language/lambda") | [P1102R2](https://wg21.link/P1102R2) | 11 | 13 | | 13.1.6\* | 6.3 | | | | | | | | | [`if consteval`](language/if#Consteval_if "cpp/language/if") | [P1938R3](https://wg21.link/P1938R3) | 12 | 14 | | 14.0.0\* | 6.3 | | | | | | | | | Removing Garbage Collection Support | [P2186R2](https://wg21.link/P2186R2) | 12 | | | | N/A | | | | | | | | | DR: C++ Identifier Syntax using Unicode Standard Annex 31 | [P1949R7](https://wg21.link/P1949R7) | 12 | 14 | | 14.0.0\* | 6.4 | | | | | | | | | DR: Allow Duplicate Attributes | [P2156R1](https://wg21.link/P2156R1) | 11 | 13 | | 13.1.6\* | | | | | | | | | | Narrowing contextual conversions in [`static_assert`](language/static_assert "cpp/language/static assert") and [constexpr if](language/if#Constexpr_if "cpp/language/if") | [P1401R5](https://wg21.link/P1401R5) | 9 | 13 (partial)\*14 | | 14.0.0\* | | | | | | | | | | Trimming whitespaces before line splicing | [P2223R2](https://wg21.link/P2223R2) | Yes | Yes | | Yes | | | | | | | | | | Make declaration order layout mandated | [P1847R4](https://wg21.link/P1847R4) | Yes | Yes | Yes | Yes | | | | | | | | | | Removing mixed wide [string literal concatenation](language/string_literal#Concatenation "cpp/language/string literal") | [P2201R1](https://wg21.link/P2201R1) | Yes | Yes | Yes | Yes | Yes | Yes | | | | | | | | [Deducing this](language/member_functions#Explicit_object_parameter "cpp/language/member functions") | [P0847R7](https://wg21.link/P0847R7) | | | 19.32\*(partial)\* | | 6.3 | | | | | | | | | [`auto(x)` and `auto{x}`](language/explicit_cast "cpp/language/explicit cast") | [P0849R8](https://wg21.link/P0849R8) | 12 | 15 | | | 6.4 | | | | | | | | | Change scope of lambda 
trailing-return-type | [P2036R3](https://wg21.link/P2036R3) | | | | | | | | | | | | | | [`#elifdef` and `#elifndef`](preprocessor/conditional "cpp/preprocessor/conditional") | [P2334R1](https://wg21.link/P2334R1) | 12 | 13 | | 13.1.6\* | | | | | | | | | | Non-literal variables (and labels and gotos) in [`constexpr`](language/constexpr "cpp/language/constexpr") functions | [P2242R3](https://wg21.link/P2242R3) | 12 | 15 | | | 6.3 | | | | | | | | | Consistent character literal encoding | [P2316R2](https://wg21.link/P2316R2) | Yes | Yes | | Yes | Yes | | | | | | | | | Character sets and encodings | [P2314R4](https://wg21.link/P2314R4) | | Yes | | Yes | | | | | | | | | | Extend init-statement to allow alias-declaration | [P2360R0](https://wg21.link/P2360R0) | 12 | 14 | | 14.0.0\* | | | | | | | | | | Multidimensional subscript operator | [P2128R6](https://wg21.link/P2128R6) | 12 | 15 | | | | | | | | | | | | Attributes on [lambdas](language/lambda "cpp/language/lambda") | [P2173R1](https://wg21.link/P2173R1) | 9 | 13 | | 13.1.6\* | | | | | | | | | | DR: Adjusting the value of feature testing macro `__cpp_concepts` | [P2493R0](https://wg21.link/P2493R0) | 12 | | 19.32\* | | 6.4 | | | | | | | | | [`#warning`](preprocessor/error "cpp/preprocessor/error") | [P2437R1](https://wg21.link/P2437R1) | Yes\* | Yes | | Yes | Yes | Yes | | | | | | | | Remove non-encodable wide character literals and multicharacter wide character literals | [P2362R3](https://wg21.link/P2362R3) | 13 | 14 | | | | | | | | | | | | Labels at the end of compound statements | [P2324R2](https://wg21.link/P2324R2) | 13 | | | | | | | | | | | | | Delimited escape sequences | [P2290R3](https://wg21.link/P2290R3) | 13 | 15 | | | | | | | | | | | | Named universal character escapes | [P2071R2](https://wg21.link/P2071R2) | 13 | 15 | | | | | | | | | | | | Relaxing some `constexpr` restrictions | [P2448R2](https://wg21.link/P2448R2) | | | | | | | | | | | | | | Simpler implicit move | [P2266R3](https://wg21.link/P2266R3) | | 13 | | | | | | | | | | | | `static operator()` | [P1169R4](https://wg21.link/P1169R4) | | | | | | | | | | | | | | Requirements for optional extended floating-point types | [P1467R9](https://wg21.link/P1467R9) | | | N/A | | | | | | | | | | | Class template argument deduction from inherited constructors | [P2582R1](https://wg21.link/P2582R1) | | | | | | | | | | | | | | Attribute `[[[assume](https://en.cppreference.com/mwiki/index.php?title=cpp/language/attributes/assume&action=edit&redlink=1 "cpp/language/attributes/assume (page does not exist)")]]` | [P1774R8](https://wg21.link/P1774R8) | | | | | | | | | | | | | | Support for UTF-8 as a portable source file encoding | [P2295R6](https://wg21.link/P2295R6) | | 15\* | | | | | | | | | | | | DR: De-deprecating volatile bitwise compound assignment operations | [P2327R1](https://wg21.link/P2327R1) | 13 | 15 | | | | | | | | | | | | DR: Relax requirements on `wchar_t` to match existing practices | [P2460R2](https://wg21.link/P2460R2) | Yes | Yes | | | | | | | | | | | | DR: Using unknown pointers and references in constant expressions | [P2280R4](https://wg21.link/P2280R4) | | | | | | | | | | | | | | DR: The Equality Operator You Are Looking For | [P2468R2](https://wg21.link/P2468R2) | | | | | | | | | | | | | | DR: `char8_t` Compatibility and Portability Fix | [P2513R3](https://wg21.link/P2513R3) | | | | | | | | | | | | | | C++23 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++(ex 
Portland Group/PGI) | Nvidia nvcc | ### C++23 library features | C++23 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | [Stacktrace library](error#Stacktrace "cpp/error") | [P0881R7](https://wg21.link/P0881R7)[P2301R1](https://wg21.link/P2301R1) | 12 (partial)\* | | 19.34\* | | | | | | [`<stdatomic.h>`](header/stdatomic.h "cpp/header/stdatomic.h") | [P0943R6](https://wg21.link/P0943R6) | 12 | 15 | 19.31\* | | | | | | [`std::is_scoped_enum`](types/is_scoped_enum "cpp/types/is scoped enum") | [P1048R1](https://wg21.link/P1048R1) | 11 | 12 | 19.30\* | 13.0.0\* | | | | | [`basic_string::contains()`](string/basic_string/contains "cpp/string/basic string/contains"), [`basic_string_view::contains()`](string/basic_string_view/contains "cpp/string/basic string view/contains") | [P1679R3](https://wg21.link/P1679R3) | 11 | 12 | 19.30\* | 13.0.0\* | | | | | [`std::to_underlying`](utility/to_underlying "cpp/utility/to underlying") | [P1682R3](https://wg21.link/P1682R3) | 11 | 13 | 19.30\* | 13.1.6\* | | | | | Relaxing requirements for `[time\_point<>::clock](chrono/time_point "cpp/chrono/time point")` | [P2212R2](https://wg21.link/P2212R2) | N/A | | N/A | | | | | | DR: `[std::visit()](utility/variant/visit "cpp/utility/variant/visit")` for classes derived from `[std::variant](utility/variant "cpp/utility/variant")` | [P2162R2](https://wg21.link/P2162R2) | 11.3 | 13 | 19.20\*\*19.30\* | 13.1.6\* | | | | | DR: Conditionally borrowed ranges | [P2017R1](https://wg21.link/P2017R1) | 11 | | 19.30\* | | | | | | DR: Repairing [input range adaptors](ranges#Views "cpp/ranges") and `[std::counted\_iterator](iterator/counted_iterator "cpp/iterator/counted iterator")` | [P2259R1](https://wg21.link/P2259R1) | 12 | | 19.30\*(partial)\*19.31\* | | | | | | Providing size feedback in the Allocator interface | [P0401R6](https://wg21.link/P0401R6) | | 15 | 19.30\* | | | | | | [`<spanstream>`](header/spanstream "cpp/header/spanstream"): string-stream with [`std::span`](container/span "cpp/container/span")-based buffer | [P0448R4](https://wg21.link/P0448R4) | 12 | | 19.31\* | | | | | | [`std::out_ptr()`](memory/out_ptr_t/out_ptr "cpp/memory/out ptr t/out ptr"), [`std::inout_ptr()`](memory/inout_ptr_t/inout_ptr "cpp/memory/inout ptr t/inout ptr") | [P1132R8](https://wg21.link/P1132R8) | | | 19.30\* | | | | | | `constexpr` [`type_info::operator==()`](types/type_info/operator_cmp "cpp/types/type info/operator cmp") | [P1328R1](https://wg21.link/P1328R1) | 12 | | 19.33\* | | | | | | Iterator pair constructors for [`std::stack`](container/stack/stack "cpp/container/stack/stack") and [`std::queue`](container/queue/queue "cpp/container/queue/queue") | [P1425R4](https://wg21.link/P1425R4) | 12 | 14 | 19.31\* | | | | | | Non-deduction context for allocators in container deduction guides | [P1518R2](https://wg21.link/P1518R2) | 12 | 13 | 19.31\* | 13.1.6\* | | | | | [`ranges::starts_with()`](algorithm/ranges/starts_with "cpp/algorithm/ranges/starts with") and [`ranges::ends_with()`](algorithm/ranges/ends_with "cpp/algorithm/ranges/ends with") | [P1659R3](https://wg21.link/P1659R3) | | | 19.31\* | | | | | | Prohibiting `[std::basic\_string](string/basic_string "cpp/string/basic string")` and `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` construction from [`nullptr`](language/nullptr 
"cpp/language/nullptr") | [P2166R1](https://wg21.link/P2166R1) | 12 | 13 | 19.30\* | 13.1.6\* | | | | | [`std::invoke_r()`](utility/functional/invoke "cpp/utility/functional/invoke") | [P2136R3](https://wg21.link/P2136R3) | 12 | | 19.31\* | | | | | | Range [constructor](string/basic_string_view/basic_string_view "cpp/string/basic string view/basic string view") for `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` | [P1989R2](https://wg21.link/P1989R2) | 11 | 14 | 19.30\* | | | | | | Default template arguments for [`pair`](utility/pair "cpp/utility/pair")'s forwarding [constructor](utility/pair/pair "cpp/utility/pair/pair") | [P1951R1](https://wg21.link/P1951R1) | | 14 | 19.30\* | | | | | | Remove Garbage Collection and Reachability-Based Leak Detection ([library support](memory#Garbage_collector_support "cpp/memory")) | [P2186R2](https://wg21.link/P2186R2) | 12 | 14 | 19.30\* | | | | | | DR: [`views::join`](ranges/join_view "cpp/ranges/join view") should join all views of ranges | [P2328R1](https://wg21.link/P2328R1) | 11.2 | | 19.30\* | | | | | | DR: [`view`](ranges/view "cpp/ranges/view") does not require [`default_initializable`](concepts/default_initializable "cpp/concepts/default initializable") | [P2325R3](https://wg21.link/P2325R3) | 11.3 | | 19.30\* | | | | | | DR: Range adaptor objects bind arguments by value | [P2281R1](https://wg21.link/P2281R1) | 11 | | 19.29 (16.10)\*(partial)\*19.31\* | | | | | | DR: [`constexpr`](language/constexpr "cpp/language/constexpr") for `[std::optional](utility/optional "cpp/utility/optional")` and `[std::variant](utility/variant "cpp/utility/variant")` | [P2231R1](https://wg21.link/P2231R1) | 11.3 (partial)\*12 | 13 (partial)\* | 19.31\* | 13.1.6\* (partial). | | | | | DR: [`std::format()`](utility/format/format "cpp/utility/format/format") improvements | [P2216R3](https://wg21.link/P2216R3) | | 14 (partial)\* 15 | 19.32\* | | | | | | DR: [`views::lazy_split`](ranges/lazy_split_view "cpp/ranges/lazy split view") and redesigned [`views::split`](ranges/split_view "cpp/ranges/split view") | [P2210R2](https://wg21.link/P2210R2) | 12 | | 19.31\* | | | | | | zip: `views::zip`, `views::zip_transform`, `views::adjacent`, and `views::adjacent_transform` | [P2321R2](https://wg21.link/P2321R2) | 13 (partial)\* | 15 (partial)\* | 19.33\* (partial)\* | | | | | | Heterogeneous erasure overloads for associative containers | [P2077R3](https://wg21.link/P2077R3) | | | 19.32\* | | | | | | [`std::byteswap()`](numeric/byteswap "cpp/numeric/byteswap") | [P1272R4](https://wg21.link/P1272R4) | 12 | 14 | 19.31\* | | | | | | [Printing](io/basic_ostream/operator_ltlt "cpp/io/basic ostream/operator ltlt") `volatile T*` | [P1147R1](https://wg21.link/P1147R1) | 11.3 | 14 | 19.31\* | | | | | | [`basic_string::resize_and_overwrite()`](string/basic_string/resize_and_overwrite "cpp/string/basic string/resize and overwrite") | [P1072R10](https://wg21.link/P1072R10) | 12 | 14 | 19.31\* | | | | | | Monadic operations for `[std::optional](utility/optional "cpp/utility/optional")` | [P0798R8](https://wg21.link/P0798R8) | 12 | 14 | 19.32\* | | | | | | [`std::move_only_function`](utility/functional/move_only_function "cpp/utility/functional/move only function") | [P0288R9](https://wg21.link/P0288R9) | 12 | | 19.32\* | | | | | | Add a conditional noexcept specification to `[std::exchange](utility/exchange "cpp/utility/exchange")` | [P2401R0](https://wg21.link/P2401R0) | 12 | 14 | 19.25\* | | | | | | Require [`span`](container/span "cpp/container/span") 
& [`basic_string_view`](string/basic_string_view "cpp/string/basic string view") to be [TriviallyCopyable](named_req/triviallycopyable "cpp/named req/TriviallyCopyable") | [P2251R1](https://wg21.link/P2251R1) | Yes | Yes | Yes | Yes | | | | | Clarifying the status of the “C headers” | [P2340R1](https://wg21.link/P2340R1) | Yes | Yes | Yes | Yes | | | | | DR: Fix `views::istream` | [P2432R1](https://wg21.link/P2432R1) | 12 | | 19.31\* | | | | | | DR: Add support for non-const-formattable types to `[std::format](utility/format/format "cpp/utility/format/format")` | [P2418R2](https://wg21.link/P2418R2) | | 15 | 19.32\* | | | | | | DR: [`view`](ranges/view "cpp/ranges/view") with ownership | [P2415R2](https://wg21.link/P2415R2) | 12 | 14 | 19.31\* | | | | | | DR: Fixing locale handling in chrono formatters | [P2372R3](https://wg21.link/P2372R3) | | | 19.31\* | | | | | | DR: Cleaning up integer-class types | [P2393R1](https://wg21.link/P2393R1) | | | 19.32\* | | | | | | [`<expected>`](header/expected "cpp/header/expected") | [P0323R12](https://wg21.link/P0323R12)[P2549R1](https://wg21.link/P2549R1) | 12 | | 19.33\* | | | | | | constexpr for [`<cmath>`](header/cmath "cpp/header/cmath") and [`<cstdlib>`](header/cstdlib "cpp/header/cstdlib") | [P0533R9](https://wg21.link/P0533R9) | 4.6 (partial)\* | | | | | | | | [`std::unreachable()`](utility/unreachable "cpp/utility/unreachable") | [P0627R6](https://wg21.link/P0627R6) | 12 | 15 | 19.32\* | | | | | | Deprecating `[std::aligned\_storage](types/aligned_storage "cpp/types/aligned storage")` and `[std::aligned\_union](types/aligned_union "cpp/types/aligned union")` | [P1413R3](https://wg21.link/P1413R3) | | | 19.33\* | | | | | | [`std::reference_constructs_from_temporary`](types/reference_constructs_from_temporary "cpp/types/reference constructs from temporary") & [`std::reference_converts_from_temporary`](types/reference_converts_from_temporary "cpp/types/reference converts from temporary") | [P2255R2](https://wg21.link/P2255R2) | 13 (partial)\* | | | | | | | | constexpr `[std::unique\_ptr](memory/unique_ptr "cpp/memory/unique ptr")` | [P2273R3](https://wg21.link/P2273R3) | 12 | 16 (partial)\* | 19.33\* | | | | | | [`ranges::to()`](ranges/to "cpp/ranges/to") | [P1206R7](https://wg21.link/P1206R7) | | | | | | | | | Pipe support for user-defined range adaptors | [P2387R3](https://wg21.link/P2387R3) | | | 19.34\* | | | | | | [`ranges::iota()`](algorithm/ranges/iota "cpp/algorithm/ranges/iota"), [`ranges::shift_left()`](algorithm/ranges/shift "cpp/algorithm/ranges/shift"), and [`ranges::shift_right()`](algorithm/ranges/shift "cpp/algorithm/ranges/shift") | [P2440R1](https://wg21.link/P2440R1) | | | 19.34\* | | | | | | `views::join_with` | [P2441R2](https://wg21.link/P2441R2) | | | 19.34\* | | | | | | `views::chunk` and `views::slide` | [P2442R1](https://wg21.link/P2442R1) | | | 19.33\* | | | | | | `views::chunk_by` | [P2443R1](https://wg21.link/P2443R1) | | | 19.33\* | | | | | | `std::mdspan`: a non-owning multidimensional array reference | [P0009R18](https://wg21.link/P0009R18)[P2599R2](https://wg21.link/P2599R2)[P2604R0](https://wg21.link/P2604R0)[P2613R1](https://wg21.link/P2613R1) | | | | | | | | | [`<flat_map>`](header/flat_map "cpp/header/flat map") | [P0429R9](https://wg21.link/P0429R9) | | | | | | | | | [`<flat_set>`](header/flat_set "cpp/header/flat set") | [P1222R4](https://wg21.link/P1222R4) | | | | | | | | | [`ranges::find_last()`](algorithm/ranges/find_last "cpp/algorithm/ranges/find last"), 
[`ranges::find_last_if()`](algorithm/ranges/find_last "cpp/algorithm/ranges/find last"), and [`ranges::find_last_if_not()`](algorithm/ranges/find_last "cpp/algorithm/ranges/find last") | [P1223R5](https://wg21.link/P1223R5) | | | | | | | | | `views::stride` | [P1899R3](https://wg21.link/P1899R3) | | | 19.34\* | | | | | | Formatted output library | [P2093R14](https://wg21.link/P2093R14) | | | | | | | | | Compatibility between `[std::tuple](utility/tuple "cpp/utility/tuple")` and tuple-like objects | [P2165R4](https://wg21.link/P2165R4) | | 2.9 (partial)\* | | Partial\* | | | | | Rectifying constant iterators, sentinels, and ranges | [P2278R4](https://wg21.link/P2278R4) | | | | | | | | | Formatting Ranges | [P2286R8](https://wg21.link/P2286R8) | | | | | | | | | `constexpr` for integral overloads of [`std::to_chars()`](utility/to_chars "cpp/utility/to chars") and [`std::from_chars()`](utility/from_chars "cpp/utility/from chars"). | [P2291R3](https://wg21.link/P2291R3) | | | 19.34\* | | | | | | [`ranges::contains()`](algorithm/ranges/contains "cpp/algorithm/ranges/contains") and [`ranges::contains_subrange()`](algorithm/ranges/contains "cpp/algorithm/ranges/contains") | [P2302R4](https://wg21.link/P2302R4) | | | 19.34\* | | | | | | Ranges fold algorithms | [P2322R6](https://wg21.link/P2322R6) | | | | | | | | | `views::cartesian_product` | [P2374R4](https://wg21.link/P2374R4)[P2540R1](https://wg21.link/P2540R1) | | | | | | | | | Adding move-only types support for comparison concepts | [P2404R3](https://wg21.link/P2404R3) | | | | | | | | | Ranges iterators as inputs to non-ranges algorithms | [P2408R5](https://wg21.link/P2408R5) | | | 19.34\* | | | | | | constexpr `[std::bitset](utility/bitset "cpp/utility/bitset")` | [P2417R2](https://wg21.link/P2417R2) | | 16 | 19.34\* | | | | | | [`basic_string::substr()`](string/basic_string/substr "cpp/string/basic string/substr") `&&` | [P2438R2](https://wg21.link/P2438R2) | | | 19.34\* | | | | | | `views::as_rvalue` | [P2446R2](https://wg21.link/P2446R2) | | | 19.34\* | | | | | | Standard Library Modules | [P2465R3](https://wg21.link/P2465R3) | | | | | | | | | [`std::forward_like()`](utility/forward_like "cpp/utility/forward like") | [P2445R1](https://wg21.link/P2445R1) | | 16 | 19.34\* | | | | | | Support exclusive mode for `[std::fstream](io/basic_fstream "cpp/io/basic fstream")` | [P2467R1](https://wg21.link/P2467R1) | 12 | | | | | | | | `views::repeat` | [P2474R2](https://wg21.link/P2474R2) | | | | | | | | | Relaxing range adaptors to allow for move-only types | [P2494R2](https://wg21.link/P2494R2) | | | 19.34\* | | | | | | `[std::basic\_string\_view](string/basic_string_view "cpp/string/basic string view")` range [constructor](string/basic_string_view/basic_string_view "cpp/string/basic string view/basic string view") should be explicit | [P2499R0](https://wg21.link/P2499R0) | 12.2 | 16 | 19.34\* | | | | | | `std::generator`: synchronous coroutine generator for ranges | [P2502R2](https://wg21.link/P2502R2) | | | | | | | | | `std::basic_format_string` | [P2508R1](https://wg21.link/P2508R1) | | 15 | | | | | | | Add a conditional noexcept specification to `[std::apply](utility/apply "cpp/utility/apply")` | [P2517R0](https://wg21.link/P2517R0) | 10 | | 19.34\* | | | | | | Improve default container formatting | [P2585R1](https://wg21.link/P2585R1) | | | | | | | | | Explicit lifetime management | [P2590R2](https://wg21.link/P2590R2) | | | | | | | | | Clarify handling of encodings in localized formatting of chrono types | 
[P2419R2](https://wg21.link/P2419R2) | | | 19.34\*\* | | | | | | `[std::move\_iterator](iterator/move_iterator "cpp/iterator/move iterator")` should not always be [`input_iterator`](iterator/input_iterator "cpp/iterator/input iterator") | [P2520R0](https://wg21.link/P2520R0) | | | 19.34\*\* | | | | | | Deduction guides update for [deducing `this`](language/member_functions#Explicit_object_parameter "cpp/language/member functions") | [LWG3617](https://cplusplus.github.io/LWG/issue3617) | | | 19.34\* | | | | | | Deduction guides update for `static operator()` | [P1169R4](https://wg21.link/P1169R4) | | | | | | | | | Standard names and library support for extended floating-point types | [P1467R9](https://wg21.link/P1467R9) | | | | | | | | | C++23 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | C++20 features --------------- ### C++20 core language features | C++20 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | Allow [lambda-capture](language/lambda#Lambda_capture "cpp/language/lambda") `[=, this]` | [P0409R2](https://wg21.link/P0409R2) | 8 | 6 | 19.22\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [`__VA_OPT__`](preprocessor/replace#Function-like_macros "cpp/preprocessor/replace") | [P0306R4](https://wg21.link/P0306R4)[P1042R1](https://wg21.link/P1042R1) | 8 (partial)\*10 (partial)\*12 | 9 | 19.25\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Designated initializers](language/aggregate_initialization#Designated_initializers "cpp/language/aggregate initialization") | [P0329R4](https://wg21.link/P0329R4) | 4.7 (partial)\*8 | 3.0 (partial)\*10 | 19.21\* | 12.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [template-parameter-list for generic lambdas](language/lambda#Syntax "cpp/language/lambda") | [P0428R2](https://wg21.link/P0428R2) | 8 | 9 | 19.22\* | 11.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Default member initializers for bit-fields](language/bit_field#Cpp20_Default_member_initializers_for_bit_fields "cpp/language/bit field") | [P0683R1](https://wg21.link/P0683R1) | 8 | 6 | 19.25\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | Initializer list constructors in class template argument deduction | [P0702R1](https://wg21.link/P0702R1) | 8 | 6 | 19.14\* | Yes | 5.0 | 2021.1 | | | | | 20.7 | | | `const&`-qualified pointers to members | [P0704R1](https://wg21.link/P0704R1) | 8 | 6 | 19.0 (2015)\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Concepts](language/constraints "cpp/language/constraints") | [P0734R0](https://wg21.link/P0734R0) | 6(TS only)10 | 10 | 19.23\* (partial)\*19.30\* | 12.0.0\* (partial). | 6.1 | 2021.5 | | | | | 20.11 | | | [Lambdas in unevaluated contexts](language/lambda#Lambdas_in_unevaluated_contexts "cpp/language/lambda") | [P0315R4](https://wg21.link/P0315R4) | 9 | 13 (partial)\*14 (partial)\* | 19.28 (16.8)\* | 13.1.6\* (partial). 
| | | | | | | | | | [Three-way comparison operator](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") | [P0515R3](https://wg21.link/P0515R3) | 10 | 8 (partial)10 | 19.20\* | | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Simplifying implicit lambda capture | [P0588R1](https://wg21.link/P0588R1) | 8 | | 19.24\* | | 5.1 | 2021.1 | | | | | 20.7 | | | [init-statements for range-based for](language/range-for#Syntax "cpp/language/range-for") | [P0614R1](https://wg21.link/P0614R1) | 9 | 8 | 19.25\* | 11.0.0\* | 6.0 | | | | | | 20.11 | | | Default constructible and assignable stateless [lambdas](language/lambda "cpp/language/lambda") | [P0624R2](https://wg21.link/P0624R2) | 9 | 8 | 19.22\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | `const` mismatch with defaulted copy constructor | [P0641R2](https://wg21.link/P0641R2) | 9 | 8 | 19.0 (2015)\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | Access checking on specializations | [P0692R1](https://wg21.link/P0692R1) | Yes | 8 (partial)14 | 19.26\* | 14.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | ADL and function templates that are not visible | [P0846R0](https://wg21.link/P0846R0) | 9 | 9 | 19.21\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Specify when `constexpr` function definitions are [needed for constant evaluation](language/constant_expression#Functions_and_variables_needed_for_constant_evaluation "cpp/language/constant expression") | [P0859R0](https://wg21.link/P0859R0) | 5.2 (partial)\*9 | 8 | 19.27\* (partial)\*19.31\* | 11.0.0\* | | | | | | | | | | Attributes `[[[likely](language/attributes/likely "cpp/language/attributes/likely")]]` and `[[[unlikely](language/attributes/likely "cpp/language/attributes/likely")]]` | [P0479R5](https://wg21.link/P0479R5) | 9 | 12 | 19.26\* | 13.0.0\* | 5.1 | | | | | | 20.7 | | | Make [`typename`](keywords/typename "cpp/keywords/typename") more optional | [P0634R3](https://wg21.link/P0634R3) | 9 | | 19.29 (16.10)\* | | 5.1 | | | | | | 20.7 | | | Pack expansion in lambda init-capture | [P0780R2](https://wg21.link/P0780R2) | 9 | 9 | 19.22\* | 11.0.3\* | 6.1 | | | | | | 20.11 | | | Attribute `[[[no\_unique\_address](language/attributes/no_unique_address "cpp/language/attributes/no unique address")]]` | [P0840R2](https://wg21.link/P0840R2) | 9 | 9 | 19.28 (16.9)\*\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Conditionally Trivial Special Member Functions | [P0848R3](https://wg21.link/P0848R3) | 10 | 16 | 19.28 (16.8)\* | | 6.1 | | | | | | 20.11 | | | DR: Relaxing the [structured bindings](language/structured_binding "cpp/language/structured binding") customization point finding rules | [P0961R1](https://wg21.link/P0961R1) | 8 | 8 | 19.21\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Relaxing the [range-`for` loop](language/range-for "cpp/language/range-for") customization point finding rules | [P0962R1](https://wg21.link/P0962R1) | 8 | 8 | 19.25\* | 11.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Allow structured bindings to accessible members | [P0969R0](https://wg21.link/P0969R0) | 8 | 8 | 19.21\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Destroying `operator delete`](memory/new/operator_delete "cpp/memory/new/operator delete") | [P0722R3](https://wg21.link/P0722R3) | 9 | 6 | 19.27\* | 10.0.0\* | 6.1 | | | | | | 20.11 | | | Class types in [non-type template parameters](language/template_parameters#Non-type_template_parameter "cpp/language/template parameters") | [P0732R2](https://wg21.link/P0732R2) | 9 | 12 (partial) | 19.26\*(partial)\*19.28 (16.9)\* | 
13.0.0\* (partial). | 6.2 | | | | | | | | | Deprecate implicit [capture](language/lambda#Lambda_capture "cpp/language/lambda") of `this` via `[=]` | [P0806R2](https://wg21.link/P0806R2) | 9 | 7 | 19.22\* | 10.0.1\* | 5.1 | | | | | | 20.7 | | | [`explicit(bool)`](language/explicit "cpp/language/explicit") | [P0892R2](https://wg21.link/P0892R2) | 9 | 9 | 19.24\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Integrating [feature-test macros](feature_test "cpp/feature test") | [P0941R2](https://wg21.link/P0941R2) | 5 | 3.4 | 19.15\* (partial)19.20\* | Yes | 5.0 | 2021.1 | | | | | 20.7 | | | Prohibit aggregates with user-declared constructors | [P1008R1](https://wg21.link/P1008R1) | 9 | 8 | 19.20\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | `constexpr` [virtual function](language/virtual "cpp/language/virtual") | [P1064R0](https://wg21.link/P1064R0) | 9 | 9 | 19.28 (16.9)\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Consistency improvements for comparisons | [P1120R0](https://wg21.link/P1120R0) | 10 | 8 (partial)10 | 19.22\* | 12.0.0\* | 5.1 | | | | | | 20.7 | | | [`char8_t`](language/types#char8_t "cpp/language/types") | [P0482R6](https://wg21.link/P0482R6) | 9 | 7\* | 19.22\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [`std::is_constant_evaluated()`](types/is_constant_evaluated "cpp/types/is constant evaluated") | [P0595R2](https://wg21.link/P0595R2) | 9 | 9 | 19.25\* | 11.0.3\* | 5.1 | 19.1 | | | | | | | | `constexpr` `try`-`catch` blocks | [P1002R1](https://wg21.link/P1002R1) | 9 | 8 | 19.25\* | 10.0.1\* | 5.1 | | | | | | 20.7 | | | [Immediate functions](language/consteval "cpp/language/consteval") (`consteval`) | [P1073R3](https://wg21.link/P1073R3) | 10 (partial)\* 11 | 11 (partial)14 (partial)\* | 19.28 (16.8)\*\*(partial)19.29 (16.10)\* | 11.0.3\* (partial). | 5.1 | 2021.1 | | | | | 20.7 | | | [Nested inline namespaces](language/namespace "cpp/language/namespace") | [P1094R2](https://wg21.link/P1094R2) | 9 | 8 | 19.27\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | Yet another approach for [constrained](language/template_parameters#Type_template_parameter "cpp/language/template parameters") [declarations](language/auto "cpp/language/auto") | [P1141R2](https://wg21.link/P1141R2) | 10 | 10 | 19.26\* (partial)19.28 (16.9)\* | 12.0.5\* | 6.1 | | | | | | 20.11 | | | Signed integers are two's complement | [P1236R1](https://wg21.link/P1236R1) | 9 | 9 | Yes | 11.0.3\* | N/A | N/A | | | | | yes | | | [`dynamic_cast`](language/dynamic_cast "cpp/language/dynamic cast") and polymorphic [`typeid`](language/typeid "cpp/language/typeid") in [constant expressions](language/constant_expression "cpp/language/constant expression") | [P1327R1](https://wg21.link/P1327R1) | 10 | 9 | 19.29 (16.10)\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Changing the active member of a union inside `constexpr` | [P1330R0](https://wg21.link/P1330R0) | 9 | 9 | 19.10\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Coroutines](language/coroutines "cpp/language/coroutines") | [P0912R5](https://wg21.link/P0912R5) | 10 | 8 (partial) | 19.0 (2015)\* (partial)19.10\* (TS only)19.28 (16.8)\* | 10.0.1\* (partial). 
| 5.1 | 2021.1 | | | | | | | | Parenthesized [initialization of aggregates](language/aggregate_initialization "cpp/language/aggregate initialization") | [P0960R3](https://wg21.link/P0960R3) | 10 | | 19.28 (16.8)\* | | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Array size deduction in [`new`-expressions](language/new "cpp/language/new") | [P1009R2](https://wg21.link/P1009R2) | 11 | 9 | 19.27\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Modules](language/modules "cpp/language/modules") | [P1103R3](https://wg21.link/P1103R3) | 11 (partial) | 8 (partial) | 19.0 (2015)\* (partial)19.10\* (TS only)19.28 (16.8)\* | 10.0.1\* (partial). | | | | | | | | | | Stronger Unicode requirements | [P1041R4](https://wg21.link/P1041R4)[P1139R2](https://wg21.link/P1139R2) | 10 | Yes | 19.0 (2015)\* (P1041R4)19.26\* (P1139R2) | Yes | N/A | | | | | | | | | `<=> != ==` | [P1185R2](https://wg21.link/P1185R2) | 10 | 10 | 19.22\* | 12.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Explicitly defaulted functions with different exception specifications | [P1286R2](https://wg21.link/P1286R2) | 10 | 9 | 19.28 (16.8)\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Lambda capture and storage class specifiers of structured bindings | [P1091R3](https://wg21.link/P1091R3)[P1381R1](https://wg21.link/P1381R1) | 10 | 8 (partial)16 | 19.11\*(P1381R1)19.24\*(P1091R3) | 10.0.1\* (partial). | 5.1 | 2021.1 | | | | | 20.7 | | | Permit conversions to arrays of unknown bound | [P0388R4](https://wg21.link/P0388R4) | 10 | 14 | 19.27\* | 14.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | `constexpr` container operations | [P0784R7](https://wg21.link/P0784R7) | 10 | 10 | 19.28 (16.9)\* | 12.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | Deprecating some uses of [`volatile`](language/cv#Notes "cpp/language/cv") | [P1152R4](https://wg21.link/P1152R4) | 10 | 10 | 19.27\* | 12.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | [`constinit`](language/constinit "cpp/language/constinit") | [P1143R2](https://wg21.link/P1143R2) | 10 | 10 | 19.29 (16.10)\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | [Deprecate comma operator in subscripts](language/operator_other#Built-in_comma_operator "cpp/language/operator other") | [P1161R3](https://wg21.link/P1161R3) | 10 | 9 | 19.25\* | 11.0.3\* | 6.0 | 2021.5 | | | | | 20.11 | | | `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` with message | [P1301R4](https://wg21.link/P1301R4) | 10 | 9 | 19.25\* | 11.0.3\* | 6.0 | 2021.5 | | | | | 20.11 | | | Trivial default initialization in `constexpr` functions | [P1331R2](https://wg21.link/P1331R2) | 10 | 10 | 19.27\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | Unevaluated `asm`-declaration in `constexpr` functions | [P1668R1](https://wg21.link/P1668R1) | 10 | 10 | 19.28 (16.9)\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | [`using enum`](language/enum#Using-enum-declaration "cpp/language/enum") | [P1099R5](https://wg21.link/P1099R5) | 11 | 13 | 19.24\* | 13.1.6\* | 6.3 | | | | | | | | | Synthesizing [three-way comparison](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") for specified comparison category | [P1186R3](https://wg21.link/P1186R3) | 11 | 10 | 19.24\* | 12.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | DR: `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` for constructors | [P1771R1](https://wg21.link/P1771R1) | 10 | 9 | 19.24\* | 11.0.3\* | 6.0 | 2021.5 | | | | | 20.11 | | | [Class template argument deduction](language/class_template_argument_deduction "cpp/language/class template 
argument deduction") for alias templates | [P1814R0](https://wg21.link/P1814R0) | 10 | | 19.27\* | | | | | | | | | | | Class template argument deduction for aggregates | [P1816R0](https://wg21.link/P1816R0)[P2082R1](https://wg21.link/P2082R1) | 10(P1816R0)11(P2082R1) | | 19.27\* | | 6.3 | | | | | | | | | DR: [Implicit move](language/return "cpp/language/return") for more local objects and rvalue references | [P1825R0](https://wg21.link/P1825R0) | 11\* | 13 | 19.24\* | 13.1.6\* | 6.0 | 2021.5 | | | | | 20.11 | | | Allow defaulting comparisons by value | [P1946R0](https://wg21.link/P1946R0) | 10 | 10 | 19.25\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | Remove `std::weak_equality` and `std::strong_equality` | [P1959R0](https://wg21.link/P1959R0) | 10 | 10 | 19.25\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | Inconsistencies with non-type template parameters | [P1907R1](https://wg21.link/P1907R1) | 10 (partial)11 | 12 (partial) | 19.26\* | 13.1.6\* (partial). | 6.2 | | | | | | | | | DR: Pseudo-destructors end object lifetimes | [P0593R6](https://wg21.link/P0593R6) | 11 | 11 | Yes | 12.0.5\* | N/A | N/A | | | | | | | | DR: Converting from `T*` to `bool` should be considered narrowing | [P1957R2](https://wg21.link/P1957R2) | 10\* 11\* | 11 | 19.27\* | 12.0.5\* | 6.1 | | | | | | | | | C++20 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++(ex Portland Group/PGI) | Nvidia nvcc | ### C++20 library features | C++20 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | [`std::endian`](types/endian "cpp/types/endian") | [P0463R1](https://wg21.link/P0463R1) | 8 | 7 | 19.22\* | 10.0.0\* | | | | | Extending `[std::make\_shared()](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared")` to support arrays | [P0674R1](https://wg21.link/P0674R1) | 12 | 15 | 19.27\* | | | | | | [Floating-point atomic](atomic/atomic#Specializations_for_floating-point_types "cpp/atomic/atomic") | [P0020R6](https://wg21.link/P0020R6) | 10 | | 19.22\* | | | | | | [Synchronized buffered](io/basic_syncbuf "cpp/io/basic syncbuf") ([`std::basic_osyncstream`](io/basic_osyncstream "cpp/io/basic osyncstream")) | [P0053R7](https://wg21.link/P0053R7) | 11 | | 19.29 (16.10)\* | | | | | | `constexpr` for [`<algorithm>`](header/algorithm "cpp/header/algorithm") and [`<utility>`](header/utility "cpp/header/utility") | [P0202R3](https://wg21.link/P0202R3) | 10 | 8 (partial)12 | 19.26\* | 10.0.1\* (partial) 13.0.0\* | | | | | More `constexpr` for [`<complex>`](header/complex "cpp/header/complex") | [P0415R1](https://wg21.link/P0415R1) | 9 | 7 (partial) | 19.27\* | 10.0.0\* (partial). 
| | | | | Make `[std::memory\_order](atomic/memory_order "cpp/atomic/memory order")` a scoped enumeration | [P0439R0](https://wg21.link/P0439R0) | 9 | 9 | 19.25\* | 11.0.3\* | | | | | [String](string/basic_string "cpp/string/basic string") [prefix](string/basic_string/starts_with "cpp/string/basic string/starts with") and [suffix](string/basic_string/ends_with "cpp/string/basic string/ends with") checking: [`string`](string/basic_string/starts_with "cpp/string/basic string/starts with")[`(_view)`](string/basic_string_view/starts_with "cpp/string/basic string view/starts with") [`::starts_with`](string/basic_string/starts_with "cpp/string/basic string/starts with")/[`ends_with`](string/basic_string_view/ends_with "cpp/string/basic string view/ends with") | [P0457R2](https://wg21.link/P0457R2) | 9 | 6 | 19.21\* | 10.0.0\* | | | | | Library support for [`operator<=>`](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") [`<compare>`](header/compare "cpp/header/compare") | [P0768R1](https://wg21.link/P0768R1) | 10 | 7 (partial)12 | 19.20\* (partial)19.28 (16.9)\* | 13.0.0\* | | | | | [`std::remove_cvref`](types/remove_cvref "cpp/types/remove cvref") | [P0550R2](https://wg21.link/P0550R2) | 9 | 6 | 19.20\* | 10.0.0\* | | | | | `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` in the [standard library](language/attributes/nodiscard#Standard_library "cpp/language/attributes/nodiscard") | [P0600R1](https://wg21.link/P0600R1) | 9 | 7 (partial) | 19.13\* (partial)19.22\* | 10.0.0\* (partial). | | | | | Using `std::move` in [numeric algorithms](numeric "cpp/numeric") | [P0616R0](https://wg21.link/P0616R0) | 9 | 12 | 19.23\* | 13.0.0\* | | | | | [Utility](memory/to_address "cpp/memory/to address") to convert a pointer to a raw pointer | [P0653R2](https://wg21.link/P0653R2) | 8 | 6 | 19.22\* | Yes | | | | | Atomic [`std::shared_ptr`](memory/shared_ptr/atomic2 "cpp/memory/shared ptr/atomic2") and [`std::weak_ptr`](memory/weak_ptr/atomic2 "cpp/memory/weak ptr/atomic2") | [P0718R2](https://wg21.link/P0718R2) | 12 | | 19.27\* | | | | | | [`std::span`](container/span "cpp/container/span") | [P0122R7](https://wg21.link/P0122R7) | 10 | 7 | 19.26\* | 10.0.0\* | | | | | [Calendar](chrono#Calendar "cpp/chrono") and [timezone](chrono#Time_zone "cpp/chrono") | [P0355R7](https://wg21.link/P0355R7) | 11 (partial) | 7 (partial) | 19.29 (16.10)\* | 10.0.0\* (partial). | | | | | [`<version>`](header/version "cpp/header/version") | [P0754R2](https://wg21.link/P0754R2) | 9 | 7 | 19.22\* | 10.0.0\* | | | | | Comparing unordered containers | [P0809R0](https://wg21.link/P0809R0) | Yes | Yes | 16.0\* | Yes | | | | | [ConstexprIterator](named_req/constexpriterator "cpp/named req/ConstexprIterator") requirements | [P0858R0](https://wg21.link/P0858R0) | 9 | 12 | 19.11\* | 13.0.0\* | | | | | `[std::basic\_string::reserve()](string/basic_string/reserve "cpp/string/basic string/reserve")` should not shrink | [P0966R1](https://wg21.link/P0966R1) | 11 | 8 | 19.25\* | 10.0.1\* | | | | | [Atomic Compare-And-Exchange](atomic/atomic/compare_exchange "cpp/atomic/atomic/compare exchange") with padding bits | [P0528R3](https://wg21.link/P0528R3) | | | 19.28 (16.8)\* | | | | | | [`std::atomic_ref`](atomic/atomic_ref "cpp/atomic/atomic ref") | [P0019R8](https://wg21.link/P0019R8) | 10 | | 19.28 (16.8)\* | | | | | | `contains()` member function of associative containers, e.g. 
[`std::map::contains()`](container/map/contains "cpp/container/map/contains") | [P0458R2](https://wg21.link/P0458R2) | 9 | 13 | 19.21\* | 13.1.6\* | | | | | DR: Guaranteed copy elision for [piecewise construction](memory/scoped_allocator_adaptor/construct "cpp/memory/scoped allocator adaptor/construct") | [P0475R1](https://wg21.link/P0475R1) | 9 | Yes | 19.29 (16.10)\* | Yes | | | | | [`std::bit_cast()`](numeric/bit_cast "cpp/numeric/bit cast") | [P0476R2](https://wg21.link/P0476R2) | 11 | 14 | 19.27\* | | | | | | [Integral power-of-2 operations](header/bit "cpp/header/bit"): [`std::bit_ceil()`](numeric/bit_ceil "cpp/numeric/bit ceil"), [`std::bit_floor()`](numeric/bit_floor "cpp/numeric/bit floor"), [`std::bit_width()`](numeric/bit_width "cpp/numeric/bit width"), [`std::has_single_bit()`](numeric/has_single_bit "cpp/numeric/has single bit"). | [P0556R3](https://wg21.link/P0556R3) [P1956R1](https://wg21.link/P1956R1) | 9 (P0556R3)10 (P1956R1) | 9 (P0556R3)12 (P1956R1) | 19.25\* (P0556R3)\*19.27\* (P1956R1)\*19.28 (16.8)\* | 11.0.3\* (P0556R3) 13.0.0\* (P1956R1). | | | | | Improving the return value of erase-like algorithms | [P0646R1](https://wg21.link/P0646R1) | 9 | 10 | 19.21\* | 12.0.0\* | | | | | [`std::destroying_delete`](memory/new/destroying_delete_t "cpp/memory/new/destroying delete t") | [P0722R3](https://wg21.link/P0722R3) | 9 | 9 | 19.27\* | 11.0.3\* | | | | | [`std::is_nothrow_convertible`](types/is_convertible "cpp/types/is convertible") | [P0758R1](https://wg21.link/P0758R1) | 9 | 9 | 19.23\* | 11.0.3\* | | | | | Add [`std::shift_left/right`](algorithm/shift "cpp/algorithm/shift") to [`<algorithm>`](header/algorithm "cpp/header/algorithm") | [P0769R2](https://wg21.link/P0769R2) | 10 | 12 | 19.21\* | 13.0.0\* | | | | | Constexpr for `[std::swap()](algorithm/swap "cpp/algorithm/swap")` and `swap` related functions | [P0879R0](https://wg21.link/P0879R0) | 10 | 13 | 19.26\* | 13.1.6\* | | | | | [`std::type_identity`](types/type_identity "cpp/types/type identity") | [P0887R1](https://wg21.link/P0887R1) | 9 | 8 | 19.21\* | 10.0.1\* | | | | | [Concepts library](concepts "cpp/concepts") | [P0898R3](https://wg21.link/P0898R3) | 10 | 13 | 19.23\* | 13.1.6\* | | | | | `constexpr` [comparison operators](container/array/operator_cmp "cpp/container/array/operator cmp") for `[std::array](container/array "cpp/container/array")` | [P1023R0](https://wg21.link/P1023R0) | 10 | 8 | 19.27\* | 10.0.1\* | | | | | [`std::unwrap_ref_decay` and `std::unwrap_reference`](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference") | [P0318R1](https://wg21.link/P0318R1) | 9 | 8 | 19.21\* | 10.0.1\* | | | | | [`std::bind_front()`](utility/functional/bind_front "cpp/utility/functional/bind front") | [P0356R5](https://wg21.link/P0356R5) | 9 | 13 | 19.25\* | 13.1.6\* | | | | | `[std::reference\_wrapper](utility/functional/reference_wrapper "cpp/utility/functional/reference wrapper")` for incomplete types | [P0357R3](https://wg21.link/P0357R3) | 9 | 8 | 19.26\* | 10.0.1\* | | | | | Fixing [`operator>>(basic_istream&, CharT*)`](io/basic_istream/operator_gtgt2 "cpp/io/basic istream/operator gtgt2") | [P0487R1](https://wg21.link/P0487R1) | 11 | 8 | 19.23\* | 10.0.1\* | | | | | Library support for [`char8_t`](language/types#char8_t "cpp/language/types") | [P0482R6](https://wg21.link/P0482R6) | 9 | 8 (partial)\* | 19.22\* | 10.0.1\* (partial). 
| | | | | [Utility functions](memory/uses_allocator_construction_args "cpp/memory/uses allocator construction args") to implement [uses-allocator](memory/make_obj_using_allocator "cpp/memory/make obj using allocator") [construction](memory/uninitialized_construct_using_allocator "cpp/memory/uninitialized construct using allocator") | [P0591R4](https://wg21.link/P0591R4) | 9 | | 19.29 (16.10)\* | | | | | | DR: `[std::variant](utility/variant "cpp/utility/variant")` and `[std::optional](utility/optional "cpp/utility/optional")` should propagate copy/move triviality | [P0602R4](https://wg21.link/P0602R4) | 8.3 | 8 | 19.11\* | 10.0.1\* | | | | | A sane `[std::variant](utility/variant "cpp/utility/variant")` converting constructor | [P0608R3](https://wg21.link/P0608R3) | 10 | 9 | 19.29 (16.10)\* | 11.0.3\* | | | | | `[std::function](utility/functional/function "cpp/utility/functional/function")`'s move constructor should be [`noexcept`](language/noexcept "cpp/language/noexcept") | [P0771R1](https://wg21.link/P0771R1) | 7.2 | 6 | 19.22\* | Yes | | | | | The [One](iterator "cpp/iterator") [Ranges](ranges "cpp/ranges") [Proposal](algorithm/ranges "cpp/algorithm/ranges") | [P0896R4](https://wg21.link/P0896R4) | 10 (partial)\* | 13 (partial)15 | 19.29 (16.10)\* | | | | | | Heterogeneous lookup for [unordered containers](container#Unordered_associative_containers "cpp/container") | [P0919R3](https://wg21.link/P0919R3) [P1690R1](https://wg21.link/P1690R1) | 11 | 12 | 19.23\* (P0919R3)19.25\* (P1690R1) | 13.0.0\* | | | | | [`<chrono>`](header/chrono "cpp/header/chrono") `zero()`, `min()`, and `max()` should be [`noexcept`](language/noexcept "cpp/language/noexcept") | [P0972R0](https://wg21.link/P0972R0) | 9 | 8 | 19.14\* | 10.0.1\* | | | | | `constexpr` in `[std::pointer\_traits](memory/pointer_traits "cpp/memory/pointer traits")` | [P1006R1](https://wg21.link/P1006R1) | 9 | 8 | 19.26\* | 10.0.1\* | | | | | [`std::assume_aligned()`](memory/assume_aligned "cpp/memory/assume aligned") | [P1007R3](https://wg21.link/P1007R3) | 9\*11 | 15 | 19.28 (16.9)\* | | | | | | Smart pointer creation with default initialization (e.g. [`make_unique_for_overwrite`](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique")) | [P1020R1](https://wg21.link/P1020R1)[P1973R1](https://wg21.link/P1973R1) | 11 (unique\_ptr)12 (shared\_ptr) | | 19.28 (16.9)\* | | | | | | Misc `constexpr` bits | [P1032R1](https://wg21.link/P1032R1) | 10 | 13 | 19.28 (16.8)\* | 13.1.6\* | | | | | Remove comparison operators of [`std::span`](container/span "cpp/container/span") | [P1085R2](https://wg21.link/P1085R2) | 10 | 8 | 19.26\* | 10.0.1\* | | | | | Make stateful allocator propagation more consistent for [`operator+(basic_string)`](string/basic_string/operator_plus_ "cpp/string/basic string/operator+") | [P1165R1](https://wg21.link/P1165R1) | 10 | 15 | 19.26\* | | | | | | Consistent container erasure, e.g. [`std::erase(std::vector)`](container/vector/erase2 "cpp/container/vector/erase2"), or [`std::erase_if(std::map)`](container/map/erase_if "cpp/container/map/erase if") | [P1209R0](https://wg21.link/P1209R0) [P1115R3](https://wg21.link/P1115R3) | 9 (P1209R0)10 (P1115R3) | 8 (P1209R0) 11 (P1115R3) | 19.25\* (P1209R0)19.27\* (P1115R3) | 10.0.1\* (P1209R0) 12.0.5\* (P1115R3). 
| | | | | Standard library header units | [P1502R1](https://wg21.link/P1502R1) | 11 | | 19.29 (16.10)\* | | | | | | [`polymorphic_allocator<>`](memory/polymorphic_allocator "cpp/memory/polymorphic allocator") as a vocabulary type | [P0339R6](https://wg21.link/P0339R6) | 9 | | 19.28 (16.9)\* | | | | | | [`std::execution::unseq`](algorithm/execution_policy_tag "cpp/algorithm/execution policy tag") | [P1001R2](https://wg21.link/P1001R2) | 9 | | 19.28 (16.8)\* | | | | | | [`std::lerp()`](numeric/lerp "cpp/numeric/lerp") and [`std::midpoint()`](numeric/midpoint "cpp/numeric/midpoint") | [P0811R3](https://wg21.link/P0811R3) | 9 | 9 | 19.23\* (partial)19.28 (16.8)\* | 11.0.3\* | | | | | Usability enhancements for [`std::span`](container/span "cpp/container/span") | [P1024R3](https://wg21.link/P1024R3) | 10 | 9\*14 | 19.26\* | 11.0.3\* | | | | | DR: Make [`create_directory()`](filesystem/create_directory "cpp/filesystem/create directory") intuitive | [P1164R1](https://wg21.link/P1164R1) | 8.3 | 12 | 19.20\* | 13.0.0\* | | | | | [`std::ssize()`](iterator/size "cpp/iterator/size") and unsigned extent for [`std::span`](container/span "cpp/container/span") | [P1227R2](https://wg21.link/P1227R2) | 10 | 9 | 19.25\* | 11.0.3\* | | | | | Traits for ([un](types/is_unbounded_array "cpp/types/is unbounded array"))[bounded](types/is_bounded_array "cpp/types/is bounded array") arrays | [P1357R1](https://wg21.link/P1357R1) | 9 | 9 | 19.25\* | 11.0.3\* | | | | | [`std::to_array()`](container/array/to_array "cpp/container/array/to array") | [P0325R4](https://wg21.link/P0325R4) | 10 | 10 | 19.25\* | 12.0.0\* | | | | | Efficient access to `[std::basic\_stringbuf](io/basic_stringbuf "cpp/io/basic stringbuf")`’s buffer | [P0408R7](https://wg21.link/P0408R7) | 11 | | 19.29 (16.10)\* | | | | | | [Layout](types/is_layout_compatible "cpp/types/is layout compatible")-[compatibility](types/is_corresponding_member "cpp/types/is corresponding member") and [pointer](types/is_pointer_interconvertible_base_of "cpp/types/is pointer interconvertible base of")-[interconvertibility](types/is_pointer_interconvertible_with_class "cpp/types/is pointer interconvertible with class") traits | [P0466R5](https://wg21.link/P0466R5) | 12 | | 19.29 (16.10)\*\* | | | | | | [Bit operations](numeric#Bit_manipulation_.28since_C.2B.2B20.29 "cpp/numeric"): `std::` [`rotl()`](numeric/rotl "cpp/numeric/rotl"), [`rotr()`](numeric/rotr "cpp/numeric/rotr"), [`countl_zero()`](numeric/countl_zero "cpp/numeric/countl zero"), [`countl_one()`](numeric/countl_one "cpp/numeric/countl one"), [`countr_zero()`](numeric/countr_zero "cpp/numeric/countr zero"), [`countr_one()`](numeric/countr_one "cpp/numeric/countr one"), [`popcount()`](numeric/popcount "cpp/numeric/popcount"). 
| [P0553R4](https://wg21.link/P0553R4) | 9 | 9 | 19.25\*\*19.28 (16.8)\* | 11.0.3\* | | | | | [Mathematical constants](numeric/constants "cpp/numeric/constants") | [P0631R8](https://wg21.link/P0631R8) | 10 | 11 | 19.25\* | 12.0.5\* | | | | | [Text formatting](utility/format "cpp/utility/format") | [P0645R10](https://wg21.link/P0645R10) | | 14\* | 19.29 (16.10)\* | | | | | | [`std::stop_token`](thread/stop_token "cpp/thread/stop token") and [`std::jthread`](thread/jthread "cpp/thread/jthread") | [P0660R10](https://wg21.link/P0660R10) | 10 | | 19.28 (16.9)\* | | | | | | `constexpr` `[std::allocator](memory/allocator "cpp/memory/allocator")` and related utilities | [P0784R7](https://wg21.link/P0784R7) | 10 | 12 | 19.29 (16.10)\* | 13.0.0\* | | | | | `constexpr` `[std::string](string/basic_string "cpp/string/basic string")` | [P0980R1](https://wg21.link/P0980R1) | 12 | 15 | 19.29 (16.10)\*19.30\*\* | | | | | | `constexpr` `[std::vector](container/vector "cpp/container/vector")` | [P1004R2](https://wg21.link/P1004R2) | 12 | 15 | 19.29 (16.10)\*19.30\*\* | | | | | | Input [range adaptors](ranges "cpp/ranges") | [P1035R7](https://wg21.link/P1035R7) | 10 | | 19.29 (16.10)\* | | | | | | `constexpr` `[std::invoke()](utility/functional/invoke "cpp/utility/functional/invoke")` and related utilities | [P1065R2](https://wg21.link/P1065R2) | 10 | 12 | 19.28 (16.8)\* | 13.0.0\* | | | | | Atomic waiting and notifying, [`std::counting_semaphore`](thread/counting_semaphore "cpp/thread/counting semaphore"), [`std::latch`](thread/latch "cpp/thread/latch") and [`std::barrier`](thread/barrier "cpp/thread/barrier") | [P1135R6](https://wg21.link/P1135R6) | 11 | 11 | 19.28 (16.8)\* | 13.1.6\* | | | | | [`std::source_location`](utility/source_location "cpp/utility/source location") | [P1208R6](https://wg21.link/P1208R6) | 11 | 15 (partial)\* | 19.29 (16.10)\* | | | | | | Adding [`<=>`](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") to the standard library | [P1614R2](https://wg21.link/P1614R2) | 10 | 14 (partial)\* | 19.29 (16.10)\* | 13.1.6\* (partial). 
| | | | | `constexpr` default constructor of `[std::atomic](atomic/atomic "cpp/atomic/atomic")` and `[std::atomic\_flag](atomic/atomic_flag "cpp/atomic/atomic flag")` | [P0883R2](https://wg21.link/P0883R2) | 10 | 13 | 19.26\* | 13.1.6\* | | | | | `constexpr` for [numeric algorithms](numeric#Numeric_operations "cpp/numeric") | [P1645R1](https://wg21.link/P1645R1) | 10 | 12 | 19.26\* | 13.0.0\* | | | | | [Safe integral comparisons](utility#Integer_comparison_functions "cpp/utility") | [P0586R2](https://wg21.link/P0586R2) | 10 | 13 | 19.27\* | 13.1.6\* | | | | | C++20 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | C++17 features --------------- * [C++17 core language features](compiler_support/17 "cpp/compiler support/17") * [C++17 library features](compiler_support/17#C.2B.2B17_library_features "cpp/compiler support/17") C++14 features --------------- * [C++14 core language features](compiler_support/14 "cpp/compiler support/14") * [C++14 library features](compiler_support/14#C.2B.2B14_library_features "cpp/compiler support/14") C++11 features --------------- * [C++11 core language features](compiler_support/11 "cpp/compiler support/11") * [C++11 library features](compiler_support/11#C.2B.2B11_library_features "cpp/compiler support/11") References ----------- Individual vendor compatibility checklists (these are more up-to-date than the table above). * GCC (Updated 2021-02) + [C++11 core language support status](https://gcc.gnu.org/projects/cxx-status.html#cxx11) (complete as of 4.8.1, except for n2670, which is implemented by no compiler and removed in C++23) + [C++14 core language support status](https://gcc.gnu.org/projects/cxx-status.html#cxx14) (complete as of 5.1) + [C++17 core language support status](https://gcc.gnu.org/projects/cxx-status.html#cxx17) (complete as of 7.1) + [C++20 core language support status](https://gcc.gnu.org/projects/cxx-status.html#cxx20)(complete as of 11.0, except part of modules) + [C++23 core language support status](https://gcc.gnu.org/projects/cxx-status.html#cxx23) + [C++11 library support status](https://gcc.gnu.org/onlinedocs/libstdc++/manual/status.html#status.iso.2011) (complete as of 5.1) + [C++14 library support status](https://gcc.gnu.org/onlinedocs/libstdc++/manual/status.html#status.iso.2014) (complete as of 5.1) + [C++17 library support status](https://gcc.gnu.org/onlinedocs/libstdc++/manual/status.html#status.iso.2017) (complete as of 12.0) + [C++20 library support status](https://gcc.gnu.org/onlinedocs/libstdc++/manual/status.html#status.iso.2020) + [C++23 library support status](https://gcc.gnu.org/onlinedocs/libstdc++/manual/status.html#status.iso.2023) + [Technical Specifications support status](https://gcc.gnu.org/projects/cxx-status.html#tses) + [Core language defect report status](https://gcc.gnu.org/projects/cxx-dr-status.html) * Clang++ (Updated 2021-07) + [C++11 core language support status](https://clang.llvm.org/cxx_status.html#cxx11) (complete as of 3.3) + C++11 library support status (complete as of [2012-07-29](https://github.com/llvm-mirror/libcxx/commit/5fec82dc0db3623546038e4a86baa44f749e554f#diff-c330060c0d4b6fb493c2be0ff80a3f7e)) + [C++14 core language support status](https://clang.llvm.org/cxx_status.html#cxx14) (complete as of 3.4) + [C++14 library support status](https://libcxx.llvm.org/Status/Cxx14.html) (complete as of 3.5) + [C++17 core language support 
status](https://clang.llvm.org/cxx_status.html#cxx17) (complete as of 5.0) + [C++17 library support status](https://libcxx.llvm.org/Status/Cxx17.html) + [C++20 core language support status](https://clang.llvm.org/cxx_status.html#cxx20) + [C++20 library support status](https://libcxx.llvm.org/Status/Cxx20.html) + [C++23 core language support status](https://clang.llvm.org/cxx_status.html#cxx23) + [C++23 library support status](https://libcxx.llvm.org/Status/Cxx2b.html) + [Technical Specifications support status](https://clang.llvm.org/cxx_status.html#ts) + [Core language defect report status](https://clang.llvm.org/cxx_dr_status.html) * Apple Clang (Updated 2019-06) + [Xcode toolchain versions on Wikipedia](https://en.wikipedia.org/wiki/Xcode#Toolchain_versions) + [Xcode release notes](https://developer.apple.com/documentation/xcode_release_notes) * Microsoft Visual Studio (updated 2022-03) + [Microsoft C/C++ language conformance (Visual Studio 2015 and later)](https://docs.microsoft.com/en-us/cpp/overview/visual-cpp-language-conformance) + [Upcoming releases Visual Studio 2022 change log](https://github.com/microsoft/STL/wiki/Changelog) + [C++17/20 Features and Fixes in Visual Studio 2019](https://devblogs.microsoft.com/cppblog/cpp17-20-features-and-fixes-in-vs-2019/) + [STL Features and Fixes in VS 2017 15.8](https://devblogs.microsoft.com/cppblog/stl-features-and-fixes-in-vs-2017-15-8/) + C++17 [Announcing: MSVC Conforms to the C++ Standard](https://devblogs.microsoft.com/cppblog/announcing-msvc-conforms-to-the-c-standard/) (complete as of 15.7) + C++17 [Features And STL Fixes In VS 2017 15.5](https://devblogs.microsoft.com/cppblog/c17-progress-in-vs-2017-15-5-and-15-6/) + C++17 [Features And STL Fixes In VS 2017 15.3](https://devblogs.microsoft.com/cppblog/c17-features-and-stl-fixes-in-vs-2017-15-3/) + C++11/C++14/C++17 [core language and library status in VS2017.3](https://devblogs.microsoft.com/cppblog/c17-features-in-vs-2017-3/) + C++11/C++14/C++17 core language support status - [C++11/14/17 core language support status in VS2010, VS2012, VS2013, and VS2015](https://msdn.microsoft.com/en-us/library/hh567368.aspx#featurelist) - [VS2013 vs. VS2015 CTP0](https://devblogs.microsoft.com/cppblog/c1114-core-language-features-in-vs-2013-and-the-nov-2013-ctp/) - [VS2013 vs. VS2015 CTP1](https://devblogs.microsoft.com/cppblog/c1114-feature-tables-for-visual-studio-14-ctp1/) - [VS2013 vs. 
VS2015 CTP3](https://devblogs.microsoft.com/cppblog/c1114-features-in-visual-studio-14-ctp3/) (includes the roadmap table) - [VS2015 ("VS14") preview](https://devblogs.microsoft.com/cppblog/c111417-features-in-vs-2015-preview/) - [VS2015 ("VS14") release candidate](https://devblogs.microsoft.com/cppblog/c111417-features-in-vs-2015-rc/) (C++11 remains incomplete, but C++17 support appears) - [VS2019](https://docs.microsoft.com/en-us/cpp/overview/visual-cpp-language-conformance) + [C++11 and C++14 library support status](https://msdn.microsoft.com/en-us/library/hh567368.aspx#stl) + [C++11/14/17 Features In VS 2015 RTM](https://devblogs.microsoft.com/cppblog/c111417-features-in-vs-2015-rtm/) including core language and standard library (including technical specifications) + [C++14/17 features in VS 2015 Update 2 standard library](https://devblogs.microsoft.com/cppblog/vs-2015-update-2s-stl-is-c17-so-far-feature-complete/) library is feature complete up to current C++17 with few minor issues (some defect reports, some constexprs, etc) + [C++14/17 Features and STL Fixes in VS “15” Preview 5](https://devblogs.microsoft.com/cppblog/c1417-features-and-stl-fixes-in-vs-15-preview-5/) including a detailed C++17 status table * Intel C++ (Updated 2022-01) + [C++11 core language support status](https://software.intel.com/en-us/articles/c0x-features-supported-by-intel-c-compiler) (complete as of 15.0) + [C++14 core language support status](https://software.intel.com/en-us/articles/c14-features-supported-by-intel-c-compiler) (functionally complete as of 17.0 - N3664 is an optimization) + [C++17 core language support status](https://software.intel.com/en-us/articles/c17-features-supported-by-intel-c-compiler) (incomplete) + [C++20 core language support status](https://www.intel.com/content/www/us/en/developer/articles/technical/c20-features-supported-by-intel-cpp-compiler.html) (incomplete) + [C++17 features of Intel 19.0 beta](https://software.intel.com/en-us/articles/intel-c-compiler-190-for-linux-release-notes-for-intel-parallel-studio-xe-2019#cpp17) + Intel does not ship an implementation of the C++ standard library, except for - [Parallel STL](https://software.intel.com/en-us/get-started-with-pstl) (an implementation of the C++17 standard library algorithms with support for execution policies) + [Intel's compatibility with versions of libstdc++ from GCC](https://charm.cs.illinois.edu/redmine/issues/1560#note-6) * EDG (Updated 2022-03) + [C++11 core language support status](https://www.edg.com/features.html) + [C++14 core language support status](https://www.edg.com/cpp14_features.html) + [C++17 core language support status](https://www.edg.com/cpp17_features.html) + [C++20 core language support status](https://www.edg.com/cpp20_features.html) + [C++23 core language support status](https://www.edg.com/cpp23_features.html) + EDG does not ship an implementation of the C++ standard library * Oracle C++ (updated 2017-07) + Version number is compiler version, not Oracle Studio version + [C++11 core language support status in 5.13](https://docs.oracle.com/cd/E37069_01/html/E37071/gncix.html) - [Missing C++11 support added in 5.14 (page has a typo, and still says 5.13)](https://docs.oracle.com/cd/E60778_01/html/E60742/gkeza.html#scrolltoc) + [C++14 features added in 5.14](https://docs.oracle.com/cd/E60778_01/html/E60742/gncix.html#scrolltoc) - Full C++14 support added in 5.15. 
+ Oracle ships 4 implementations of the C++ standard library: - libCstd (RogueWave Standard Library version 2), predates C++98 - stlport4 (STLport Standard Library version 4.5.3), predates C++03 - stdcxx4 (Apache Standard Library version 4), predates C++11 - libstdc++ (GCC runtime library, support for C++11 and C++14 depending on release) * IBM XL C++ (updated 2018-05) + IBM XL C++ for Linux - [Core language support status](https://www.ibm.com/support/knowledgecenter/en/SSXVZZ_16.1.0/com.ibm.xlcpp161.lelinux.doc/language_ref/standard_features.html): C++11 complete as of 13.1.6, C++14 partial in 16.1.0 - IBM does not ship an implementation of C++ standard library for Linux (uses GNU libstdc++) + IBM XL C++ for AIX - [Core language support status](https://www.ibm.com/support/knowledgecenter/en/SSGH3R_13.1.3/com.ibm.xlcpp1313.aix.doc/language_ref/cpp0x_exts.html): C++11 partial in 13.1.3 and 16.1.0 (xlC frontend), complete in 16.1.0 (xlclang frontend) - IBM ships [a version of Dinkumware library](https://www-01.ibm.com/support/knowledgecenter/SSGH3R_13.1.0/com.ibm.xlcpp131.aix.doc/standlib/header_files.html?lang=en) for AIX with full support for C++ TR1, including <regex>, but no C++11 - [IBM XL C/C++ compilers features](https://www.ibm.com/support/pages/ibm-xl-cc-compilers-features) * HP aCC + [HP aC++ A.06.28 release notes (including C++11 core language features)](https://h20565.www2.hpe.com/hpsc/doc/public/display?sp4ts.oid=4145774&docLocale=en_US&docId=emr_na-c04221956) + HP ships a version of RogueWave STL 2.0 implementation of the C++98 standard library * Digital Mars C++ + [C++11 core language support status](https://www.digitalmars.com/ctg/CPP0x-Language-Implementation.html) * Embarcadero C++ + [Language features compliance status](https://docwiki.embarcadero.com/RADStudio/Berlin/en/C%2B%2B11_Language_Features_Compliance_Status) (RAD Studio 10.1 Berlin), including C++11 features supported by legacy and Clang-enhanced compilers (based on Clang 3.3) + [Language features compliance status](https://docwiki.embarcadero.com/RADStudio/Rio/en/Modern_C%2B%2B_Language_Features_Compliance_Status) (RAD Studio 10.3 Rio), including C++11 features supported by legacy compilers and C++11, C++14, and C++17 features supported by the Clang-enhanced compilers (based on Clang 5.0) * Cray (updated 2020-02) + [Cray C and C++ Reference Manual (8.4)](https://docs.cray.com/books/S-2179-84/S-2179-84.pdf) For version 8.4, claims all of C++14 is supported except alignas + [Cray C and C++ Reference Manual (8.6)](https://pubs.cray.com/content/S-2179/8.6/cray-c-and-c++-reference-manual-s-2179-86/cray-c-and-c++-dialect-use#concept_kgd_fcr_3s) For version 8.6, claims all of C++14 is supported + [Cray C and C++ Reference Manual (9.1)](https://pubs.cray.com/pdf-attachments/attachment?pubId=00761211-DC&attachmentId=pub_00761211-DC.pdf) for version 9.1 doesn't claim support beyond C++14 * Portland Group (PGI) (updated 2019-01) + [Release notes for 2016](https://www.pgroup.com/doc/pgirn161.pdf) claim C++14 support, except "generalized constexpr and constexpr member functions and implicit const, variable templates, clarifying memory allocation (merged allocation)" + [Release notes for 2018](https://www.pgroup.com/resources/docs/18.1/pdf/pgirn181-x86.pdf) + [Reference manual of PGI 19.1](https://www.pgroup.com/resources/docs/19.1/x86/pgi-ref-guide/index.htm) + PGI does not ship an implementation of C++ standard library * Nvidia Cuda nvcc (updated 2021-03-04) + [CUDA C Programming Guide 
(v11.2.1)](https://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#c-cplusplus-language-support) + "All C++17 language features are supported in nvcc version 11.0 and later, subject to restrictions described [here](https://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#cpp17)." + NVCC does not ship an implementation of C++ standard library * Texas Instruments (updated 2018-05) + [cl430 version v18.1.0](https://www.ti.com/lit/ug/slau132r/slau132r.pdf) claims C++14 support * Analog Devices (updated 2018-05) + [CrossCore Embedded Studio 2.8.0 for SHARC](https://www.analog.com/media/en/dsp-documentation/software-manuals/cces-SharcCompiler-manual.pdf) claims C++11 support.
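The checklists above report support from the vendor's side. From inside a program, the feature-test macros and the `<version>` header (see P0941R2 and P0754R2 in the C++20 tables) let a translation unit query that support directly at compile time. The sketch below is not taken from any vendor document and only illustrates the mechanism, assuming a C++11-or-later compiler: the macro names (`__has_include`, `__cpp_concepts`, `__cpp_lib_format`) are the standard feature-test macros, while the helper `join_with_commas` is a made-up placeholder.

```cpp
// Minimal sketch of compile-time feature detection via the standard
// feature-test macros. Only the macro names are standard; the helper
// join_with_commas is an illustrative placeholder.
#if defined(__has_include)
#  if __has_include(<version>)
#    include <version>  // C++20: defines the library feature-test macros
#  endif
#endif

#include <string>
#include <vector>

#if defined(__cpp_lib_format)
#  include <format>     // std::format is available
#endif

#if defined(__cpp_concepts)
// The compiler implements C++20 concepts: constrain the element type.
template<class T>
concept Stringable = requires(T t) { std::to_string(t); };
template<Stringable T>
#else
// Fallback for compilers without concepts support.
template<class T>
#endif
std::string join_with_commas(const std::vector<T>& values)
{
    std::string out;
    for (const T& v : values)
    {
#if defined(__cpp_lib_format)
        out += std::format("{}, ", v);    // library supports <format>
#else
        out += std::to_string(v) + ", ";  // portable pre-C++20 path
#endif
    }
    if (!out.empty())
        out.erase(out.size() - 2);        // drop the trailing ", "
    return out;
}

int main()
{
    // "1, 2, 3" has 7 characters regardless of which branch was compiled.
    return join_with_commas(std::vector<int>{1, 2, 3}).size() == 7 ? 0 : 1;
}
```

A compiler and library combination that the C++20 tables mark as implementing P0941R2, P0734R0, and P0645R10 would take the C++20 branches; anything older silently falls back to the plain C++11 path.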
cpp C++11 C++11 ===== **C++11** is the second major version of C++ and the most important update since C++98. A large number of changes were introduced both to standardize existing practice and to improve the abstractions available to C++ programmers. Before it was finally approved by ISO on 12 August 2011, the name 'C++0x' was used, because the standard had been expected to be published before 2010. The eight-year gap between C++03 and C++11 remains the longest interval between versions so far; since then, C++ has been updated on a regular three-year cycle. The following features were merged into C++11: * From [TR1](https://en.cppreference.com/w/cpp/experimental "cpp/experimental"): all of TR1 except [Special Functions](numeric/special_functions "cpp/numeric/special functions"). * From Boost: [The thread library](thread "cpp/thread"), [`exception_ptr`](error/exception_ptr "cpp/error/exception ptr"), [`error_code`](error/error_code "cpp/error/error code") and [`error_condition`](error/error_condition "cpp/error/error condition"), iterator improvements ([`begin`](iterator/begin "cpp/iterator/begin"), [`end`](iterator/end "cpp/iterator/end"), [`next`](iterator/next "cpp/iterator/next"), [`prev`](iterator/prev "cpp/iterator/prev")) * From C: C-style Unicode conversion functions Core language features ----------------------- * [`auto`](language/auto "cpp/language/auto") and [`decltype`](language/decltype "cpp/language/decltype") * [defaulted](language/function#Function_definition "cpp/language/function") and [deleted](language/function#Deleted_functions "cpp/language/function") functions * [`final`](language/final "cpp/language/final") and [`override`](language/override "cpp/language/override") * [trailing return type](language/function#Function_declaration "cpp/language/function") * [rvalue references](language/reference "cpp/language/reference") * [move constructors](language/move_constructor "cpp/language/move constructor") and [move assignment operators](language/move_assignment "cpp/language/move assignment") * [scoped enums](language/enum "cpp/language/enum") * [`constexpr`](language/constexpr "cpp/language/constexpr") and [literal types](named_req/literaltype "cpp/named req/LiteralType") * [list initialization](language/list_initialization "cpp/language/list initialization") * [delegating](language/initializer_list#Delegating_constructor "cpp/language/initializer list") and [inherited](language/using_declaration "cpp/language/using declaration") constructors * brace-or-equal [initializers](language/initialization "cpp/language/initialization") * [`nullptr`](language/nullptr "cpp/language/nullptr") * [`long long`](language/types#Integer_types "cpp/language/types") * [`char16_t` and `char32_t`](language/types#Character_types "cpp/language/types") * [type aliases](language/type_alias "cpp/language/type alias") * [variadic templates](language/parameter_pack "cpp/language/parameter pack") * [generalized (non-trivial) unions](language/union "cpp/language/union") * [generalized PODs](named_req/podtype "cpp/named req/PODType") ([trivial types](named_req/trivialtype "cpp/named req/TrivialType") and [standard-layout types](named_req/standardlayouttype "cpp/named req/StandardLayoutType")) * [Unicode string literals](language/string_literal "cpp/language/string literal") * [user-defined literals](language/user_literal "cpp/language/user literal") * [attributes](language/attributes "cpp/language/attributes") * [lambda expressions](language/lambda "cpp/language/lambda") * [`noexcept`](language/noexcept_spec 
"cpp/language/noexcept spec") specifier and [`noexcept`](language/noexcept "cpp/language/noexcept") operator * [`alignof`](language/alignof "cpp/language/alignof") and [`alignas`](language/alignas "cpp/language/alignas") * multithreaded [memory model](language/memory_model "cpp/language/memory model") * [thread-local storage](language/storage_duration#Storage_duration "cpp/language/storage duration") * [GC interface](memory#Garbage_collector_support "cpp/memory") (removed in C++23) * [range-`for`](language/range-for "cpp/language/range-for") (based on a Boost library) * [`static_assert`](language/static_assert "cpp/language/static assert") (based on a Boost library) Library features ----------------- ### Headers * [`<array>`](header/array "cpp/header/array") * [`<atomic>`](header/atomic "cpp/header/atomic") * [`<cfenv>`](header/cfenv "cpp/header/cfenv") * [`<chrono>`](header/chrono "cpp/header/chrono") * [`<cinttypes>`](header/cinttypes "cpp/header/cinttypes") * [`<condition_variable>`](header/condition_variable "cpp/header/condition variable") * [`<cstdint>`](header/cstdint "cpp/header/cstdint") * [`<cuchar>`](header/cuchar "cpp/header/cuchar") * [`<forward_list>`](header/forward_list "cpp/header/forward list") * [`<future>`](header/future "cpp/header/future") * [`<initializer_list>`](header/initializer_list "cpp/header/initializer list") * [`<mutex>`](header/mutex "cpp/header/mutex") * [`<random>`](header/random "cpp/header/random") * [`<ratio>`](header/ratio "cpp/header/ratio") * [`<regex>`](header/regex "cpp/header/regex") * [`<scoped_allocator>`](header/scoped_allocator "cpp/header/scoped allocator") * [`<system_error>`](header/system_error "cpp/header/system error") * [`<thread>`](header/thread "cpp/header/thread") * [`<tuple>`](header/tuple "cpp/header/tuple") * [`<typeindex>`](header/typeindex "cpp/header/typeindex") * [`<type_traits>`](header/type_traits "cpp/header/type traits") * [`<unordered_map>`](header/unordered_map "cpp/header/unordered map") * [`<unordered_set>`](header/unordered_set "cpp/header/unordered set") ### Library features * [concurrency support library](thread "cpp/thread") * `emplace()` and other use of rvalue references throughout all parts of the existing library * `[std::unique\_ptr](memory/unique_ptr "cpp/memory/unique ptr")` * `[std::move\_iterator](iterator/move_iterator "cpp/iterator/move iterator")` * `[std::initializer\_list](utility/initializer_list "cpp/utility/initializer list")` * [stateful](named_req/allocator#Stateful_and_stateless_allocators "cpp/named req/Allocator") and [scoped](memory/scoped_allocator_adaptor "cpp/memory/scoped allocator adaptor") allocators * `[std::forward\_list](container/forward_list "cpp/container/forward list")` * [chrono library](chrono "cpp/chrono") * [ratio library](numeric/ratio "cpp/numeric/ratio") * new [algorithms](algorithm "cpp/algorithm"): + `[std::all\_of](algorithm/all_any_none_of "cpp/algorithm/all any none of")`, `[std::any\_of](algorithm/all_any_none_of "cpp/algorithm/all any none of")`, `[std::none\_of](algorithm/all_any_none_of "cpp/algorithm/all any none of")`, + `[std::find\_if\_not](algorithm/find "cpp/algorithm/find")`, + `[std::copy\_if](algorithm/copy "cpp/algorithm/copy")`, `[std::copy\_n](algorithm/copy_n "cpp/algorithm/copy n")`, + [std::move](algorithm/move "cpp/algorithm/move"), `[std::move\_backward](algorithm/move_backward "cpp/algorithm/move backward")`, + `[std::random\_shuffle](algorithm/random_shuffle "cpp/algorithm/random shuffle")`, `[std::shuffle](algorithm/random_shuffle 
"cpp/algorithm/random shuffle")`, + `[std::is\_partitioned](algorithm/is_partitioned "cpp/algorithm/is partitioned")`, `[std::partition\_copy](algorithm/partition_copy "cpp/algorithm/partition copy")`, `[std::partition\_point](algorithm/partition_point "cpp/algorithm/partition point")`, + `[std::is\_sorted](algorithm/is_sorted "cpp/algorithm/is sorted")`, `[std::is\_sorted\_until](algorithm/is_sorted_until "cpp/algorithm/is sorted until")`, + `[std::is\_heap](algorithm/is_heap "cpp/algorithm/is heap")`, `[std::is\_heap\_until](algorithm/is_heap_until "cpp/algorithm/is heap until")`, + `[std::minmax](algorithm/minmax "cpp/algorithm/minmax")`, `[std::minmax\_element](algorithm/minmax_element "cpp/algorithm/minmax element")`, + `[std::is\_permutation](algorithm/is_permutation "cpp/algorithm/is permutation")`, + `[std::iota](algorithm/iota "cpp/algorithm/iota")`, + `[std::uninitialized\_copy\_n](memory/uninitialized_copy_n "cpp/memory/uninitialized copy n")` * [Unicode conversion facets](locale#Locale-independent_unicode_conversion_facets "cpp/locale") * `[std::function](utility/functional/function "cpp/utility/functional/function")` * `[std::exception\_ptr](error/exception_ptr "cpp/error/exception ptr")` * `[std::error\_code](error/error_code "cpp/error/error code")` and `[std::error\_condition](error/error_condition "cpp/error/error condition")` * [iterator](iterator "cpp/iterator") improvements: + `[std::begin](iterator/begin "cpp/iterator/begin")` + `[std::end](iterator/end "cpp/iterator/end")` + `[std::next](iterator/next "cpp/iterator/next")` + `[std::prev](iterator/prev "cpp/iterator/prev")` * [Unicode conversion functions](string/multibyte "cpp/string/multibyte") Defect reports --------------- | Defect Reports fixed in C++11 (741 core, 868 library) | | --- | | * [CWG#4](https://wg21.cmeerw.net/cwg/issue4) * [CWG#5](https://wg21.cmeerw.net/cwg/issue5) * [CWG#8](https://wg21.cmeerw.net/cwg/issue8) * [CWG#9](https://wg21.cmeerw.net/cwg/issue9) * [CWG#10](https://wg21.cmeerw.net/cwg/issue10) * [CWG#11](https://wg21.cmeerw.net/cwg/issue11) * [CWG#16](https://wg21.cmeerw.net/cwg/issue16) * [CWG#28](https://wg21.cmeerw.net/cwg/issue28) * [CWG#29](https://wg21.cmeerw.net/cwg/issue29) * [CWG#39](https://wg21.cmeerw.net/cwg/issue39) * [CWG#44](https://wg21.cmeerw.net/cwg/issue44) * [CWG#45](https://wg21.cmeerw.net/cwg/issue45) * [CWG#54](https://wg21.cmeerw.net/cwg/issue54) * [CWG#58](https://wg21.cmeerw.net/cwg/issue58) * [CWG#60](https://wg21.cmeerw.net/cwg/issue60) * [CWG#62](https://wg21.cmeerw.net/cwg/issue62) * [CWG#63](https://wg21.cmeerw.net/cwg/issue63) * [CWG#70](https://wg21.cmeerw.net/cwg/issue70) * [CWG#77](https://wg21.cmeerw.net/cwg/issue77) * [CWG#78](https://wg21.cmeerw.net/cwg/issue78) * [CWG#86](https://wg21.cmeerw.net/cwg/issue86) * [CWG#87](https://wg21.cmeerw.net/cwg/issue87) * [CWG#96](https://wg21.cmeerw.net/cwg/issue96) * [CWG#106](https://wg21.cmeerw.net/cwg/issue106) * [CWG#112](https://wg21.cmeerw.net/cwg/issue112) * [CWG#113](https://wg21.cmeerw.net/cwg/issue113) * [CWG#115](https://wg21.cmeerw.net/cwg/issue115) * [CWG#118](https://wg21.cmeerw.net/cwg/issue118) * [CWG#119](https://wg21.cmeerw.net/cwg/issue119) * [CWG#122](https://wg21.cmeerw.net/cwg/issue122) * [CWG#124](https://wg21.cmeerw.net/cwg/issue124) * [CWG#125](https://wg21.cmeerw.net/cwg/issue125) * [CWG#136](https://wg21.cmeerw.net/cwg/issue136) * [CWG#139](https://wg21.cmeerw.net/cwg/issue139) * [CWG#140](https://wg21.cmeerw.net/cwg/issue140) * [CWG#141](https://wg21.cmeerw.net/cwg/issue141) * 
[CWG#143](https://wg21.cmeerw.net/cwg/issue143) * [CWG#158](https://wg21.cmeerw.net/cwg/issue158) * [CWG#160](https://wg21.cmeerw.net/cwg/issue160) * [CWG#162](https://wg21.cmeerw.net/cwg/issue162) * [CWG#172](https://wg21.cmeerw.net/cwg/issue172) * [CWG#175](https://wg21.cmeerw.net/cwg/issue175) * [CWG#177](https://wg21.cmeerw.net/cwg/issue177) * [CWG#180](https://wg21.cmeerw.net/cwg/issue180) * [CWG#184](https://wg21.cmeerw.net/cwg/issue184) * [CWG#195](https://wg21.cmeerw.net/cwg/issue195) * [CWG#197](https://wg21.cmeerw.net/cwg/issue197) * [CWG#198](https://wg21.cmeerw.net/cwg/issue198) * [CWG#199](https://wg21.cmeerw.net/cwg/issue199) * [CWG#201](https://wg21.cmeerw.net/cwg/issue201) * [CWG#204](https://wg21.cmeerw.net/cwg/issue204) * [CWG#207](https://wg21.cmeerw.net/cwg/issue207) * [CWG#208](https://wg21.cmeerw.net/cwg/issue208) * [CWG#214](https://wg21.cmeerw.net/cwg/issue214) * [CWG#215](https://wg21.cmeerw.net/cwg/issue215) * [CWG#216](https://wg21.cmeerw.net/cwg/issue216) * [CWG#218](https://wg21.cmeerw.net/cwg/issue218) * [CWG#220](https://wg21.cmeerw.net/cwg/issue220) * [CWG#221](https://wg21.cmeerw.net/cwg/issue221) * [CWG#222](https://wg21.cmeerw.net/cwg/issue222) * [CWG#224](https://wg21.cmeerw.net/cwg/issue224) * [CWG#226](https://wg21.cmeerw.net/cwg/issue226) * [CWG#228](https://wg21.cmeerw.net/cwg/issue228) * [CWG#237](https://wg21.cmeerw.net/cwg/issue237) * [CWG#239](https://wg21.cmeerw.net/cwg/issue239) * [CWG#244](https://wg21.cmeerw.net/cwg/issue244) * [CWG#245](https://wg21.cmeerw.net/cwg/issue245) * [CWG#246](https://wg21.cmeerw.net/cwg/issue246) * [CWG#248](https://wg21.cmeerw.net/cwg/issue248) * [CWG#252](https://wg21.cmeerw.net/cwg/issue252) * [CWG#254](https://wg21.cmeerw.net/cwg/issue254) * [CWG#256](https://wg21.cmeerw.net/cwg/issue256) * [CWG#257](https://wg21.cmeerw.net/cwg/issue257) * [CWG#258](https://wg21.cmeerw.net/cwg/issue258) * [CWG#259](https://wg21.cmeerw.net/cwg/issue259) * [CWG#261](https://wg21.cmeerw.net/cwg/issue261) * [CWG#262](https://wg21.cmeerw.net/cwg/issue262) * [CWG#263](https://wg21.cmeerw.net/cwg/issue263) * [CWG#270](https://wg21.cmeerw.net/cwg/issue270) * [CWG#272](https://wg21.cmeerw.net/cwg/issue272) * [CWG#273](https://wg21.cmeerw.net/cwg/issue273) * [CWG#274](https://wg21.cmeerw.net/cwg/issue274) * [CWG#275](https://wg21.cmeerw.net/cwg/issue275) * [CWG#276](https://wg21.cmeerw.net/cwg/issue276) * [CWG#277](https://wg21.cmeerw.net/cwg/issue277) * [CWG#280](https://wg21.cmeerw.net/cwg/issue280) * [CWG#281](https://wg21.cmeerw.net/cwg/issue281) * [CWG#283](https://wg21.cmeerw.net/cwg/issue283) * [CWG#284](https://wg21.cmeerw.net/cwg/issue284) * [CWG#286](https://wg21.cmeerw.net/cwg/issue286) * [CWG#288](https://wg21.cmeerw.net/cwg/issue288) * [CWG#289](https://wg21.cmeerw.net/cwg/issue289) * [CWG#291](https://wg21.cmeerw.net/cwg/issue291) * [CWG#295](https://wg21.cmeerw.net/cwg/issue295) * [CWG#296](https://wg21.cmeerw.net/cwg/issue296) * [CWG#298](https://wg21.cmeerw.net/cwg/issue298) * [CWG#299](https://wg21.cmeerw.net/cwg/issue299) * [CWG#300](https://wg21.cmeerw.net/cwg/issue300) * [CWG#301](https://wg21.cmeerw.net/cwg/issue301) * [CWG#302](https://wg21.cmeerw.net/cwg/issue302) * [CWG#305](https://wg21.cmeerw.net/cwg/issue305) * [CWG#306](https://wg21.cmeerw.net/cwg/issue306) * [CWG#309](https://wg21.cmeerw.net/cwg/issue309) * [CWG#317](https://wg21.cmeerw.net/cwg/issue317) * [CWG#318](https://wg21.cmeerw.net/cwg/issue318) * [CWG#319](https://wg21.cmeerw.net/cwg/issue319) * [CWG#320](https://wg21.cmeerw.net/cwg/issue320) * 
[CWG#322](https://wg21.cmeerw.net/cwg/issue322) * [CWG#323](https://wg21.cmeerw.net/cwg/issue323) * [CWG#324](https://wg21.cmeerw.net/cwg/issue324) * [CWG#326](https://wg21.cmeerw.net/cwg/issue326) * [CWG#327](https://wg21.cmeerw.net/cwg/issue327) * [CWG#328](https://wg21.cmeerw.net/cwg/issue328) * [CWG#329](https://wg21.cmeerw.net/cwg/issue329) * [CWG#331](https://wg21.cmeerw.net/cwg/issue331) * [CWG#335](https://wg21.cmeerw.net/cwg/issue335) * [CWG#336](https://wg21.cmeerw.net/cwg/issue336) * [CWG#337](https://wg21.cmeerw.net/cwg/issue337) * [CWG#339](https://wg21.cmeerw.net/cwg/issue339) * [CWG#341](https://wg21.cmeerw.net/cwg/issue341) * [CWG#345](https://wg21.cmeerw.net/cwg/issue345) * [CWG#348](https://wg21.cmeerw.net/cwg/issue348) * [CWG#349](https://wg21.cmeerw.net/cwg/issue349) * [CWG#351](https://wg21.cmeerw.net/cwg/issue351) * [CWG#352](https://wg21.cmeerw.net/cwg/issue352) * [CWG#353](https://wg21.cmeerw.net/cwg/issue353) * [CWG#354](https://wg21.cmeerw.net/cwg/issue354) * [CWG#355](https://wg21.cmeerw.net/cwg/issue355) * [CWG#357](https://wg21.cmeerw.net/cwg/issue357) * [CWG#362](https://wg21.cmeerw.net/cwg/issue362) * [CWG#364](https://wg21.cmeerw.net/cwg/issue364) * [CWG#366](https://wg21.cmeerw.net/cwg/issue366) * [CWG#367](https://wg21.cmeerw.net/cwg/issue367) * [CWG#368](https://wg21.cmeerw.net/cwg/issue368) * [CWG#370](https://wg21.cmeerw.net/cwg/issue370) * [CWG#372](https://wg21.cmeerw.net/cwg/issue372) * [CWG#373](https://wg21.cmeerw.net/cwg/issue373) * [CWG#374](https://wg21.cmeerw.net/cwg/issue374) * [CWG#377](https://wg21.cmeerw.net/cwg/issue377) * [CWG#378](https://wg21.cmeerw.net/cwg/issue378) * [CWG#379](https://wg21.cmeerw.net/cwg/issue379) * [CWG#381](https://wg21.cmeerw.net/cwg/issue381) * [CWG#382](https://wg21.cmeerw.net/cwg/issue382) * [CWG#383](https://wg21.cmeerw.net/cwg/issue383) * [CWG#385](https://wg21.cmeerw.net/cwg/issue385) * [CWG#387](https://wg21.cmeerw.net/cwg/issue387) * [CWG#389](https://wg21.cmeerw.net/cwg/issue389) * [CWG#390](https://wg21.cmeerw.net/cwg/issue390) * [CWG#391](https://wg21.cmeerw.net/cwg/issue391) * [CWG#392](https://wg21.cmeerw.net/cwg/issue392) * [CWG#394](https://wg21.cmeerw.net/cwg/issue394) * [CWG#396](https://wg21.cmeerw.net/cwg/issue396) * [CWG#397](https://wg21.cmeerw.net/cwg/issue397) * [CWG#398](https://wg21.cmeerw.net/cwg/issue398) * [CWG#400](https://wg21.cmeerw.net/cwg/issue400) * [CWG#401](https://wg21.cmeerw.net/cwg/issue401) * [CWG#403](https://wg21.cmeerw.net/cwg/issue403) * [CWG#404](https://wg21.cmeerw.net/cwg/issue404) * [CWG#406](https://wg21.cmeerw.net/cwg/issue406) * [CWG#407](https://wg21.cmeerw.net/cwg/issue407) * [CWG#408](https://wg21.cmeerw.net/cwg/issue408) * [CWG#409](https://wg21.cmeerw.net/cwg/issue409) * [CWG#410](https://wg21.cmeerw.net/cwg/issue410) * [CWG#413](https://wg21.cmeerw.net/cwg/issue413) * [CWG#414](https://wg21.cmeerw.net/cwg/issue414) * [CWG#415](https://wg21.cmeerw.net/cwg/issue415) * [CWG#416](https://wg21.cmeerw.net/cwg/issue416) * [CWG#417](https://wg21.cmeerw.net/cwg/issue417) * [CWG#420](https://wg21.cmeerw.net/cwg/issue420) * [CWG#421](https://wg21.cmeerw.net/cwg/issue421) * [CWG#424](https://wg21.cmeerw.net/cwg/issue424) * [CWG#425](https://wg21.cmeerw.net/cwg/issue425) * [CWG#427](https://wg21.cmeerw.net/cwg/issue427) * [CWG#428](https://wg21.cmeerw.net/cwg/issue428) * [CWG#429](https://wg21.cmeerw.net/cwg/issue429) * [CWG#430](https://wg21.cmeerw.net/cwg/issue430) * [CWG#431](https://wg21.cmeerw.net/cwg/issue431) * [CWG#432](https://wg21.cmeerw.net/cwg/issue432) * 
[CWG#433](https://wg21.cmeerw.net/cwg/issue433) * [CWG#436](https://wg21.cmeerw.net/cwg/issue436) * [CWG#437](https://wg21.cmeerw.net/cwg/issue437) * [CWG#438](https://wg21.cmeerw.net/cwg/issue438) * [CWG#439](https://wg21.cmeerw.net/cwg/issue439) * [CWG#441](https://wg21.cmeerw.net/cwg/issue441) * [CWG#442](https://wg21.cmeerw.net/cwg/issue442) * [CWG#443](https://wg21.cmeerw.net/cwg/issue443) * [CWG#446](https://wg21.cmeerw.net/cwg/issue446) * [CWG#447](https://wg21.cmeerw.net/cwg/issue447) * [CWG#448](https://wg21.cmeerw.net/cwg/issue448) * [CWG#450](https://wg21.cmeerw.net/cwg/issue450) * [CWG#451](https://wg21.cmeerw.net/cwg/issue451) * [CWG#452](https://wg21.cmeerw.net/cwg/issue452) * [CWG#454](https://wg21.cmeerw.net/cwg/issue454) * [CWG#457](https://wg21.cmeerw.net/cwg/issue457) * [CWG#458](https://wg21.cmeerw.net/cwg/issue458) * [CWG#460](https://wg21.cmeerw.net/cwg/issue460) * [CWG#463](https://wg21.cmeerw.net/cwg/issue463) * [CWG#464](https://wg21.cmeerw.net/cwg/issue464) * [CWG#466](https://wg21.cmeerw.net/cwg/issue466) * [CWG#468](https://wg21.cmeerw.net/cwg/issue468) * [CWG#470](https://wg21.cmeerw.net/cwg/issue470) * [CWG#474](https://wg21.cmeerw.net/cwg/issue474) * [CWG#475](https://wg21.cmeerw.net/cwg/issue475) * [CWG#477](https://wg21.cmeerw.net/cwg/issue477) * [CWG#479](https://wg21.cmeerw.net/cwg/issue479) * [CWG#480](https://wg21.cmeerw.net/cwg/issue480) * [CWG#481](https://wg21.cmeerw.net/cwg/issue481) * [CWG#484](https://wg21.cmeerw.net/cwg/issue484) * [CWG#485](https://wg21.cmeerw.net/cwg/issue485) * [CWG#486](https://wg21.cmeerw.net/cwg/issue486) * [CWG#488](https://wg21.cmeerw.net/cwg/issue488) * [CWG#490](https://wg21.cmeerw.net/cwg/issue490) * [CWG#491](https://wg21.cmeerw.net/cwg/issue491) * [CWG#492](https://wg21.cmeerw.net/cwg/issue492) * [CWG#493](https://wg21.cmeerw.net/cwg/issue493) * [CWG#494](https://wg21.cmeerw.net/cwg/issue494) * [CWG#495](https://wg21.cmeerw.net/cwg/issue495) * [CWG#497](https://wg21.cmeerw.net/cwg/issue497) * [CWG#499](https://wg21.cmeerw.net/cwg/issue499) * [CWG#500](https://wg21.cmeerw.net/cwg/issue500) * [CWG#502](https://wg21.cmeerw.net/cwg/issue502) * [CWG#505](https://wg21.cmeerw.net/cwg/issue505) * [CWG#506](https://wg21.cmeerw.net/cwg/issue506) * [CWG#508](https://wg21.cmeerw.net/cwg/issue508) * [CWG#509](https://wg21.cmeerw.net/cwg/issue509) * [CWG#510](https://wg21.cmeerw.net/cwg/issue510) * [CWG#513](https://wg21.cmeerw.net/cwg/issue513) * [CWG#514](https://wg21.cmeerw.net/cwg/issue514) * [CWG#515](https://wg21.cmeerw.net/cwg/issue515) * [CWG#516](https://wg21.cmeerw.net/cwg/issue516) * [CWG#517](https://wg21.cmeerw.net/cwg/issue517) * [CWG#518](https://wg21.cmeerw.net/cwg/issue518) * [CWG#519](https://wg21.cmeerw.net/cwg/issue519) * [CWG#520](https://wg21.cmeerw.net/cwg/issue520) * [CWG#521](https://wg21.cmeerw.net/cwg/issue521) * [CWG#522](https://wg21.cmeerw.net/cwg/issue522) * [CWG#524](https://wg21.cmeerw.net/cwg/issue524) * [CWG#525](https://wg21.cmeerw.net/cwg/issue525) * [CWG#526](https://wg21.cmeerw.net/cwg/issue526) * [CWG#527](https://wg21.cmeerw.net/cwg/issue527) * [CWG#530](https://wg21.cmeerw.net/cwg/issue530) * [CWG#531](https://wg21.cmeerw.net/cwg/issue531) * [CWG#532](https://wg21.cmeerw.net/cwg/issue532) * [CWG#534](https://wg21.cmeerw.net/cwg/issue534) * [CWG#537](https://wg21.cmeerw.net/cwg/issue537) * [CWG#538](https://wg21.cmeerw.net/cwg/issue538) * [CWG#540](https://wg21.cmeerw.net/cwg/issue540) * [CWG#541](https://wg21.cmeerw.net/cwg/issue541) * [CWG#542](https://wg21.cmeerw.net/cwg/issue542) * 
[CWG#543](https://wg21.cmeerw.net/cwg/issue543) * [CWG#546](https://wg21.cmeerw.net/cwg/issue546) * [CWG#547](https://wg21.cmeerw.net/cwg/issue547) * [CWG#551](https://wg21.cmeerw.net/cwg/issue551) * [CWG#556](https://wg21.cmeerw.net/cwg/issue556) * [CWG#557](https://wg21.cmeerw.net/cwg/issue557) * [CWG#558](https://wg21.cmeerw.net/cwg/issue558) * [CWG#559](https://wg21.cmeerw.net/cwg/issue559) * [CWG#561](https://wg21.cmeerw.net/cwg/issue561) * [CWG#564](https://wg21.cmeerw.net/cwg/issue564) * [CWG#568](https://wg21.cmeerw.net/cwg/issue568) * [CWG#569](https://wg21.cmeerw.net/cwg/issue569) * [CWG#570](https://wg21.cmeerw.net/cwg/issue570) * [CWG#571](https://wg21.cmeerw.net/cwg/issue571) * [CWG#572](https://wg21.cmeerw.net/cwg/issue572) * [CWG#573](https://wg21.cmeerw.net/cwg/issue573) * [CWG#575](https://wg21.cmeerw.net/cwg/issue575) * [CWG#576](https://wg21.cmeerw.net/cwg/issue576) * [CWG#580](https://wg21.cmeerw.net/cwg/issue580) * [CWG#582](https://wg21.cmeerw.net/cwg/issue582) * [CWG#587](https://wg21.cmeerw.net/cwg/issue587) * [CWG#588](https://wg21.cmeerw.net/cwg/issue588) * [CWG#589](https://wg21.cmeerw.net/cwg/issue589) * [CWG#590](https://wg21.cmeerw.net/cwg/issue590) * [CWG#592](https://wg21.cmeerw.net/cwg/issue592) * [CWG#594](https://wg21.cmeerw.net/cwg/issue594) * [CWG#598](https://wg21.cmeerw.net/cwg/issue598) * [CWG#599](https://wg21.cmeerw.net/cwg/issue599) * [CWG#601](https://wg21.cmeerw.net/cwg/issue601) * [CWG#602](https://wg21.cmeerw.net/cwg/issue602) * [CWG#603](https://wg21.cmeerw.net/cwg/issue603) * [CWG#604](https://wg21.cmeerw.net/cwg/issue604) * [CWG#605](https://wg21.cmeerw.net/cwg/issue605) * [CWG#606](https://wg21.cmeerw.net/cwg/issue606) * [CWG#608](https://wg21.cmeerw.net/cwg/issue608) * [CWG#611](https://wg21.cmeerw.net/cwg/issue611) * [CWG#612](https://wg21.cmeerw.net/cwg/issue612) * [CWG#613](https://wg21.cmeerw.net/cwg/issue613) * [CWG#614](https://wg21.cmeerw.net/cwg/issue614) * [CWG#615](https://wg21.cmeerw.net/cwg/issue615) * [CWG#618](https://wg21.cmeerw.net/cwg/issue618) * [CWG#619](https://wg21.cmeerw.net/cwg/issue619) * [CWG#620](https://wg21.cmeerw.net/cwg/issue620) * [CWG#621](https://wg21.cmeerw.net/cwg/issue621) * [CWG#624](https://wg21.cmeerw.net/cwg/issue624) * [CWG#625](https://wg21.cmeerw.net/cwg/issue625) * [CWG#626](https://wg21.cmeerw.net/cwg/issue626) * [CWG#628](https://wg21.cmeerw.net/cwg/issue628) * [CWG#629](https://wg21.cmeerw.net/cwg/issue629) * [CWG#630](https://wg21.cmeerw.net/cwg/issue630) * [CWG#632](https://wg21.cmeerw.net/cwg/issue632) * [CWG#633](https://wg21.cmeerw.net/cwg/issue633) * [CWG#634](https://wg21.cmeerw.net/cwg/issue634) * [CWG#637](https://wg21.cmeerw.net/cwg/issue637) * [CWG#638](https://wg21.cmeerw.net/cwg/issue638) * [CWG#639](https://wg21.cmeerw.net/cwg/issue639) * [CWG#641](https://wg21.cmeerw.net/cwg/issue641) * [CWG#642](https://wg21.cmeerw.net/cwg/issue642) * [CWG#644](https://wg21.cmeerw.net/cwg/issue644) * [CWG#645](https://wg21.cmeerw.net/cwg/issue645) * [CWG#647](https://wg21.cmeerw.net/cwg/issue647) * [CWG#648](https://wg21.cmeerw.net/cwg/issue648) * [CWG#649](https://wg21.cmeerw.net/cwg/issue649) * [CWG#650](https://wg21.cmeerw.net/cwg/issue650) * [CWG#651](https://wg21.cmeerw.net/cwg/issue651) * [CWG#652](https://wg21.cmeerw.net/cwg/issue652) * [CWG#653](https://wg21.cmeerw.net/cwg/issue653) * [CWG#654](https://wg21.cmeerw.net/cwg/issue654) * [CWG#655](https://wg21.cmeerw.net/cwg/issue655) * [CWG#656](https://wg21.cmeerw.net/cwg/issue656) * [CWG#657](https://wg21.cmeerw.net/cwg/issue657) * 
[CWG#658](https://wg21.cmeerw.net/cwg/issue658) * [CWG#659](https://wg21.cmeerw.net/cwg/issue659) * [CWG#660](https://wg21.cmeerw.net/cwg/issue660) * [CWG#661](https://wg21.cmeerw.net/cwg/issue661) * [CWG#663](https://wg21.cmeerw.net/cwg/issue663) * [CWG#664](https://wg21.cmeerw.net/cwg/issue664) * [CWG#665](https://wg21.cmeerw.net/cwg/issue665) * [CWG#666](https://wg21.cmeerw.net/cwg/issue666) * [CWG#667](https://wg21.cmeerw.net/cwg/issue667) * [CWG#668](https://wg21.cmeerw.net/cwg/issue668) * [CWG#671](https://wg21.cmeerw.net/cwg/issue671) * [CWG#672](https://wg21.cmeerw.net/cwg/issue672) * [CWG#674](https://wg21.cmeerw.net/cwg/issue674) * [CWG#676](https://wg21.cmeerw.net/cwg/issue676) * [CWG#677](https://wg21.cmeerw.net/cwg/issue677) * [CWG#678](https://wg21.cmeerw.net/cwg/issue678) * [CWG#679](https://wg21.cmeerw.net/cwg/issue679) * [CWG#680](https://wg21.cmeerw.net/cwg/issue680) * [CWG#681](https://wg21.cmeerw.net/cwg/issue681) * [CWG#683](https://wg21.cmeerw.net/cwg/issue683) * [CWG#684](https://wg21.cmeerw.net/cwg/issue684) * [CWG#685](https://wg21.cmeerw.net/cwg/issue685) * [CWG#686](https://wg21.cmeerw.net/cwg/issue686) * [CWG#688](https://wg21.cmeerw.net/cwg/issue688) * [CWG#690](https://wg21.cmeerw.net/cwg/issue690) * [CWG#691](https://wg21.cmeerw.net/cwg/issue691) * [CWG#692](https://wg21.cmeerw.net/cwg/issue692) * [CWG#693](https://wg21.cmeerw.net/cwg/issue693) * [CWG#694](https://wg21.cmeerw.net/cwg/issue694) * [CWG#695](https://wg21.cmeerw.net/cwg/issue695) * [CWG#696](https://wg21.cmeerw.net/cwg/issue696) * [CWG#699](https://wg21.cmeerw.net/cwg/issue699) * [CWG#700](https://wg21.cmeerw.net/cwg/issue700) * [CWG#701](https://wg21.cmeerw.net/cwg/issue701) * [CWG#702](https://wg21.cmeerw.net/cwg/issue702) * [CWG#703](https://wg21.cmeerw.net/cwg/issue703) * [CWG#704](https://wg21.cmeerw.net/cwg/issue704) * [CWG#705](https://wg21.cmeerw.net/cwg/issue705) * [CWG#707](https://wg21.cmeerw.net/cwg/issue707) * [CWG#709](https://wg21.cmeerw.net/cwg/issue709) * [CWG#710](https://wg21.cmeerw.net/cwg/issue710) * [CWG#711](https://wg21.cmeerw.net/cwg/issue711) * [CWG#713](https://wg21.cmeerw.net/cwg/issue713) * [CWG#714](https://wg21.cmeerw.net/cwg/issue714) * [CWG#715](https://wg21.cmeerw.net/cwg/issue715) * [CWG#716](https://wg21.cmeerw.net/cwg/issue716) * [CWG#717](https://wg21.cmeerw.net/cwg/issue717) * [CWG#719](https://wg21.cmeerw.net/cwg/issue719) * [CWG#720](https://wg21.cmeerw.net/cwg/issue720) * [CWG#721](https://wg21.cmeerw.net/cwg/issue721) * [CWG#722](https://wg21.cmeerw.net/cwg/issue722) * [CWG#726](https://wg21.cmeerw.net/cwg/issue726) * [CWG#730](https://wg21.cmeerw.net/cwg/issue730) * [CWG#731](https://wg21.cmeerw.net/cwg/issue731) * [CWG#732](https://wg21.cmeerw.net/cwg/issue732) * [CWG#734](https://wg21.cmeerw.net/cwg/issue734) * [CWG#735](https://wg21.cmeerw.net/cwg/issue735) * [CWG#737](https://wg21.cmeerw.net/cwg/issue737) * [CWG#738](https://wg21.cmeerw.net/cwg/issue738) * [CWG#740](https://wg21.cmeerw.net/cwg/issue740) * [CWG#741](https://wg21.cmeerw.net/cwg/issue741) * [CWG#743](https://wg21.cmeerw.net/cwg/issue743) * [CWG#744](https://wg21.cmeerw.net/cwg/issue744) * [CWG#746](https://wg21.cmeerw.net/cwg/issue746) * [CWG#749](https://wg21.cmeerw.net/cwg/issue749) * [CWG#750](https://wg21.cmeerw.net/cwg/issue750) * [CWG#751](https://wg21.cmeerw.net/cwg/issue751) * [CWG#752](https://wg21.cmeerw.net/cwg/issue752) * [CWG#753](https://wg21.cmeerw.net/cwg/issue753) * [CWG#754](https://wg21.cmeerw.net/cwg/issue754) * [CWG#756](https://wg21.cmeerw.net/cwg/issue756) * 
[CWG#757](https://wg21.cmeerw.net/cwg/issue757) * [CWG#758](https://wg21.cmeerw.net/cwg/issue758) * [CWG#759](https://wg21.cmeerw.net/cwg/issue759) * [CWG#760](https://wg21.cmeerw.net/cwg/issue760) * [CWG#761](https://wg21.cmeerw.net/cwg/issue761) * [CWG#762](https://wg21.cmeerw.net/cwg/issue762) * [CWG#763](https://wg21.cmeerw.net/cwg/issue763) * [CWG#764](https://wg21.cmeerw.net/cwg/issue764) * [CWG#765](https://wg21.cmeerw.net/cwg/issue765) * [CWG#766](https://wg21.cmeerw.net/cwg/issue766) * [CWG#767](https://wg21.cmeerw.net/cwg/issue767) * [CWG#768](https://wg21.cmeerw.net/cwg/issue768) * [CWG#769](https://wg21.cmeerw.net/cwg/issue769) * [CWG#770](https://wg21.cmeerw.net/cwg/issue770) * [CWG#771](https://wg21.cmeerw.net/cwg/issue771) * [CWG#772](https://wg21.cmeerw.net/cwg/issue772) * [CWG#773](https://wg21.cmeerw.net/cwg/issue773) * [CWG#774](https://wg21.cmeerw.net/cwg/issue774) * [CWG#775](https://wg21.cmeerw.net/cwg/issue775) * [CWG#776](https://wg21.cmeerw.net/cwg/issue776) * [CWG#777](https://wg21.cmeerw.net/cwg/issue777) * [CWG#778](https://wg21.cmeerw.net/cwg/issue778) * [CWG#779](https://wg21.cmeerw.net/cwg/issue779) * [CWG#782](https://wg21.cmeerw.net/cwg/issue782) * [CWG#784](https://wg21.cmeerw.net/cwg/issue784) * [CWG#785](https://wg21.cmeerw.net/cwg/issue785) * [CWG#786](https://wg21.cmeerw.net/cwg/issue786) * [CWG#787](https://wg21.cmeerw.net/cwg/issue787) * [CWG#788](https://wg21.cmeerw.net/cwg/issue788) * [CWG#789](https://wg21.cmeerw.net/cwg/issue789) * [CWG#790](https://wg21.cmeerw.net/cwg/issue790) * [CWG#792](https://wg21.cmeerw.net/cwg/issue792) * [CWG#793](https://wg21.cmeerw.net/cwg/issue793) * [CWG#796](https://wg21.cmeerw.net/cwg/issue796) * [CWG#797](https://wg21.cmeerw.net/cwg/issue797) * [CWG#798](https://wg21.cmeerw.net/cwg/issue798) * [CWG#799](https://wg21.cmeerw.net/cwg/issue799) * [CWG#801](https://wg21.cmeerw.net/cwg/issue801) * [CWG#803](https://wg21.cmeerw.net/cwg/issue803) * [CWG#804](https://wg21.cmeerw.net/cwg/issue804) * [CWG#805](https://wg21.cmeerw.net/cwg/issue805) * [CWG#806](https://wg21.cmeerw.net/cwg/issue806) * [CWG#808](https://wg21.cmeerw.net/cwg/issue808) * [CWG#809](https://wg21.cmeerw.net/cwg/issue809) * [CWG#810](https://wg21.cmeerw.net/cwg/issue810) * [CWG#811](https://wg21.cmeerw.net/cwg/issue811) * [CWG#812](https://wg21.cmeerw.net/cwg/issue812) * [CWG#814](https://wg21.cmeerw.net/cwg/issue814) * [CWG#815](https://wg21.cmeerw.net/cwg/issue815) * [CWG#816](https://wg21.cmeerw.net/cwg/issue816) * [CWG#817](https://wg21.cmeerw.net/cwg/issue817) * [CWG#818](https://wg21.cmeerw.net/cwg/issue818) * [CWG#820](https://wg21.cmeerw.net/cwg/issue820) * [CWG#823](https://wg21.cmeerw.net/cwg/issue823) * [CWG#828](https://wg21.cmeerw.net/cwg/issue828) * [CWG#830](https://wg21.cmeerw.net/cwg/issue830) * [CWG#831](https://wg21.cmeerw.net/cwg/issue831) * [CWG#832](https://wg21.cmeerw.net/cwg/issue832) * [CWG#833](https://wg21.cmeerw.net/cwg/issue833) * [CWG#834](https://wg21.cmeerw.net/cwg/issue834) * [CWG#835](https://wg21.cmeerw.net/cwg/issue835) * [CWG#837](https://wg21.cmeerw.net/cwg/issue837) * [CWG#838](https://wg21.cmeerw.net/cwg/issue838) * [CWG#840](https://wg21.cmeerw.net/cwg/issue840) * [CWG#842](https://wg21.cmeerw.net/cwg/issue842) * [CWG#845](https://wg21.cmeerw.net/cwg/issue845) * [CWG#846](https://wg21.cmeerw.net/cwg/issue846) * [CWG#847](https://wg21.cmeerw.net/cwg/issue847) * [CWG#850](https://wg21.cmeerw.net/cwg/issue850) * [CWG#853](https://wg21.cmeerw.net/cwg/issue853) * [CWG#854](https://wg21.cmeerw.net/cwg/issue854) * 
[CWG#855](https://wg21.cmeerw.net/cwg/issue855) * [CWG#858](https://wg21.cmeerw.net/cwg/issue858) * [CWG#860](https://wg21.cmeerw.net/cwg/issue860) * [CWG#861](https://wg21.cmeerw.net/cwg/issue861) * [CWG#862](https://wg21.cmeerw.net/cwg/issue862) * [CWG#863](https://wg21.cmeerw.net/cwg/issue863) * [CWG#864](https://wg21.cmeerw.net/cwg/issue864) * [CWG#865](https://wg21.cmeerw.net/cwg/issue865) * [CWG#869](https://wg21.cmeerw.net/cwg/issue869) * [CWG#872](https://wg21.cmeerw.net/cwg/issue872) * [CWG#873](https://wg21.cmeerw.net/cwg/issue873) * [CWG#874](https://wg21.cmeerw.net/cwg/issue874) * [CWG#876](https://wg21.cmeerw.net/cwg/issue876) * [CWG#877](https://wg21.cmeerw.net/cwg/issue877) * [CWG#879](https://wg21.cmeerw.net/cwg/issue879) * [CWG#880](https://wg21.cmeerw.net/cwg/issue880) * [CWG#882](https://wg21.cmeerw.net/cwg/issue882) * [CWG#883](https://wg21.cmeerw.net/cwg/issue883) * [CWG#884](https://wg21.cmeerw.net/cwg/issue884) * [CWG#886](https://wg21.cmeerw.net/cwg/issue886) * [CWG#887](https://wg21.cmeerw.net/cwg/issue887) * [CWG#888](https://wg21.cmeerw.net/cwg/issue888) * [CWG#891](https://wg21.cmeerw.net/cwg/issue891) * [CWG#892](https://wg21.cmeerw.net/cwg/issue892) * [CWG#896](https://wg21.cmeerw.net/cwg/issue896) * [CWG#898](https://wg21.cmeerw.net/cwg/issue898) * [CWG#899](https://wg21.cmeerw.net/cwg/issue899) * [CWG#904](https://wg21.cmeerw.net/cwg/issue904) * [CWG#905](https://wg21.cmeerw.net/cwg/issue905) * [CWG#906](https://wg21.cmeerw.net/cwg/issue906) * [CWG#908](https://wg21.cmeerw.net/cwg/issue908) * [CWG#910](https://wg21.cmeerw.net/cwg/issue910) * [CWG#913](https://wg21.cmeerw.net/cwg/issue913) * [CWG#915](https://wg21.cmeerw.net/cwg/issue915) * [CWG#919](https://wg21.cmeerw.net/cwg/issue919) * [CWG#920](https://wg21.cmeerw.net/cwg/issue920) * [CWG#921](https://wg21.cmeerw.net/cwg/issue921) * [CWG#922](https://wg21.cmeerw.net/cwg/issue922) * [CWG#923](https://wg21.cmeerw.net/cwg/issue923) * [CWG#924](https://wg21.cmeerw.net/cwg/issue924) * [CWG#926](https://wg21.cmeerw.net/cwg/issue926) * [CWG#927](https://wg21.cmeerw.net/cwg/issue927) * [CWG#928](https://wg21.cmeerw.net/cwg/issue928) * [CWG#929](https://wg21.cmeerw.net/cwg/issue929) * [CWG#930](https://wg21.cmeerw.net/cwg/issue930) * [CWG#931](https://wg21.cmeerw.net/cwg/issue931) * [CWG#932](https://wg21.cmeerw.net/cwg/issue932) * [CWG#933](https://wg21.cmeerw.net/cwg/issue933) * [CWG#934](https://wg21.cmeerw.net/cwg/issue934) * [CWG#935](https://wg21.cmeerw.net/cwg/issue935) * [CWG#936](https://wg21.cmeerw.net/cwg/issue936) * [CWG#938](https://wg21.cmeerw.net/cwg/issue938) * [CWG#939](https://wg21.cmeerw.net/cwg/issue939) * [CWG#940](https://wg21.cmeerw.net/cwg/issue940) * [CWG#941](https://wg21.cmeerw.net/cwg/issue941) * [CWG#942](https://wg21.cmeerw.net/cwg/issue942) * [CWG#945](https://wg21.cmeerw.net/cwg/issue945) * [CWG#946](https://wg21.cmeerw.net/cwg/issue946) * [CWG#948](https://wg21.cmeerw.net/cwg/issue948) * [CWG#950](https://wg21.cmeerw.net/cwg/issue950) * [CWG#951](https://wg21.cmeerw.net/cwg/issue951) * [CWG#953](https://wg21.cmeerw.net/cwg/issue953) * [CWG#955](https://wg21.cmeerw.net/cwg/issue955) * [CWG#956](https://wg21.cmeerw.net/cwg/issue956) * [CWG#957](https://wg21.cmeerw.net/cwg/issue957) * [CWG#959](https://wg21.cmeerw.net/cwg/issue959) * [CWG#960](https://wg21.cmeerw.net/cwg/issue960) * [CWG#961](https://wg21.cmeerw.net/cwg/issue961) * [CWG#962](https://wg21.cmeerw.net/cwg/issue962) * [CWG#963](https://wg21.cmeerw.net/cwg/issue963) * [CWG#964](https://wg21.cmeerw.net/cwg/issue964) * 
[CWG#965](https://wg21.cmeerw.net/cwg/issue965) * [CWG#966](https://wg21.cmeerw.net/cwg/issue966) * [CWG#968](https://wg21.cmeerw.net/cwg/issue968) * [CWG#969](https://wg21.cmeerw.net/cwg/issue969) * [CWG#970](https://wg21.cmeerw.net/cwg/issue970) * [CWG#971](https://wg21.cmeerw.net/cwg/issue971) * [CWG#972](https://wg21.cmeerw.net/cwg/issue972) * [CWG#973](https://wg21.cmeerw.net/cwg/issue973) * [CWG#976](https://wg21.cmeerw.net/cwg/issue976) * [CWG#978](https://wg21.cmeerw.net/cwg/issue978) * [CWG#979](https://wg21.cmeerw.net/cwg/issue979) * [CWG#980](https://wg21.cmeerw.net/cwg/issue980) * [CWG#981](https://wg21.cmeerw.net/cwg/issue981) * [CWG#983](https://wg21.cmeerw.net/cwg/issue983) * [CWG#984](https://wg21.cmeerw.net/cwg/issue984) * [CWG#985](https://wg21.cmeerw.net/cwg/issue985) * [CWG#986](https://wg21.cmeerw.net/cwg/issue986) * [CWG#988](https://wg21.cmeerw.net/cwg/issue988) * [CWG#989](https://wg21.cmeerw.net/cwg/issue989) * [CWG#990](https://wg21.cmeerw.net/cwg/issue990) * [CWG#991](https://wg21.cmeerw.net/cwg/issue991) * [CWG#993](https://wg21.cmeerw.net/cwg/issue993) * [CWG#994](https://wg21.cmeerw.net/cwg/issue994) * [CWG#995](https://wg21.cmeerw.net/cwg/issue995) * [CWG#996](https://wg21.cmeerw.net/cwg/issue996) * [CWG#997](https://wg21.cmeerw.net/cwg/issue997) * [CWG#999](https://wg21.cmeerw.net/cwg/issue999) * [CWG#1000](https://wg21.cmeerw.net/cwg/issue1000) * [CWG#1004](https://wg21.cmeerw.net/cwg/issue1004) * [CWG#1006](https://wg21.cmeerw.net/cwg/issue1006) * [CWG#1009](https://wg21.cmeerw.net/cwg/issue1009) * [CWG#1010](https://wg21.cmeerw.net/cwg/issue1010) * [CWG#1011](https://wg21.cmeerw.net/cwg/issue1011) * [CWG#1012](https://wg21.cmeerw.net/cwg/issue1012) * [CWG#1015](https://wg21.cmeerw.net/cwg/issue1015) * [CWG#1016](https://wg21.cmeerw.net/cwg/issue1016) * [CWG#1017](https://wg21.cmeerw.net/cwg/issue1017) * [CWG#1018](https://wg21.cmeerw.net/cwg/issue1018) * [CWG#1020](https://wg21.cmeerw.net/cwg/issue1020) * [CWG#1022](https://wg21.cmeerw.net/cwg/issue1022) * [CWG#1025](https://wg21.cmeerw.net/cwg/issue1025) * [CWG#1029](https://wg21.cmeerw.net/cwg/issue1029) * [CWG#1030](https://wg21.cmeerw.net/cwg/issue1030) * [CWG#1031](https://wg21.cmeerw.net/cwg/issue1031) * [CWG#1032](https://wg21.cmeerw.net/cwg/issue1032) * [CWG#1033](https://wg21.cmeerw.net/cwg/issue1033) * [CWG#1034](https://wg21.cmeerw.net/cwg/issue1034) * [CWG#1035](https://wg21.cmeerw.net/cwg/issue1035) * [CWG#1036](https://wg21.cmeerw.net/cwg/issue1036) * [CWG#1037](https://wg21.cmeerw.net/cwg/issue1037) * [CWG#1042](https://wg21.cmeerw.net/cwg/issue1042) * [CWG#1043](https://wg21.cmeerw.net/cwg/issue1043) * [CWG#1044](https://wg21.cmeerw.net/cwg/issue1044) * [CWG#1047](https://wg21.cmeerw.net/cwg/issue1047) * [CWG#1051](https://wg21.cmeerw.net/cwg/issue1051) * [CWG#1054](https://wg21.cmeerw.net/cwg/issue1054) * [CWG#1055](https://wg21.cmeerw.net/cwg/issue1055) * [CWG#1056](https://wg21.cmeerw.net/cwg/issue1056) * [CWG#1057](https://wg21.cmeerw.net/cwg/issue1057) * [CWG#1060](https://wg21.cmeerw.net/cwg/issue1060) * [CWG#1061](https://wg21.cmeerw.net/cwg/issue1061) * [CWG#1062](https://wg21.cmeerw.net/cwg/issue1062) * [CWG#1063](https://wg21.cmeerw.net/cwg/issue1063) * [CWG#1064](https://wg21.cmeerw.net/cwg/issue1064) * [CWG#1065](https://wg21.cmeerw.net/cwg/issue1065) * [CWG#1066](https://wg21.cmeerw.net/cwg/issue1066) * [CWG#1068](https://wg21.cmeerw.net/cwg/issue1068) * [CWG#1069](https://wg21.cmeerw.net/cwg/issue1069) * [CWG#1070](https://wg21.cmeerw.net/cwg/issue1070) * 
[CWG#1071](https://wg21.cmeerw.net/cwg/issue1071) * [CWG#1072](https://wg21.cmeerw.net/cwg/issue1072) * [CWG#1073](https://wg21.cmeerw.net/cwg/issue1073) * [CWG#1074](https://wg21.cmeerw.net/cwg/issue1074) * [CWG#1075](https://wg21.cmeerw.net/cwg/issue1075) * [CWG#1079](https://wg21.cmeerw.net/cwg/issue1079) * [CWG#1080](https://wg21.cmeerw.net/cwg/issue1080) * [CWG#1081](https://wg21.cmeerw.net/cwg/issue1081) * [CWG#1082](https://wg21.cmeerw.net/cwg/issue1082) * [CWG#1083](https://wg21.cmeerw.net/cwg/issue1083) * [CWG#1086](https://wg21.cmeerw.net/cwg/issue1086) * [CWG#1087](https://wg21.cmeerw.net/cwg/issue1087) * [CWG#1088](https://wg21.cmeerw.net/cwg/issue1088) * [CWG#1090](https://wg21.cmeerw.net/cwg/issue1090) * [CWG#1091](https://wg21.cmeerw.net/cwg/issue1091) * [CWG#1094](https://wg21.cmeerw.net/cwg/issue1094) * [CWG#1095](https://wg21.cmeerw.net/cwg/issue1095) * [CWG#1096](https://wg21.cmeerw.net/cwg/issue1096) * [CWG#1098](https://wg21.cmeerw.net/cwg/issue1098) * [CWG#1099](https://wg21.cmeerw.net/cwg/issue1099) * [CWG#1100](https://wg21.cmeerw.net/cwg/issue1100) * [CWG#1101](https://wg21.cmeerw.net/cwg/issue1101) * [CWG#1102](https://wg21.cmeerw.net/cwg/issue1102) * [CWG#1103](https://wg21.cmeerw.net/cwg/issue1103) * [CWG#1104](https://wg21.cmeerw.net/cwg/issue1104) * [CWG#1105](https://wg21.cmeerw.net/cwg/issue1105) * [CWG#1106](https://wg21.cmeerw.net/cwg/issue1106) * [CWG#1107](https://wg21.cmeerw.net/cwg/issue1107) * [CWG#1109](https://wg21.cmeerw.net/cwg/issue1109) * [CWG#1111](https://wg21.cmeerw.net/cwg/issue1111) * [CWG#1112](https://wg21.cmeerw.net/cwg/issue1112) * [CWG#1113](https://wg21.cmeerw.net/cwg/issue1113) * [CWG#1114](https://wg21.cmeerw.net/cwg/issue1114) * [CWG#1115](https://wg21.cmeerw.net/cwg/issue1115) * [CWG#1117](https://wg21.cmeerw.net/cwg/issue1117) * [CWG#1119](https://wg21.cmeerw.net/cwg/issue1119) * [CWG#1120](https://wg21.cmeerw.net/cwg/issue1120) * [CWG#1121](https://wg21.cmeerw.net/cwg/issue1121) * [CWG#1122](https://wg21.cmeerw.net/cwg/issue1122) * [CWG#1123](https://wg21.cmeerw.net/cwg/issue1123) * [CWG#1125](https://wg21.cmeerw.net/cwg/issue1125) * [CWG#1126](https://wg21.cmeerw.net/cwg/issue1126) * [CWG#1127](https://wg21.cmeerw.net/cwg/issue1127) * [CWG#1128](https://wg21.cmeerw.net/cwg/issue1128) * [CWG#1129](https://wg21.cmeerw.net/cwg/issue1129) * [CWG#1130](https://wg21.cmeerw.net/cwg/issue1130) * [CWG#1131](https://wg21.cmeerw.net/cwg/issue1131) * [CWG#1133](https://wg21.cmeerw.net/cwg/issue1133) * [CWG#1134](https://wg21.cmeerw.net/cwg/issue1134) * [CWG#1135](https://wg21.cmeerw.net/cwg/issue1135) * [CWG#1136](https://wg21.cmeerw.net/cwg/issue1136) * [CWG#1137](https://wg21.cmeerw.net/cwg/issue1137) * [CWG#1138](https://wg21.cmeerw.net/cwg/issue1138) * [CWG#1139](https://wg21.cmeerw.net/cwg/issue1139) * [CWG#1140](https://wg21.cmeerw.net/cwg/issue1140) * [CWG#1142](https://wg21.cmeerw.net/cwg/issue1142) * [CWG#1144](https://wg21.cmeerw.net/cwg/issue1144) * [CWG#1145](https://wg21.cmeerw.net/cwg/issue1145) * [CWG#1146](https://wg21.cmeerw.net/cwg/issue1146) * [CWG#1147](https://wg21.cmeerw.net/cwg/issue1147) * [CWG#1148](https://wg21.cmeerw.net/cwg/issue1148) * [CWG#1149](https://wg21.cmeerw.net/cwg/issue1149) * [CWG#1151](https://wg21.cmeerw.net/cwg/issue1151) * [CWG#1152](https://wg21.cmeerw.net/cwg/issue1152) * [CWG#1153](https://wg21.cmeerw.net/cwg/issue1153) * [CWG#1154](https://wg21.cmeerw.net/cwg/issue1154) * [CWG#1155](https://wg21.cmeerw.net/cwg/issue1155) * [CWG#1156](https://wg21.cmeerw.net/cwg/issue1156) * 
[CWG#1158](https://wg21.cmeerw.net/cwg/issue1158) * [CWG#1159](https://wg21.cmeerw.net/cwg/issue1159) * [CWG#1160](https://wg21.cmeerw.net/cwg/issue1160) * [CWG#1161](https://wg21.cmeerw.net/cwg/issue1161) * [CWG#1164](https://wg21.cmeerw.net/cwg/issue1164) * [CWG#1165](https://wg21.cmeerw.net/cwg/issue1165) * [CWG#1166](https://wg21.cmeerw.net/cwg/issue1166) * [CWG#1167](https://wg21.cmeerw.net/cwg/issue1167) * [CWG#1168](https://wg21.cmeerw.net/cwg/issue1168) * [CWG#1169](https://wg21.cmeerw.net/cwg/issue1169) * [CWG#1170](https://wg21.cmeerw.net/cwg/issue1170) * [CWG#1171](https://wg21.cmeerw.net/cwg/issue1171) * [CWG#1173](https://wg21.cmeerw.net/cwg/issue1173) * [CWG#1174](https://wg21.cmeerw.net/cwg/issue1174) * [CWG#1175](https://wg21.cmeerw.net/cwg/issue1175) * [CWG#1176](https://wg21.cmeerw.net/cwg/issue1176) * [CWG#1177](https://wg21.cmeerw.net/cwg/issue1177) * [CWG#1178](https://wg21.cmeerw.net/cwg/issue1178) * [CWG#1180](https://wg21.cmeerw.net/cwg/issue1180) * [CWG#1181](https://wg21.cmeerw.net/cwg/issue1181) * [CWG#1182](https://wg21.cmeerw.net/cwg/issue1182) * [CWG#1183](https://wg21.cmeerw.net/cwg/issue1183) * [CWG#1184](https://wg21.cmeerw.net/cwg/issue1184) * [CWG#1185](https://wg21.cmeerw.net/cwg/issue1185) * [CWG#1186](https://wg21.cmeerw.net/cwg/issue1186) * [CWG#1187](https://wg21.cmeerw.net/cwg/issue1187) * [CWG#1188](https://wg21.cmeerw.net/cwg/issue1188) * [CWG#1189](https://wg21.cmeerw.net/cwg/issue1189) * [CWG#1190](https://wg21.cmeerw.net/cwg/issue1190) * [CWG#1191](https://wg21.cmeerw.net/cwg/issue1191) * [CWG#1192](https://wg21.cmeerw.net/cwg/issue1192) * [CWG#1193](https://wg21.cmeerw.net/cwg/issue1193) * [CWG#1194](https://wg21.cmeerw.net/cwg/issue1194) * [CWG#1195](https://wg21.cmeerw.net/cwg/issue1195) * [CWG#1196](https://wg21.cmeerw.net/cwg/issue1196) * [CWG#1197](https://wg21.cmeerw.net/cwg/issue1197) * [CWG#1198](https://wg21.cmeerw.net/cwg/issue1198) * [CWG#1199](https://wg21.cmeerw.net/cwg/issue1199) * [CWG#1201](https://wg21.cmeerw.net/cwg/issue1201) * [CWG#1202](https://wg21.cmeerw.net/cwg/issue1202) * [CWG#1204](https://wg21.cmeerw.net/cwg/issue1204) * [CWG#1206](https://wg21.cmeerw.net/cwg/issue1206) * [CWG#1207](https://wg21.cmeerw.net/cwg/issue1207) * [CWG#1208](https://wg21.cmeerw.net/cwg/issue1208) * [CWG#1210](https://wg21.cmeerw.net/cwg/issue1210) * [CWG#1212](https://wg21.cmeerw.net/cwg/issue1212) * [CWG#1214](https://wg21.cmeerw.net/cwg/issue1214) * [CWG#1215](https://wg21.cmeerw.net/cwg/issue1215) * [CWG#1216](https://wg21.cmeerw.net/cwg/issue1216) * [CWG#1218](https://wg21.cmeerw.net/cwg/issue1218) * [CWG#1219](https://wg21.cmeerw.net/cwg/issue1219) * [CWG#1220](https://wg21.cmeerw.net/cwg/issue1220) * [CWG#1224](https://wg21.cmeerw.net/cwg/issue1224) * [CWG#1225](https://wg21.cmeerw.net/cwg/issue1225) * [CWG#1229](https://wg21.cmeerw.net/cwg/issue1229) * [CWG#1231](https://wg21.cmeerw.net/cwg/issue1231) * [CWG#1232](https://wg21.cmeerw.net/cwg/issue1232) * [CWG#1233](https://wg21.cmeerw.net/cwg/issue1233) * [CWG#1234](https://wg21.cmeerw.net/cwg/issue1234) * [CWG#1235](https://wg21.cmeerw.net/cwg/issue1235) * [CWG#1236](https://wg21.cmeerw.net/cwg/issue1236) * [CWG#1237](https://wg21.cmeerw.net/cwg/issue1237) * [CWG#1238](https://wg21.cmeerw.net/cwg/issue1238) * [CWG#1239](https://wg21.cmeerw.net/cwg/issue1239) * [CWG#1240](https://wg21.cmeerw.net/cwg/issue1240) * [CWG#1241](https://wg21.cmeerw.net/cwg/issue1241) * [CWG#1242](https://wg21.cmeerw.net/cwg/issue1242) * [CWG#1243](https://wg21.cmeerw.net/cwg/issue1243) * 
[CWG#1244](https://wg21.cmeerw.net/cwg/issue1244) * [CWG#1245](https://wg21.cmeerw.net/cwg/issue1245) * [CWG#1246](https://wg21.cmeerw.net/cwg/issue1246) * [LWG#23](http://wg21.link/lwg23) * [LWG#44](http://wg21.link/lwg44) * [LWG#49](http://wg21.link/lwg49) * [LWG#76](http://wg21.link/lwg76) * [LWG#91](http://wg21.link/lwg91) * [LWG#92](http://wg21.link/lwg92) * [LWG#98](http://wg21.link/lwg98) * [LWG#103](http://wg21.link/lwg103) * [LWG#109](http://wg21.link/lwg109) * [LWG#117](http://wg21.link/lwg117) * [LWG#118](http://wg21.link/lwg118) * [LWG#120](http://wg21.link/lwg120) * [LWG#123](http://wg21.link/lwg123) * [LWG#130](http://wg21.link/lwg130) * [LWG#136](http://wg21.link/lwg136) * [LWG#149](http://wg21.link/lwg149) * [LWG#153](http://wg21.link/lwg153) * [LWG#165](http://wg21.link/lwg165) * [LWG#167](http://wg21.link/lwg167) * [LWG#171](http://wg21.link/lwg171) * [LWG#179](http://wg21.link/lwg179) * [LWG#180](http://wg21.link/lwg180) * [LWG#182](http://wg21.link/lwg182) * [LWG#183](http://wg21.link/lwg183) * [LWG#184](http://wg21.link/lwg184) * [LWG#185](http://wg21.link/lwg185) * [LWG#186](http://wg21.link/lwg186) * [LWG#187](http://wg21.link/lwg187) * [LWG#198](http://wg21.link/lwg198) * [LWG#200](http://wg21.link/lwg200) * [LWG#201](http://wg21.link/lwg201) * [LWG#202](http://wg21.link/lwg202) * [LWG#206](http://wg21.link/lwg206) * [LWG#214](http://wg21.link/lwg214) * [LWG#221](http://wg21.link/lwg221) * [LWG#225](http://wg21.link/lwg225) * [LWG#226](http://wg21.link/lwg226) * [LWG#228](http://wg21.link/lwg228) * [LWG#229](http://wg21.link/lwg229) * [LWG#230](http://wg21.link/lwg230) * [LWG#231](http://wg21.link/lwg231) * [LWG#232](http://wg21.link/lwg232) * [LWG#233](http://wg21.link/lwg233) * [LWG#234](http://wg21.link/lwg234) * [LWG#235](http://wg21.link/lwg235) * [LWG#237](http://wg21.link/lwg237) * [LWG#238](http://wg21.link/lwg238) * [LWG#239](http://wg21.link/lwg239) * [LWG#240](http://wg21.link/lwg240) * [LWG#241](http://wg21.link/lwg241) * [LWG#242](http://wg21.link/lwg242) * [LWG#243](http://wg21.link/lwg243) * [LWG#247](http://wg21.link/lwg247) * [LWG#248](http://wg21.link/lwg248) * [LWG#250](http://wg21.link/lwg250) * [LWG#251](http://wg21.link/lwg251) * [LWG#252](http://wg21.link/lwg252) * [LWG#253](http://wg21.link/lwg253) * [LWG#254](http://wg21.link/lwg254) * [LWG#256](http://wg21.link/lwg256) * [LWG#258](http://wg21.link/lwg258) * [LWG#259](http://wg21.link/lwg259) * [LWG#260](http://wg21.link/lwg260) * [LWG#261](http://wg21.link/lwg261) * [LWG#262](http://wg21.link/lwg262) * [LWG#263](http://wg21.link/lwg263) * [LWG#264](http://wg21.link/lwg264) * [LWG#265](http://wg21.link/lwg265) * [LWG#266](http://wg21.link/lwg266) * [LWG#268](http://wg21.link/lwg268) * [LWG#270](http://wg21.link/lwg270) * [LWG#271](http://wg21.link/lwg271) * [LWG#272](http://wg21.link/lwg272) * [LWG#273](http://wg21.link/lwg273) * [LWG#274](http://wg21.link/lwg274) * [LWG#275](http://wg21.link/lwg275) * [LWG#276](http://wg21.link/lwg276) * [LWG#278](http://wg21.link/lwg278) * [LWG#280](http://wg21.link/lwg280) * [LWG#281](http://wg21.link/lwg281) * [LWG#282](http://wg21.link/lwg282) * [LWG#283](http://wg21.link/lwg283) * [LWG#284](http://wg21.link/lwg284) * [LWG#285](http://wg21.link/lwg285) * [LWG#286](http://wg21.link/lwg286) * [LWG#288](http://wg21.link/lwg288) * [LWG#291](http://wg21.link/lwg291) * [LWG#292](http://wg21.link/lwg292) * [LWG#294](http://wg21.link/lwg294) * [LWG#295](http://wg21.link/lwg295) * [LWG#296](http://wg21.link/lwg296) * [LWG#297](http://wg21.link/lwg297) * 
[LWG#298](http://wg21.link/lwg298) * [LWG#300](http://wg21.link/lwg300) * [LWG#301](http://wg21.link/lwg301) * [LWG#303](http://wg21.link/lwg303) * [LWG#305](http://wg21.link/lwg305) * [LWG#306](http://wg21.link/lwg306) * [LWG#307](http://wg21.link/lwg307) * [LWG#308](http://wg21.link/lwg308) * [LWG#310](http://wg21.link/lwg310) * [LWG#311](http://wg21.link/lwg311) * [LWG#312](http://wg21.link/lwg312) * [LWG#315](http://wg21.link/lwg315) * [LWG#316](http://wg21.link/lwg316) * [LWG#317](http://wg21.link/lwg317) * [LWG#318](http://wg21.link/lwg318) * [LWG#319](http://wg21.link/lwg319) * [LWG#320](http://wg21.link/lwg320) * [LWG#321](http://wg21.link/lwg321) * [LWG#322](http://wg21.link/lwg322) * [LWG#324](http://wg21.link/lwg324) * [LWG#325](http://wg21.link/lwg325) * [LWG#327](http://wg21.link/lwg327) * [LWG#328](http://wg21.link/lwg328) * [LWG#329](http://wg21.link/lwg329) * [LWG#331](http://wg21.link/lwg331) * [LWG#333](http://wg21.link/lwg333) * [LWG#334](http://wg21.link/lwg334) * [LWG#335](http://wg21.link/lwg335) * [LWG#336](http://wg21.link/lwg336) * [LWG#337](http://wg21.link/lwg337) * [LWG#338](http://wg21.link/lwg338) * [LWG#339](http://wg21.link/lwg339) * [LWG#340](http://wg21.link/lwg340) * [LWG#341](http://wg21.link/lwg341) * [LWG#343](http://wg21.link/lwg343) * [LWG#345](http://wg21.link/lwg345) * [LWG#346](http://wg21.link/lwg346) * [LWG#347](http://wg21.link/lwg347) * [LWG#349](http://wg21.link/lwg349) * [LWG#352](http://wg21.link/lwg352) * [LWG#353](http://wg21.link/lwg353) * [LWG#354](http://wg21.link/lwg354) * [LWG#355](http://wg21.link/lwg355) * [LWG#358](http://wg21.link/lwg358) * [LWG#359](http://wg21.link/lwg359) * [LWG#360](http://wg21.link/lwg360) * [LWG#362](http://wg21.link/lwg362) * [LWG#363](http://wg21.link/lwg363) * [LWG#364](http://wg21.link/lwg364) * [LWG#365](http://wg21.link/lwg365) * [LWG#369](http://wg21.link/lwg369) * [LWG#370](http://wg21.link/lwg370) * [LWG#371](http://wg21.link/lwg371) * [LWG#373](http://wg21.link/lwg373) * [LWG#375](http://wg21.link/lwg375) * [LWG#376](http://wg21.link/lwg376) * [LWG#379](http://wg21.link/lwg379) * [LWG#380](http://wg21.link/lwg380) * [LWG#381](http://wg21.link/lwg381) * [LWG#383](http://wg21.link/lwg383) * [LWG#384](http://wg21.link/lwg384) * [LWG#386](http://wg21.link/lwg386) * [LWG#387](http://wg21.link/lwg387) * [LWG#389](http://wg21.link/lwg389) * [LWG#391](http://wg21.link/lwg391) * [LWG#395](http://wg21.link/lwg395) * [LWG#396](http://wg21.link/lwg396) * [LWG#400](http://wg21.link/lwg400) * [LWG#401](http://wg21.link/lwg401) * [LWG#402](http://wg21.link/lwg402) * [LWG#403](http://wg21.link/lwg403) * [LWG#404](http://wg21.link/lwg404) * [LWG#405](http://wg21.link/lwg405) * [LWG#406](http://wg21.link/lwg406) * [LWG#407](http://wg21.link/lwg407) * [LWG#409](http://wg21.link/lwg409) * [LWG#410](http://wg21.link/lwg410) * [LWG#411](http://wg21.link/lwg411) * [LWG#412](http://wg21.link/lwg412) * [LWG#413](http://wg21.link/lwg413) * [LWG#414](http://wg21.link/lwg414) * [LWG#415](http://wg21.link/lwg415) * [LWG#416](http://wg21.link/lwg416) * [LWG#419](http://wg21.link/lwg419) * [LWG#420](http://wg21.link/lwg420) * [LWG#422](http://wg21.link/lwg422) * [LWG#425](http://wg21.link/lwg425) * [LWG#426](http://wg21.link/lwg426) * [LWG#427](http://wg21.link/lwg427) * [LWG#428](http://wg21.link/lwg428) * [LWG#430](http://wg21.link/lwg430) * [LWG#431](http://wg21.link/lwg431) * [LWG#432](http://wg21.link/lwg432) * [LWG#434](http://wg21.link/lwg434) * [LWG#435](http://wg21.link/lwg435) * [LWG#436](http://wg21.link/lwg436) * 
[LWG#438](http://wg21.link/lwg438) * [LWG#441](http://wg21.link/lwg441) * [LWG#442](http://wg21.link/lwg442) * [LWG#443](http://wg21.link/lwg443) * [LWG#444](http://wg21.link/lwg444) * [LWG#445](http://wg21.link/lwg445) * [LWG#448](http://wg21.link/lwg448) * [LWG#449](http://wg21.link/lwg449) * [LWG#453](http://wg21.link/lwg453) * [LWG#455](http://wg21.link/lwg455) * [LWG#456](http://wg21.link/lwg456) * [LWG#457](http://wg21.link/lwg457) * [LWG#460](http://wg21.link/lwg460) * [LWG#461](http://wg21.link/lwg461) * [LWG#464](http://wg21.link/lwg464) * [LWG#465](http://wg21.link/lwg465) * [LWG#467](http://wg21.link/lwg467) * [LWG#468](http://wg21.link/lwg468) * [LWG#469](http://wg21.link/lwg469) * [LWG#471](http://wg21.link/lwg471) * [LWG#473](http://wg21.link/lwg473) * [LWG#474](http://wg21.link/lwg474) * [LWG#475](http://wg21.link/lwg475) * [LWG#478](http://wg21.link/lwg478) * [LWG#482](http://wg21.link/lwg482) * [LWG#485](http://wg21.link/lwg485) * [LWG#488](http://wg21.link/lwg488) * [LWG#495](http://wg21.link/lwg495) * [LWG#496](http://wg21.link/lwg496) * [LWG#497](http://wg21.link/lwg497) * [LWG#498](http://wg21.link/lwg498) * [LWG#505](http://wg21.link/lwg505) * [LWG#507](http://wg21.link/lwg507) * [LWG#508](http://wg21.link/lwg508) * [LWG#518](http://wg21.link/lwg518) * [LWG#519](http://wg21.link/lwg519) * [LWG#520](http://wg21.link/lwg520) * [LWG#521](http://wg21.link/lwg521) * [LWG#522](http://wg21.link/lwg522) * [LWG#524](http://wg21.link/lwg524) * [LWG#525](http://wg21.link/lwg525) * [LWG#527](http://wg21.link/lwg527) * [LWG#530](http://wg21.link/lwg530) * [LWG#531](http://wg21.link/lwg531) * [LWG#533](http://wg21.link/lwg533) * [LWG#534](http://wg21.link/lwg534) * [LWG#535](http://wg21.link/lwg535) * [LWG#537](http://wg21.link/lwg537) * [LWG#538](http://wg21.link/lwg538) * [LWG#539](http://wg21.link/lwg539) * [LWG#540](http://wg21.link/lwg540) * [LWG#541](http://wg21.link/lwg541) * [LWG#542](http://wg21.link/lwg542) * [LWG#543](http://wg21.link/lwg543) * [LWG#545](http://wg21.link/lwg545) * [LWG#550](http://wg21.link/lwg550) * [LWG#551](http://wg21.link/lwg551) * [LWG#552](http://wg21.link/lwg552) * [LWG#556](http://wg21.link/lwg556) * [LWG#559](http://wg21.link/lwg559) * [LWG#561](http://wg21.link/lwg561) * [LWG#562](http://wg21.link/lwg562) * [LWG#563](http://wg21.link/lwg563) * [LWG#564](http://wg21.link/lwg564) * [LWG#565](http://wg21.link/lwg565) * [LWG#566](http://wg21.link/lwg566) * [LWG#567](http://wg21.link/lwg567) * [LWG#574](http://wg21.link/lwg574) * [LWG#575](http://wg21.link/lwg575) * [LWG#576](http://wg21.link/lwg576) * [LWG#577](http://wg21.link/lwg577) * [LWG#578](http://wg21.link/lwg578) * [LWG#581](http://wg21.link/lwg581) * [LWG#586](http://wg21.link/lwg586) * [LWG#589](http://wg21.link/lwg589) * [LWG#593](http://wg21.link/lwg593) * [LWG#594](http://wg21.link/lwg594) * [LWG#595](http://wg21.link/lwg595) * [LWG#596](http://wg21.link/lwg596) * [LWG#607](http://wg21.link/lwg607) * [LWG#608](http://wg21.link/lwg608) * [LWG#609](http://wg21.link/lwg609) * [LWG#610](http://wg21.link/lwg610) * [LWG#611](http://wg21.link/lwg611) * [LWG#612](http://wg21.link/lwg612) * [LWG#613](http://wg21.link/lwg613) * [LWG#616](http://wg21.link/lwg616) * [LWG#618](http://wg21.link/lwg618) * [LWG#619](http://wg21.link/lwg619) * [LWG#620](http://wg21.link/lwg620) * [LWG#621](http://wg21.link/lwg621) * [LWG#622](http://wg21.link/lwg622) * [LWG#623](http://wg21.link/lwg623) * [LWG#624](http://wg21.link/lwg624) * [LWG#625](http://wg21.link/lwg625) * [LWG#628](http://wg21.link/lwg628) * 
[LWG#629](http://wg21.link/lwg629) * [LWG#630](http://wg21.link/lwg630) * [LWG#634](http://wg21.link/lwg634) * [LWG#635](http://wg21.link/lwg635) * [LWG#638](http://wg21.link/lwg638) * [LWG#640](http://wg21.link/lwg640) * [LWG#643](http://wg21.link/lwg643) * [LWG#646](http://wg21.link/lwg646) * [LWG#650](http://wg21.link/lwg650) * [LWG#651](http://wg21.link/lwg651) * [LWG#652](http://wg21.link/lwg652) * [LWG#654](http://wg21.link/lwg654) * [LWG#655](http://wg21.link/lwg655) * [LWG#658](http://wg21.link/lwg658) * [LWG#659](http://wg21.link/lwg659) * [LWG#660](http://wg21.link/lwg660) * [LWG#661](http://wg21.link/lwg661) * [LWG#664](http://wg21.link/lwg664) * [LWG#665](http://wg21.link/lwg665) * [LWG#666](http://wg21.link/lwg666) * [LWG#671](http://wg21.link/lwg671) * [LWG#672](http://wg21.link/lwg672) * [LWG#673](http://wg21.link/lwg673) * [LWG#674](http://wg21.link/lwg674) * [LWG#675](http://wg21.link/lwg675) * [LWG#676](http://wg21.link/lwg676) * [LWG#677](http://wg21.link/lwg677) * [LWG#678](http://wg21.link/lwg678) * [LWG#679](http://wg21.link/lwg679) * [LWG#680](http://wg21.link/lwg680) * [LWG#681](http://wg21.link/lwg681) * [LWG#682](http://wg21.link/lwg682) * [LWG#685](http://wg21.link/lwg685) * [LWG#687](http://wg21.link/lwg687) * [LWG#688](http://wg21.link/lwg688) * [LWG#689](http://wg21.link/lwg689) * [LWG#691](http://wg21.link/lwg691) * [LWG#692](http://wg21.link/lwg692) * [LWG#693](http://wg21.link/lwg693) * [LWG#694](http://wg21.link/lwg694) * [LWG#695](http://wg21.link/lwg695) * [LWG#696](http://wg21.link/lwg696) * [LWG#697](http://wg21.link/lwg697) * [LWG#698](http://wg21.link/lwg698) * [LWG#699](http://wg21.link/lwg699) * [LWG#700](http://wg21.link/lwg700) * [LWG#703](http://wg21.link/lwg703) * [LWG#704](http://wg21.link/lwg704) * [LWG#705](http://wg21.link/lwg705) * [LWG#706](http://wg21.link/lwg706) * [LWG#709](http://wg21.link/lwg709) * [LWG#710](http://wg21.link/lwg710) * [LWG#711](http://wg21.link/lwg711) * [LWG#712](http://wg21.link/lwg712) * [LWG#713](http://wg21.link/lwg713) * [LWG#714](http://wg21.link/lwg714) * [LWG#715](http://wg21.link/lwg715) * [LWG#716](http://wg21.link/lwg716) * [LWG#719](http://wg21.link/lwg719) * [LWG#720](http://wg21.link/lwg720) * [LWG#722](http://wg21.link/lwg722) * [LWG#723](http://wg21.link/lwg723) * [LWG#724](http://wg21.link/lwg724) * [LWG#727](http://wg21.link/lwg727) * [LWG#728](http://wg21.link/lwg728) * [LWG#732](http://wg21.link/lwg732) * [LWG#734](http://wg21.link/lwg734) * [LWG#740](http://wg21.link/lwg740) * [LWG#742](http://wg21.link/lwg742) * [LWG#743](http://wg21.link/lwg743) * [LWG#744](http://wg21.link/lwg744) * [LWG#746](http://wg21.link/lwg746) * [LWG#749](http://wg21.link/lwg749) * [LWG#752](http://wg21.link/lwg752) * [LWG#753](http://wg21.link/lwg753) * [LWG#755](http://wg21.link/lwg755) * [LWG#756](http://wg21.link/lwg756) * [LWG#758](http://wg21.link/lwg758) * [LWG#759](http://wg21.link/lwg759) * [LWG#761](http://wg21.link/lwg761) * [LWG#762](http://wg21.link/lwg762) * [LWG#765](http://wg21.link/lwg765) * [LWG#766](http://wg21.link/lwg766) * [LWG#767](http://wg21.link/lwg767) * [LWG#768](http://wg21.link/lwg768) * [LWG#769](http://wg21.link/lwg769) * [LWG#770](http://wg21.link/lwg770) * [LWG#771](http://wg21.link/lwg771) * [LWG#772](http://wg21.link/lwg772) * [LWG#774](http://wg21.link/lwg774) * [LWG#775](http://wg21.link/lwg775) * [LWG#776](http://wg21.link/lwg776) * [LWG#777](http://wg21.link/lwg777) * [LWG#778](http://wg21.link/lwg778) * [LWG#779](http://wg21.link/lwg779) * [LWG#780](http://wg21.link/lwg780) * 
[LWG#781](http://wg21.link/lwg781) * [LWG#782](http://wg21.link/lwg782) * [LWG#783](http://wg21.link/lwg783) * [LWG#786](http://wg21.link/lwg786) * [LWG#787](http://wg21.link/lwg787) * [LWG#788](http://wg21.link/lwg788) * [LWG#789](http://wg21.link/lwg789) * [LWG#792](http://wg21.link/lwg792) * [LWG#793](http://wg21.link/lwg793) * [LWG#794](http://wg21.link/lwg794) * [LWG#798](http://wg21.link/lwg798) * [LWG#800](http://wg21.link/lwg800) * [LWG#801](http://wg21.link/lwg801) * [LWG#803](http://wg21.link/lwg803) * [LWG#804](http://wg21.link/lwg804) * [LWG#805](http://wg21.link/lwg805) * [LWG#806](http://wg21.link/lwg806) * [LWG#807](http://wg21.link/lwg807) * [LWG#808](http://wg21.link/lwg808) * [LWG#809](http://wg21.link/lwg809) * [LWG#810](http://wg21.link/lwg810) * [LWG#811](http://wg21.link/lwg811) * [LWG#813](http://wg21.link/lwg813) * [LWG#814](http://wg21.link/lwg814) * [LWG#815](http://wg21.link/lwg815) * [LWG#816](http://wg21.link/lwg816) * [LWG#817](http://wg21.link/lwg817) * [LWG#818](http://wg21.link/lwg818) * [LWG#819](http://wg21.link/lwg819) * [LWG#820](http://wg21.link/lwg820) * [LWG#821](http://wg21.link/lwg821) * [LWG#823](http://wg21.link/lwg823) * [LWG#824](http://wg21.link/lwg824) * [LWG#825](http://wg21.link/lwg825) * [LWG#827](http://wg21.link/lwg827) * [LWG#828](http://wg21.link/lwg828) * [LWG#829](http://wg21.link/lwg829) * [LWG#834](http://wg21.link/lwg834) * [LWG#835](http://wg21.link/lwg835) * [LWG#836](http://wg21.link/lwg836) * [LWG#838](http://wg21.link/lwg838) * [LWG#842](http://wg21.link/lwg842) * [LWG#843](http://wg21.link/lwg843) * [LWG#844](http://wg21.link/lwg844) * [LWG#845](http://wg21.link/lwg845) * [LWG#846](http://wg21.link/lwg846) * [LWG#847](http://wg21.link/lwg847) * [LWG#848](http://wg21.link/lwg848) * [LWG#850](http://wg21.link/lwg850) * [LWG#852](http://wg21.link/lwg852) * [LWG#853](http://wg21.link/lwg853) * [LWG#854](http://wg21.link/lwg854) * [LWG#856](http://wg21.link/lwg856) * [LWG#857](http://wg21.link/lwg857) * [LWG#858](http://wg21.link/lwg858) * [LWG#859](http://wg21.link/lwg859) * [LWG#860](http://wg21.link/lwg860) * [LWG#861](http://wg21.link/lwg861) * [LWG#865](http://wg21.link/lwg865) * [LWG#866](http://wg21.link/lwg866) * [LWG#868](http://wg21.link/lwg868) * [LWG#869](http://wg21.link/lwg869) * [LWG#870](http://wg21.link/lwg870) * [LWG#871](http://wg21.link/lwg871) * [LWG#872](http://wg21.link/lwg872) * [LWG#874](http://wg21.link/lwg874) * [LWG#875](http://wg21.link/lwg875) * [LWG#876](http://wg21.link/lwg876) * [LWG#878](http://wg21.link/lwg878) * [LWG#880](http://wg21.link/lwg880) * [LWG#881](http://wg21.link/lwg881) * [LWG#882](http://wg21.link/lwg882) * [LWG#883](http://wg21.link/lwg883) * [LWG#884](http://wg21.link/lwg884) * [LWG#885](http://wg21.link/lwg885) * [LWG#886](http://wg21.link/lwg886) * [LWG#888](http://wg21.link/lwg888) * [LWG#889](http://wg21.link/lwg889) * [LWG#890](http://wg21.link/lwg890) * [LWG#891](http://wg21.link/lwg891) * [LWG#893](http://wg21.link/lwg893) * [LWG#894](http://wg21.link/lwg894) * [LWG#896](http://wg21.link/lwg896) * [LWG#897](http://wg21.link/lwg897) * [LWG#898](http://wg21.link/lwg898) * [LWG#899](http://wg21.link/lwg899) * [LWG#900](http://wg21.link/lwg900) * [LWG#904](http://wg21.link/lwg904) * [LWG#907](http://wg21.link/lwg907) * [LWG#908](http://wg21.link/lwg908) * [LWG#909](http://wg21.link/lwg909) * [LWG#911](http://wg21.link/lwg911) * [LWG#920](http://wg21.link/lwg920) * [LWG#921](http://wg21.link/lwg921) * [LWG#922](http://wg21.link/lwg922) * [LWG#923](http://wg21.link/lwg923) * 
[LWG#924](http://wg21.link/lwg924) * [LWG#925](http://wg21.link/lwg925) * [LWG#929](http://wg21.link/lwg929) * [LWG#931](http://wg21.link/lwg931) * [LWG#932](http://wg21.link/lwg932) * [LWG#934](http://wg21.link/lwg934) * [LWG#938](http://wg21.link/lwg938) * [LWG#939](http://wg21.link/lwg939) * [LWG#940](http://wg21.link/lwg940) * [LWG#943](http://wg21.link/lwg943) * [LWG#944](http://wg21.link/lwg944) * [LWG#947](http://wg21.link/lwg947) * [LWG#948](http://wg21.link/lwg948) * [LWG#949](http://wg21.link/lwg949) * [LWG#950](http://wg21.link/lwg950) * [LWG#951](http://wg21.link/lwg951) * [LWG#953](http://wg21.link/lwg953) * [LWG#954](http://wg21.link/lwg954) * [LWG#956](http://wg21.link/lwg956) * [LWG#957](http://wg21.link/lwg957) * [LWG#958](http://wg21.link/lwg958) * [LWG#960](http://wg21.link/lwg960) * [LWG#962](http://wg21.link/lwg962) * [LWG#963](http://wg21.link/lwg963) * [LWG#964](http://wg21.link/lwg964) * [LWG#965](http://wg21.link/lwg965) * [LWG#966](http://wg21.link/lwg966) * [LWG#967](http://wg21.link/lwg967) * [LWG#968](http://wg21.link/lwg968) * [LWG#970](http://wg21.link/lwg970) * [LWG#974](http://wg21.link/lwg974) * [LWG#975](http://wg21.link/lwg975) * [LWG#976](http://wg21.link/lwg976) * [LWG#978](http://wg21.link/lwg978) * [LWG#981](http://wg21.link/lwg981) * [LWG#982](http://wg21.link/lwg982) * [LWG#983](http://wg21.link/lwg983) * [LWG#984](http://wg21.link/lwg984) * [LWG#985](http://wg21.link/lwg985) * [LWG#986](http://wg21.link/lwg986) * [LWG#987](http://wg21.link/lwg987) * [LWG#990](http://wg21.link/lwg990) * [LWG#991](http://wg21.link/lwg991) * [LWG#993](http://wg21.link/lwg993) * [LWG#994](http://wg21.link/lwg994) * [LWG#997](http://wg21.link/lwg997) * [LWG#998](http://wg21.link/lwg998) * [LWG#999](http://wg21.link/lwg999) * [LWG#1004](http://wg21.link/lwg1004) * [LWG#1006](http://wg21.link/lwg1006) * [LWG#1011](http://wg21.link/lwg1011) * [LWG#1012](http://wg21.link/lwg1012) * [LWG#1014](http://wg21.link/lwg1014) * [LWG#1019](http://wg21.link/lwg1019) * [LWG#1021](http://wg21.link/lwg1021) * [LWG#1030](http://wg21.link/lwg1030) * [LWG#1033](http://wg21.link/lwg1033) * [LWG#1034](http://wg21.link/lwg1034) * [LWG#1037](http://wg21.link/lwg1037) * [LWG#1038](http://wg21.link/lwg1038) * [LWG#1039](http://wg21.link/lwg1039) * [LWG#1040](http://wg21.link/lwg1040) * [LWG#1043](http://wg21.link/lwg1043) * [LWG#1044](http://wg21.link/lwg1044) * [LWG#1045](http://wg21.link/lwg1045) * [LWG#1046](http://wg21.link/lwg1046) * [LWG#1047](http://wg21.link/lwg1047) * [LWG#1048](http://wg21.link/lwg1048) * [LWG#1049](http://wg21.link/lwg1049) * [LWG#1050](http://wg21.link/lwg1050) * [LWG#1054](http://wg21.link/lwg1054) * [LWG#1055](http://wg21.link/lwg1055) * [LWG#1065](http://wg21.link/lwg1065) * [LWG#1066](http://wg21.link/lwg1066) * [LWG#1070](http://wg21.link/lwg1070) * [LWG#1071](http://wg21.link/lwg1071) * [LWG#1073](http://wg21.link/lwg1073) * [LWG#1075](http://wg21.link/lwg1075) * [LWG#1079](http://wg21.link/lwg1079) * [LWG#1088](http://wg21.link/lwg1088) * [LWG#1089](http://wg21.link/lwg1089) * [LWG#1090](http://wg21.link/lwg1090) * [LWG#1093](http://wg21.link/lwg1093) * [LWG#1094](http://wg21.link/lwg1094) * [LWG#1095](http://wg21.link/lwg1095) * [LWG#1097](http://wg21.link/lwg1097) * [LWG#1098](http://wg21.link/lwg1098) * [LWG#1100](http://wg21.link/lwg1100) * [LWG#1103](http://wg21.link/lwg1103) * [LWG#1104](http://wg21.link/lwg1104) * [LWG#1106](http://wg21.link/lwg1106) * [LWG#1108](http://wg21.link/lwg1108) * [LWG#1110](http://wg21.link/lwg1110) * 
[LWG#1113](http://wg21.link/lwg1113) * [LWG#1114](http://wg21.link/lwg1114) * [LWG#1116](http://wg21.link/lwg1116) * [LWG#1117](http://wg21.link/lwg1117) * [LWG#1118](http://wg21.link/lwg1118) * [LWG#1122](http://wg21.link/lwg1122) * [LWG#1123](http://wg21.link/lwg1123) * [LWG#1126](http://wg21.link/lwg1126) * [LWG#1129](http://wg21.link/lwg1129) * [LWG#1130](http://wg21.link/lwg1130) * [LWG#1131](http://wg21.link/lwg1131) * [LWG#1133](http://wg21.link/lwg1133) * [LWG#1134](http://wg21.link/lwg1134) * [LWG#1135](http://wg21.link/lwg1135) * [LWG#1136](http://wg21.link/lwg1136) * [LWG#1137](http://wg21.link/lwg1137) * [LWG#1138](http://wg21.link/lwg1138) * [LWG#1143](http://wg21.link/lwg1143) * [LWG#1144](http://wg21.link/lwg1144) * [LWG#1145](http://wg21.link/lwg1145) * [LWG#1146](http://wg21.link/lwg1146) * [LWG#1147](http://wg21.link/lwg1147) * [LWG#1151](http://wg21.link/lwg1151) * [LWG#1152](http://wg21.link/lwg1152) * [LWG#1157](http://wg21.link/lwg1157) * [LWG#1158](http://wg21.link/lwg1158) * [LWG#1159](http://wg21.link/lwg1159) * [LWG#1160](http://wg21.link/lwg1160) * [LWG#1161](http://wg21.link/lwg1161) * [LWG#1162](http://wg21.link/lwg1162) * [LWG#1163](http://wg21.link/lwg1163) * [LWG#1165](http://wg21.link/lwg1165) * [LWG#1166](http://wg21.link/lwg1166) * [LWG#1170](http://wg21.link/lwg1170) * [LWG#1171](http://wg21.link/lwg1171) * [LWG#1172](http://wg21.link/lwg1172) * [LWG#1174](http://wg21.link/lwg1174) * [LWG#1177](http://wg21.link/lwg1177) * [LWG#1178](http://wg21.link/lwg1178) * [LWG#1180](http://wg21.link/lwg1180) * [LWG#1181](http://wg21.link/lwg1181) * [LWG#1182](http://wg21.link/lwg1182) * [LWG#1183](http://wg21.link/lwg1183) * [LWG#1185](http://wg21.link/lwg1185) * [LWG#1187](http://wg21.link/lwg1187) * [LWG#1189](http://wg21.link/lwg1189) * [LWG#1191](http://wg21.link/lwg1191) * [LWG#1192](http://wg21.link/lwg1192) * [LWG#1193](http://wg21.link/lwg1193) * [LWG#1194](http://wg21.link/lwg1194) * [LWG#1195](http://wg21.link/lwg1195) * [LWG#1196](http://wg21.link/lwg1196) * [LWG#1197](http://wg21.link/lwg1197) * [LWG#1198](http://wg21.link/lwg1198) * [LWG#1199](http://wg21.link/lwg1199) * [LWG#1204](http://wg21.link/lwg1204) * [LWG#1205](http://wg21.link/lwg1205) * [LWG#1206](http://wg21.link/lwg1206) * [LWG#1207](http://wg21.link/lwg1207) * [LWG#1208](http://wg21.link/lwg1208) * [LWG#1209](http://wg21.link/lwg1209) * [LWG#1210](http://wg21.link/lwg1210) * [LWG#1211](http://wg21.link/lwg1211) * [LWG#1212](http://wg21.link/lwg1212) * [LWG#1215](http://wg21.link/lwg1215) * [LWG#1216](http://wg21.link/lwg1216) * [LWG#1218](http://wg21.link/lwg1218) * [LWG#1220](http://wg21.link/lwg1220) * [LWG#1221](http://wg21.link/lwg1221) * [LWG#1222](http://wg21.link/lwg1222) * [LWG#1225](http://wg21.link/lwg1225) * [LWG#1226](http://wg21.link/lwg1226) * [LWG#1227](http://wg21.link/lwg1227) * [LWG#1229](http://wg21.link/lwg1229) * [LWG#1231](http://wg21.link/lwg1231) * [LWG#1234](http://wg21.link/lwg1234) * [LWG#1237](http://wg21.link/lwg1237) * [LWG#1240](http://wg21.link/lwg1240) * [LWG#1241](http://wg21.link/lwg1241) * [LWG#1244](http://wg21.link/lwg1244) * [LWG#1245](http://wg21.link/lwg1245) * [LWG#1247](http://wg21.link/lwg1247) * [LWG#1248](http://wg21.link/lwg1248) * [LWG#1249](http://wg21.link/lwg1249) * [LWG#1250](http://wg21.link/lwg1250) * [LWG#1252](http://wg21.link/lwg1252) * [LWG#1253](http://wg21.link/lwg1253) * [LWG#1254](http://wg21.link/lwg1254) * [LWG#1255](http://wg21.link/lwg1255) * [LWG#1256](http://wg21.link/lwg1256) * [LWG#1257](http://wg21.link/lwg1257) * 
[LWG#1258](http://wg21.link/lwg1258) * [LWG#1260](http://wg21.link/lwg1260) * [LWG#1261](http://wg21.link/lwg1261) * [LWG#1262](http://wg21.link/lwg1262) * [LWG#1264](http://wg21.link/lwg1264) * [LWG#1266](http://wg21.link/lwg1266) * [LWG#1267](http://wg21.link/lwg1267) * [LWG#1268](http://wg21.link/lwg1268) * [LWG#1269](http://wg21.link/lwg1269) * [LWG#1270](http://wg21.link/lwg1270) * [LWG#1271](http://wg21.link/lwg1271) * [LWG#1272](http://wg21.link/lwg1272) * [LWG#1273](http://wg21.link/lwg1273) * [LWG#1274](http://wg21.link/lwg1274) * [LWG#1275](http://wg21.link/lwg1275) * [LWG#1276](http://wg21.link/lwg1276) * [LWG#1277](http://wg21.link/lwg1277) * [LWG#1278](http://wg21.link/lwg1278) * [LWG#1279](http://wg21.link/lwg1279) * [LWG#1280](http://wg21.link/lwg1280) * [LWG#1281](http://wg21.link/lwg1281) * [LWG#1283](http://wg21.link/lwg1283) * [LWG#1284](http://wg21.link/lwg1284) * [LWG#1285](http://wg21.link/lwg1285) * [LWG#1286](http://wg21.link/lwg1286) * [LWG#1287](http://wg21.link/lwg1287) * [LWG#1288](http://wg21.link/lwg1288) * [LWG#1290](http://wg21.link/lwg1290) * [LWG#1291](http://wg21.link/lwg1291) * [LWG#1292](http://wg21.link/lwg1292) * [LWG#1293](http://wg21.link/lwg1293) * [LWG#1294](http://wg21.link/lwg1294) * [LWG#1295](http://wg21.link/lwg1295) * [LWG#1297](http://wg21.link/lwg1297) * [LWG#1298](http://wg21.link/lwg1298) * [LWG#1299](http://wg21.link/lwg1299) * [LWG#1300](http://wg21.link/lwg1300) * [LWG#1303](http://wg21.link/lwg1303) * [LWG#1304](http://wg21.link/lwg1304) * [LWG#1305](http://wg21.link/lwg1305) * [LWG#1306](http://wg21.link/lwg1306) * [LWG#1307](http://wg21.link/lwg1307) * [LWG#1309](http://wg21.link/lwg1309) * [LWG#1310](http://wg21.link/lwg1310) * [LWG#1311](http://wg21.link/lwg1311) * [LWG#1312](http://wg21.link/lwg1312) * [LWG#1316](http://wg21.link/lwg1316) * [LWG#1319](http://wg21.link/lwg1319) * [LWG#1321](http://wg21.link/lwg1321) * [LWG#1322](http://wg21.link/lwg1322) * [LWG#1323](http://wg21.link/lwg1323) * [LWG#1324](http://wg21.link/lwg1324) * [LWG#1325](http://wg21.link/lwg1325) * [LWG#1326](http://wg21.link/lwg1326) * [LWG#1327](http://wg21.link/lwg1327) * [LWG#1328](http://wg21.link/lwg1328) * [LWG#1329](http://wg21.link/lwg1329) * [LWG#1332](http://wg21.link/lwg1332) * [LWG#1333](http://wg21.link/lwg1333) * [LWG#1334](http://wg21.link/lwg1334) * [LWG#1335](http://wg21.link/lwg1335) * [LWG#1337](http://wg21.link/lwg1337) * [LWG#1338](http://wg21.link/lwg1338) * [LWG#1339](http://wg21.link/lwg1339) * [LWG#1340](http://wg21.link/lwg1340) * [LWG#1344](http://wg21.link/lwg1344) * [LWG#1345](http://wg21.link/lwg1345) * [LWG#1346](http://wg21.link/lwg1346) * [LWG#1347](http://wg21.link/lwg1347) * [LWG#1349](http://wg21.link/lwg1349) * [LWG#1353](http://wg21.link/lwg1353) * [LWG#1354](http://wg21.link/lwg1354) * [LWG#1355](http://wg21.link/lwg1355) * [LWG#1356](http://wg21.link/lwg1356) * [LWG#1357](http://wg21.link/lwg1357) * [LWG#1360](http://wg21.link/lwg1360) * [LWG#1362](http://wg21.link/lwg1362) * [LWG#1363](http://wg21.link/lwg1363) * [LWG#1364](http://wg21.link/lwg1364) * [LWG#1365](http://wg21.link/lwg1365) * [LWG#1366](http://wg21.link/lwg1366) * [LWG#1367](http://wg21.link/lwg1367) * [LWG#1368](http://wg21.link/lwg1368) * [LWG#1370](http://wg21.link/lwg1370) * [LWG#1372](http://wg21.link/lwg1372) * [LWG#1377](http://wg21.link/lwg1377) * [LWG#1378](http://wg21.link/lwg1378) * [LWG#1379](http://wg21.link/lwg1379) * [LWG#1380](http://wg21.link/lwg1380) * [LWG#1381](http://wg21.link/lwg1381) * [LWG#1382](http://wg21.link/lwg1382) * 
[LWG#1383](http://wg21.link/lwg1383) * [LWG#1384](http://wg21.link/lwg1384) * [LWG#1385](http://wg21.link/lwg1385) * [LWG#1386](http://wg21.link/lwg1386) * [LWG#1387](http://wg21.link/lwg1387) * [LWG#1388](http://wg21.link/lwg1388) * [LWG#1389](http://wg21.link/lwg1389) * [LWG#1390](http://wg21.link/lwg1390) * [LWG#1391](http://wg21.link/lwg1391) * [LWG#1392](http://wg21.link/lwg1392) * [LWG#1393](http://wg21.link/lwg1393) * [LWG#1394](http://wg21.link/lwg1394) * [LWG#1397](http://wg21.link/lwg1397) * [LWG#1399](http://wg21.link/lwg1399) * [LWG#1400](http://wg21.link/lwg1400) * [LWG#1401](http://wg21.link/lwg1401) * [LWG#1402](http://wg21.link/lwg1402) * [LWG#1403](http://wg21.link/lwg1403) * [LWG#1404](http://wg21.link/lwg1404) * [LWG#1405](http://wg21.link/lwg1405) * [LWG#1407](http://wg21.link/lwg1407) * [LWG#1408](http://wg21.link/lwg1408) * [LWG#1409](http://wg21.link/lwg1409) * [LWG#1410](http://wg21.link/lwg1410) * [LWG#1412](http://wg21.link/lwg1412) * [LWG#1414](http://wg21.link/lwg1414) * [LWG#1416](http://wg21.link/lwg1416) * [LWG#1417](http://wg21.link/lwg1417) * [LWG#1418](http://wg21.link/lwg1418) * [LWG#1420](http://wg21.link/lwg1420) * [LWG#1421](http://wg21.link/lwg1421) * [LWG#1423](http://wg21.link/lwg1423) * [LWG#1424](http://wg21.link/lwg1424) * [LWG#1425](http://wg21.link/lwg1425) * [LWG#1426](http://wg21.link/lwg1426) * [LWG#1427](http://wg21.link/lwg1427) * [LWG#1428](http://wg21.link/lwg1428) * [LWG#1429](http://wg21.link/lwg1429) * [LWG#1430](http://wg21.link/lwg1430) * [LWG#1431](http://wg21.link/lwg1431) * [LWG#1432](http://wg21.link/lwg1432) * [LWG#1435](http://wg21.link/lwg1435) * [LWG#1436](http://wg21.link/lwg1436) * [LWG#1437](http://wg21.link/lwg1437) * [LWG#1438](http://wg21.link/lwg1438) * [LWG#1439](http://wg21.link/lwg1439) * [LWG#1440](http://wg21.link/lwg1440) * [LWG#1441](http://wg21.link/lwg1441) * [LWG#1445](http://wg21.link/lwg1445) * [LWG#1447](http://wg21.link/lwg1447) * [LWG#1448](http://wg21.link/lwg1448) * [LWG#1449](http://wg21.link/lwg1449) * [LWG#1453](http://wg21.link/lwg1453) * [LWG#1455](http://wg21.link/lwg1455) * [LWG#1457](http://wg21.link/lwg1457) * [LWG#1460](http://wg21.link/lwg1460) * [LWG#1462](http://wg21.link/lwg1462) * [LWG#1464](http://wg21.link/lwg1464) * [LWG#1465](http://wg21.link/lwg1465) * [LWG#1466](http://wg21.link/lwg1466) * [LWG#1467](http://wg21.link/lwg1467) * [LWG#1468](http://wg21.link/lwg1468) * [LWG#1469](http://wg21.link/lwg1469) * [LWG#1474](http://wg21.link/lwg1474) * [LWG#1478](http://wg21.link/lwg1478) * [LWG#1479](http://wg21.link/lwg1479) * [LWG#1480](http://wg21.link/lwg1480) * [LWG#1481](http://wg21.link/lwg1481) * [LWG#1482](http://wg21.link/lwg1482) * [LWG#1487](http://wg21.link/lwg1487) * [LWG#1490](http://wg21.link/lwg1490) * [LWG#1491](http://wg21.link/lwg1491) * [LWG#1492](http://wg21.link/lwg1492) * [LWG#1494](http://wg21.link/lwg1494) * [LWG#1497](http://wg21.link/lwg1497) * [LWG#1498](http://wg21.link/lwg1498) * [LWG#1501](http://wg21.link/lwg1501) * [LWG#1502](http://wg21.link/lwg1502) * [LWG#1504](http://wg21.link/lwg1504) * [LWG#1505](http://wg21.link/lwg1505) * [LWG#1507](http://wg21.link/lwg1507) * [LWG#1508](http://wg21.link/lwg1508) * [LWG#1513](http://wg21.link/lwg1513) * [LWG#1514](http://wg21.link/lwg1514) * [LWG#1515](http://wg21.link/lwg1515) * [LWG#1516](http://wg21.link/lwg1516) * [LWG#1517](http://wg21.link/lwg1517) * [LWG#1518](http://wg21.link/lwg1518) * [LWG#1519](http://wg21.link/lwg1519) * [LWG#1520](http://wg21.link/lwg1520) * [LWG#1522](http://wg21.link/lwg1522) * 
[LWG#1523](http://wg21.link/lwg1523) * [LWG#1524](http://wg21.link/lwg1524) * [LWG#1525](http://wg21.link/lwg1525) * [LWG#2000](http://wg21.link/lwg2000) * [LWG#2001](http://wg21.link/lwg2001) * [LWG#2002](http://wg21.link/lwg2002) * [LWG#2004](http://wg21.link/lwg2004) * [LWG#2007](http://wg21.link/lwg2007) * [LWG#2008](http://wg21.link/lwg2008) * [LWG#2014](http://wg21.link/lwg2014) * [LWG#2017](http://wg21.link/lwg2017) * [LWG#2019](http://wg21.link/lwg2019) * [LWG#2020](http://wg21.link/lwg2020) * [LWG#2022](http://wg21.link/lwg2022) * [LWG#2023](http://wg21.link/lwg2023) * [LWG#2024](http://wg21.link/lwg2024) * [LWG#2025](http://wg21.link/lwg2025) * [LWG#2027](http://wg21.link/lwg2027) * [LWG#2029](http://wg21.link/lwg2029) * [LWG#2030](http://wg21.link/lwg2030) * [LWG#2031](http://wg21.link/lwg2031) * [LWG#2032](http://wg21.link/lwg2032) * [LWG#2034](http://wg21.link/lwg2034) * [LWG#2037](http://wg21.link/lwg2037) * [LWG#2041](http://wg21.link/lwg2041) * [LWG#2042](http://wg21.link/lwg2042) | Compiler support ----------------- Main Article: [C++11 compiler support](compiler_support#C.2B.2B11_features "cpp/compiler support"). ### C++11 core language features | C++11 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | HP aCC | Digital Mars C++ | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | C99 [preprocessor](preprocessor "cpp/preprocessor") | [N1653](https://wg21.link/N1653) | 4.3 | Yes | 19.0 (2015)\* (partial)\*19.26\* | Yes | 4.1 | 11.1 | 10.1 | 5.9 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | Yes | | [`static_assert`](language/static_assert "cpp/language/static assert") | [N1720](https://wg21.link/N1720) | 4.3 | 2.9 | 16.0\* | Yes | 4.1 | 11.0 | 11.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | 8.52 | | Right angle brackets | [N1757](https://wg21.link/N1757) | 4.3 | Yes | 14.0\* | Yes | 4.1 | 11.0 | 12.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | | | | Extended [`friend` declarations](language/friend "cpp/language/friend") | [N1791](https://wg21.link/N1791) | 4.7 | 2.9 | 16.0\* (partial)18.0\* | Yes | 4.1 | 11.1\*12.0 | 11.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | | | [`long long`](language/types#long_long "cpp/language/types") | [N1811](https://wg21.link/N1811) | Yes | Yes | 14.0\* | Yes | Yes | Yes | Yes | Yes | Yes | 8.4 | 2015 | 7.0 | Yes | Yes | | Compiler support for [type traits](meta#Type_traits "cpp/meta") | [N1836](https://wg21.link/N1836)[N2518](https://wg21.link/N2518)\*[N2984](https://wg21.link/N2984)[N3142](https://wg21.link/N3142) | 4.3\*4.8\*5 | 3.0 | 14.0\*(partial)\*19.0 (2015)\* | Yes | 4.0 | 10.0 | 13.1.3 | 5.13 | Yes | 8.4 | 2015 | | 6.16 | | | [`auto`](language/auto "cpp/language/auto") | [N1984](https://wg21.link/N1984) | 4.4 | Yes | 16.0\* | Yes | 3.9 | 11.0 (v0.9)12.0 | 11.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | | | [Delegating constructors](language/initializer_list#Delegating_constructor "cpp/language/initializer list") | [N1986](https://wg21.link/N1986) | 4.7 | 3.0 | 18.0\* | Yes | 4.7 | 14.0 | 11.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | `extern template` | [N1987](https://wg21.link/N1987) | 3.3 | Yes | 12.0\* | Yes | 3.9 | 9.0 | 11.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | | | [`constexpr`](language/constexpr "cpp/language/constexpr") | [N2235](https://wg21.link/N2235) | 4.6 | 3.1 | 19.0 (2015)\* | Yes | 4.6 | 13.0\*14.0 | 12.1\*13.1 | 5.13 | 
Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | [Template aliases](language/type_alias "cpp/language/type alias") | [N2258](https://wg21.link/N2258) | 4.7 | 3.0 | 18.0\* | Yes | 4.2 | 12.1 | 13.1.1\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | | | [`char16_t`](language/types#char16_t "cpp/language/types") and [`char32_t`](language/types#char32_t "cpp/language/types") | [N2249](https://wg21.link/N2249) | 4.4 | 2.9 | 19.0 (2015)\* | Yes | 4.4 | 12.1\*14.0 | 13.1.1\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | 8.52 | | [`alignas`](language/alignas "cpp/language/alignas") | [N2341](https://wg21.link/N2341) | 4.8 | 3.0 | 19.0 (2015)\* | Yes | 4.8 | 15.0 | 13.1.2\* | 5.13 | Yes | 8.6 | 2015 | 7.0 | | | | [`alignof`](language/alignof "cpp/language/alignof") | [N2341](https://wg21.link/N2341) | 4.5 | 2.9 | 19.0 (2015)\* | Yes | 4.8 | 15.0 | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | | | | Defaulted and deleted functions | [N2346](https://wg21.link/N2346) | 4.4 | 3.0 | 18.0\* | Yes | 4.1 | 12.0 | 13.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | | | [Strongly-typed `enum`](language/enum#Scoped_enumerations "cpp/language/enum") | [N2347](https://wg21.link/N2347) | 4.4 | 2.9 | 17.0\* | Yes | 4.0 | 13.0 | 12.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | | | [Atomic operations](thread#Atomic_operations "cpp/thread") | [N2427](https://wg21.link/N2427) | 4.4 | 3.1 | 17.0\* | Yes | Yes | 13.0 | 13.1.2\* | 5.14 | Yes | 8.4 | 2015 | | | | | [`nullptr`](language/nullptr "cpp/language/nullptr") | [N2431](https://wg21.link/N2431) | 4.6 | 2.9 | 16.0\* | Yes | 4.2 | 12.1 | 13.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | 8.52 | | Explicit [conversion operators](language/cast_operator "cpp/language/cast operator") | [N2437](https://wg21.link/N2437) | 4.5 | 3.0 | 18.0\* | Yes | 4.4 | 13.0 | 12.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | | | ref-qualifiers | [N2439](https://wg21.link/N2439) | 4.8.1 | 2.9 | 19.0 (2015)\* | Yes | 4.7 | 14.0 | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | Unicode [string literals](language/string_literal "cpp/language/string literal") | [N2442](https://wg21.link/N2442) | 4.4 | 3.0 | 19.0 (2015)\* | Yes | 4.7 | 11.0\* | 10.1\*13.1.1\* | 5.7 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | 8.52 | | Raw [string literals](language/string_literal "cpp/language/string literal") | [N2442](https://wg21.link/N2442) | 4.5 | Yes | 18.0\* | Yes | 4.7 | 14.0 | 13.1.1\*, except AIX xlC 13.1.3 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | 8.52 | | [Inline namespaces](language/namespace#Inline_namespaces "cpp/language/namespace") | [N2535](https://wg21.link/N2535) | 4.4 | 2.9 | 19.0 (2015)\* | Yes | 4.5 | 14.0 | 11.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | [Inheriting constructors](language/using_declaration#Inheriting_constructors "cpp/language/using declaration") | [N2540](https://wg21.link/N2540) | 4.8 | 3.3 | 19.0 (2015)\* | Yes | 4.8 | 15.0 | 13.1.1\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | | | | [Trailing function return types](language/function#Function_declaration "cpp/language/function") | [N2541](https://wg21.link/N2541) | 4.4 | 2.9 | 16.0\* | Yes | 4.1 | 12.0 | 12.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | | | Unrestricted [unions](language/union "cpp/language/union") | [N2544](https://wg21.link/N2544) | 4.6 | 3.0 | 19.0 (2015)\* | Yes | 4.6 | 14.0\* | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | [Variadic templates](language/parameter_pack "cpp/language/parameter pack") | [N2242](https://wg21.link/N2242)[N2555](https://wg21.link/N2555) | 4.3 (N2242)4.4 | 2.9 | 18.0\* | Yes | 4.3 
(N2242)4.3 | 12.1 | 11.1 (N2242) | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | | | [Expression SFINAE](language/sfinae#Expression_SFINAE "cpp/language/sfinae") | [N2634](https://wg21.link/N2634) | 4.4 | 2.9 | 19.14\* | Yes | 4.2 | 12.1 | | | Yes | 8.4 | 2015 | 7.0 | | | | Local and unnamed types as template parameters | [N2657](https://wg21.link/N2657) | 4.5 | 2.9 | 16.0\* | Yes | 4.2 | 12.0 | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | | | [Thread-local storage](language/storage_duration "cpp/language/storage duration") | [N2659](https://wg21.link/N2659) | 4.4 (partial)4.8 | 3.3\* | 16.0\* (partial)19.0 (2015)\* | Yes | 4.8 | 11.1 (partial)15.0\* | 10.1 (partial)\*13.1.2 (partial)\* | 5.9 (partial) | Yes | 8.4 | 2015 | | | 8.52 (partial) | | Dynamic initialization and destruction with concurrency ([magic statics](language/storage_duration#Static_local_variables "cpp/language/storage duration")) | [N2660](https://wg21.link/N2660) | 4.3 | 2.9 | 19.0 (2015)\* | Yes | Yes | 11.1\* | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | | A.06.25 | | | Garbage Collection and Reachability-Based Leak Detection | [N2670](https://wg21.link/N2670) | | | | | | | | | | | | | | | | [Initializer lists](language/list_initialization "cpp/language/list initialization") | [N2672](https://wg21.link/N2672) | 4.4 | 3.1 | 18.0\* | Yes | 4.5 | 13.0 (partial)14.0 | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | [Non-static data member initializers](language/data_members#Member_initialization "cpp/language/data members") | [N2756](https://wg21.link/N2756) | 4.7 | 3.0 | 18.0\* | Yes | 4.6 | 14.0 | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | [Attributes](language/attributes "cpp/language/attributes") | [N2761](https://wg21.link/N2761) | 4.8 | 3.3 | 19.0 (2015)\* | Yes | 4.2 | 12.1 | 13.1.1\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.27 | | | [Forward (opaque) `enum` declarations](language/enum "cpp/language/enum") | [N2764](https://wg21.link/N2764) | 4.6 | 3.1 | 17.0\* | Yes | 4.5 | 11.1 (partial)14.0 | 12.1 | 5.13 | Yes | 8.4 | 2015 | 7.0 | | | | [User-defined literals](language/user_literal "cpp/language/user literal") | [N2765](https://wg21.link/N2765) | 4.7 | 3.1 | 19.0 (2015)\* | Yes | 4.8 | 15.0 | 13.1.2\* | 5.14 | Yes | 8.4 | 2015 | 7.0 | | | | [Rvalue references](language/reference#Rvalue_references "cpp/language/reference") | [N2118](https://wg21.link/N2118)[N2844](https://wg21.link/N2844)[CWG1138](https://cplusplus.github.io/CWG/issues/1138.html) | 4.3 (N2118)4.5 | 2.9 | 16.0\* (N2844)17.0\* | Yes | 4.5 | 11.1 (N2118)12.0 (N2844)14.0 | 12.1 | 5.13 | Yes | 8.4 | 2015 | 7.0\* | A.06.25 | | | [Lambda expressions](language/lambda "cpp/language/lambda") | [N2550](https://wg21.link/N2550)[N2658](https://wg21.link/N2658)[N2927](https://wg21.link/N2927) | 4.5 | 3.1 | 16.0\* (N2658)17.0\* | Yes | 4.1 | 12.0 | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | | | [Range-for loop](language/range-for "cpp/language/range-for") | [N2930](https://wg21.link/N2930)[N3271](https://wg21.link/N3271) | 4.6 | 3.0 | 17.0\* | Yes | 4.5 | 13.0 | 13.1.2\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | [`noexcept`](language/noexcept_spec "cpp/language/noexcept spec") | [N3050](https://wg21.link/N3050) | 4.6 | 3.0 | 19.0 (2015)\* | Yes | 4.5 | 14.0 | 13.1.1\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.28 | | | Defaulted move [special](language/member_functions#Special_member_functions "cpp/language/member functions") [member](language/move_constructor "cpp/language/move constructor") [functions](language/move_assignment 
"cpp/language/move assignment") | [N3053](https://wg21.link/N3053) | 4.6 | 3.0 | 19.0 (2015)\* | Yes | 4.5 | 14.0 | | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | | | [`override`](language/override "cpp/language/override") and [`final`](language/final "cpp/language/final") | [N2928](https://wg21.link/N2928)[N3206](https://wg21.link/N3206)[N3272](https://wg21.link/N3272) | 4.7 | 2.9 | 14.0\* (partial)17.0\* | Yes | 4.8 | 12.0 (N2928)14.0 | 13.1.1\* | 5.13 | Yes | 8.4 | 2015 | 7.0 | | | | [`decltype`](language/decltype "cpp/language/decltype") | [N2343](https://wg21.link/N2343)[N3276](https://wg21.link/N3276) | 4.3 (N2343)4.8.1 | 2.9 | 16.0\* | Yes | 4.2 (N2343)4.8 | 11.0 (N2343)12.0 | 11.1 (N2343) | 5.13 | Yes | 8.4 | 2015 | 7.0 | A.06.25 | 8.52 (N2343) | | C++11 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++(ex Portland Group/PGI) | Nvidia nvcc | HP aCC | Digital Mars C++ | ### C++11 library features | C++11 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | [Type traits](meta#Type_traits "cpp/meta") | [N1836](https://wg21.link/N1836)[N2240](https://wg21.link/N2240)[N2244](https://wg21.link/N2244)[N2255](https://wg21.link/N2255)[N2342](https://wg21.link/N2342)[N2984](https://wg21.link/N2984)[N3142](https://wg21.link/N3142) | 4.3\*4.8\*5 | 3.0 | 14.0\*(partial)\*19.0 (2015)\* | Yes | 5.13 | Yes | 8.4 | | [Garbage Collection](header/memory#Functions "cpp/header/memory") and Reachability-Based Leak Detection ([library support](memory#Garbage_collector_support "cpp/memory")) | [N2670](https://wg21.link/N2670) | 6(no-op) | 3.4(no-op) | 19.0 (2015)\*(no-op) | Yes(no-op) | | | | | [Money, Time, and hexfloat I/O manipulators](io/manip "cpp/io/manip") | [N2071](https://wg21.link/N2071)[N2072](https://wg21.link/N2072) | 5 | 3.8 | 19.0 (2015)\* | Yes | 5.15 | | | | Disallowing [COW (copy-on-write)](language/acronyms "cpp/language/acronyms") `[string](string/basic_string "cpp/string/basic string")` | [N2668](https://wg21.link/N2668) | 5 | Yes | Yes | Yes | Yes | | | | [Regular expressions library](regex "cpp/regex") | [N1429](https://wg21.link/N1429) | 4.9 | ? | ? | | | | | | C++11 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | *\** - hover over a cell with the version number to see notes. ### External links * [Working c++11 examples](https://github.com/makelinux/examples/blob/HEAD/cpp/11.cpp)
cpp Ranges library (C++20) Ranges library (C++20)
======================

The ranges library is an extension and generalization of the algorithms and iterator libraries that makes them more powerful by making them composable and less error-prone. The library creates and manipulates range *views*, lightweight objects that indirectly represent iterable sequences (*ranges*). Ranges are an abstraction on top of:

* [begin, end) iterator pairs, e.g. ranges made by implicit conversion from containers. All algorithms that take iterator pairs now have overloads that accept ranges (e.g. [`ranges::sort`](algorithm/ranges/sort "cpp/algorithm/ranges/sort"))
* [start, size) counted sequences, e.g. range returned by [`views::counted`](ranges/view_counted "cpp/ranges/view counted")
* [start, predicate) conditionally-terminated sequences, e.g. range returned by [`views::take_while`](ranges/take_while_view "cpp/ranges/take while view")
* [start..) unbounded sequences, e.g. range returned by [`views::iota`](ranges/iota_view "cpp/ranges/iota view")

The ranges library includes [range algorithms](algorithm/ranges "cpp/algorithm/ranges"), which are applied to ranges eagerly, and [range adaptors](ranges#Range_adaptors "cpp/ranges"), which are applied to views lazily. Adaptors can be composed into pipelines, so that their actions take place as the view is iterated.

| Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` namespace std { namespace views = ranges::views; } ``` | | (since C++20) | The namespace alias `std::views` is provided as a shorthand for `std::ranges::views`. | Defined in namespace `std::ranges` | | --- | | Range access | | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | Defined in header `[<iterator>](header/iterator "cpp/header/iterator")` | | [ranges::begin](ranges/begin "cpp/ranges/begin") (C++20) | returns an iterator to the beginning of a range (customization point object) | | [ranges::end](ranges/end "cpp/ranges/end") (C++20) | returns a sentinel indicating the end of a range (customization point object) | | [ranges::cbegin](ranges/cbegin "cpp/ranges/cbegin") (C++20) | returns an iterator to the beginning of a read-only range (customization point object) | | [ranges::cend](ranges/cend "cpp/ranges/cend") (C++20) | returns a sentinel indicating the end of a read-only range (customization point object) | | [ranges::rbegin](ranges/rbegin "cpp/ranges/rbegin") (C++20) | returns a reverse iterator to a range (customization point object) | | [ranges::rend](ranges/rend "cpp/ranges/rend") (C++20) | returns a reverse end iterator to a range (customization point object) | | [ranges::crbegin](ranges/crbegin "cpp/ranges/crbegin") (C++20) | returns a reverse iterator to a read-only range (customization point object) | | [ranges::crend](ranges/crend "cpp/ranges/crend") (C++20) | returns a reverse end iterator to a read-only range (customization point object) | | [ranges::size](ranges/size "cpp/ranges/size") (C++20) | returns an integer equal to the size of a range (customization point object) | | [ranges::ssize](ranges/ssize "cpp/ranges/ssize") (C++20) | returns a signed integer equal to the size of a range (customization point object) | | [ranges::empty](ranges/empty "cpp/ranges/empty") (C++20) | checks whether a range is empty (customization point object) | | [ranges::data](ranges/data "cpp/ranges/data") (C++20) | obtains a pointer to the beginning of a contiguous range (customization point object) | | [ranges::cdata](ranges/cdata
"cpp/ranges/cdata") (C++20) | obtains a pointer to the beginning of a read-only contiguous range (customization point object) | | Range primitives | | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | [ranges::iterator\_tranges::const\_iterator\_tranges::sentinel\_tranges::range\_difference\_tranges::range\_size\_t ranges::range\_value\_tranges::range\_reference\_tranges::range\_const\_reference\_tranges::range\_rvalue\_reference\_t](ranges/iterator_t "cpp/ranges/iterator t") (C++20)(C++23)(C++20)(C++20)(C++20)(C++20)(C++20)(C++23)(C++20) | obtains associated types of a range (alias template) | | Dangling iterator handling | | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | [ranges::dangling](ranges/dangling "cpp/ranges/dangling") (C++20) | a placeholder type indicating that an iterator or a `subrange` should not be returned since it would be dangling (class) | | [ranges::borrowed\_iterator\_tranges::borrowed\_subrange\_t](ranges/borrowed_iterator_t "cpp/ranges/borrowed iterator t") (C++20) | obtains iterator type or `subrange` type of a [`borrowed_range`](ranges/borrowed_range "cpp/ranges/borrowed range") (alias template) | | Range concepts | | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | [ranges::range](ranges/range "cpp/ranges/range") (C++20) | specifies that a type is a range, that is, it provides a `begin` iterator and an `end` sentinel (concept) | | [ranges::borrowed\_range](ranges/borrowed_range "cpp/ranges/borrowed range") (C++20) | specifies that a type is a [`range`](ranges/range "cpp/ranges/range") and iterators obtained from an expression of it can be safely returned without danger of dangling (concept) | | [ranges::sized\_range](ranges/sized_range "cpp/ranges/sized range") (C++20) | specifies that a range knows its size in constant time (concept) | | [ranges::view](ranges/view "cpp/ranges/view") (C++20) | specifies that a range is a view, that is, it has constant time copy/move/assignment (concept) | | [ranges::input\_range](ranges/input_range "cpp/ranges/input range") (C++20) | specifies a range whose iterator type satisfies [`input_iterator`](iterator/input_iterator "cpp/iterator/input iterator") (concept) | | [ranges::output\_range](ranges/output_range "cpp/ranges/output range") (C++20) | specifies a range whose iterator type satisfies [`output_iterator`](iterator/output_iterator "cpp/iterator/output iterator") (concept) | | [ranges::forward\_range](ranges/forward_range "cpp/ranges/forward range") (C++20) | specifies a range whose iterator type satisfies [`forward_iterator`](iterator/forward_iterator "cpp/iterator/forward iterator") (concept) | | [ranges::bidirectional\_range](ranges/bidirectional_range "cpp/ranges/bidirectional range") (C++20) | specifies a range whose iterator type satisfies [`bidirectional_iterator`](iterator/bidirectional_iterator "cpp/iterator/bidirectional iterator") (concept) | | [ranges::random\_access\_range](ranges/random_access_range "cpp/ranges/random access range") (C++20) | specifies a range whose iterator type satisfies [`random_access_iterator`](iterator/random_access_iterator "cpp/iterator/random access iterator") (concept) | | [ranges::contiguous\_range](ranges/contiguous_range "cpp/ranges/contiguous range") (C++20) | specifies a range whose iterator type satisfies [`contiguous_iterator`](iterator/contiguous_iterator "cpp/iterator/contiguous iterator") (concept) | | [ranges::common\_range](ranges/common_range "cpp/ranges/common range") (C++20) | specifies that a range has 
identical iterator and sentinel types (concept) | | [ranges::viewable\_range](ranges/viewable_range "cpp/ranges/viewable range") (C++20) | specifies the requirements for a [`range`](ranges/range "cpp/ranges/range") to be safely convertible to a [`view`](ranges/view "cpp/ranges/view") (concept) | | [ranges::constant\_range](ranges/constant_range "cpp/ranges/constant range") (C++23) | specifies that a range has read-only elements (concept) | | Range conversions | | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | [ranges::to](ranges/to "cpp/ranges/to") (C++23) | constructs a non-view range from another range (function template) | | Views | | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | [ranges::view\_interface](ranges/view_interface "cpp/ranges/view interface") (C++20) | helper class template for defining a [`view`](ranges/view "cpp/ranges/view"), using the [curiously recurring template pattern](language/crtp "cpp/language/crtp") (class template) | | [ranges::subrange](ranges/subrange "cpp/ranges/subrange") (C++20) | combines an iterator-sentinel pair into a [`view`](ranges/view "cpp/ranges/view") (class template) | ### Range factories | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | --- | | Defined in namespace `std::ranges` | | [ranges::empty\_viewviews::empty](ranges/empty_view "cpp/ranges/empty view") (C++20) | an empty [`view`](ranges/view "cpp/ranges/view") with no elements (class template) (variable template) | | [ranges::single\_viewviews::single](ranges/single_view "cpp/ranges/single view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") that contains a single element of a specified value (class template) (customization point object) | | [ranges::iota\_viewviews::iota](ranges/iota_view "cpp/ranges/iota view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") consisting of a sequence generated by repeatedly incrementing an initial value (class template) (customization point object) | | [ranges::basic\_istream\_viewviews::istream](ranges/basic_istream_view "cpp/ranges/basic istream view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") consisting of the elements obtained by successive application of `operator>>` on the associated input stream (class template) (customization point object) | | [ranges::repeat\_viewviews::repeat](ranges/repeat_view "cpp/ranges/repeat view") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of a generated sequence by repeatedly producing the same value (class template) (customization point object) | | [ranges::cartesian\_product\_viewviews::cartesian\_product](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/cartesian_product_view&action=edit&redlink=1 "cpp/ranges/cartesian product view (page does not exist)") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of tuples of results calculated by the n-ary cartesian product of the adapted views (class template) (customization point object) | ### Range adaptors | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | --- | | Defined in namespace `std::ranges` | | [views::all\_tviews::all](ranges/all_view "cpp/ranges/all view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") that includes all elements of a [`range`](ranges/range "cpp/ranges/range") (alias template) (range adaptor object) | | [ranges::ref\_view](ranges/ref_view "cpp/ranges/ref view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") of the elements of some other [`range`](ranges/range "cpp/ranges/range") 
(class template) | | [ranges::owning\_view](ranges/owning_view "cpp/ranges/owning view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") with unique ownership of some [`range`](ranges/range "cpp/ranges/range") (class template) | | [ranges::filter\_viewviews::filter](ranges/filter_view "cpp/ranges/filter view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") that consists of the elements of a [`range`](ranges/range "cpp/ranges/range") that satisfies a predicate (class template) (range adaptor object) | | [ranges::transform\_viewviews::transform](ranges/transform_view "cpp/ranges/transform view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") of a sequence that applies a transformation function to each element (class template) (range adaptor object) | | [ranges::take\_viewviews::take](ranges/take_view "cpp/ranges/take view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") consisting of the first N elements of another [`view`](ranges/view "cpp/ranges/view") (class template) (range adaptor object) | | [ranges::take\_while\_viewviews::take\_while](ranges/take_while_view "cpp/ranges/take while view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") consisting of the initial elements of another [`view`](ranges/view "cpp/ranges/view"), until the first element on which a predicate returns false (class template) (range adaptor object) | | [ranges::drop\_viewviews::drop](ranges/drop_view "cpp/ranges/drop view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") consisting of elements of another [`view`](ranges/view "cpp/ranges/view"), skipping the first N elements (class template) (range adaptor object) | | [ranges::drop\_while\_viewviews::drop\_while](ranges/drop_while_view "cpp/ranges/drop while view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") consisting of the elements of another [`view`](ranges/view "cpp/ranges/view"), skipping the initial subsequence of elements until the first element where the predicate returns false (class template) (range adaptor object) | | [ranges::join\_viewviews::join](ranges/join_view "cpp/ranges/join view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") consisting of the sequence obtained from flattening a [`view`](ranges/view "cpp/ranges/view") of [`range`s](ranges/range "cpp/ranges/range") (class template) (range adaptor object) | | [ranges::split\_viewviews::split](ranges/split_view "cpp/ranges/split view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") over the subranges obtained from splitting another [`view`](ranges/view "cpp/ranges/view") using a delimiter (class template) (range adaptor object) | | [ranges::lazy\_split\_viewviews::lazy\_split](ranges/lazy_split_view "cpp/ranges/lazy split view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") over the subranges obtained from splitting another [`view`](ranges/view "cpp/ranges/view") using a delimiter (class template) (range adaptor object) | | [views::counted](ranges/view_counted "cpp/ranges/view counted") (C++20) | creates a subrange from an iterator and a count (customization point object) | | [ranges::common\_viewviews::common](ranges/common_view "cpp/ranges/common view") (C++20) | converts a [`view`](ranges/view "cpp/ranges/view") into a [`common_range`](ranges/common_range "cpp/ranges/common range") (class template) (range adaptor object) | | [ranges::reverse\_viewviews::reverse](ranges/reverse_view "cpp/ranges/reverse view") (C++20) | a [`view`](ranges/view "cpp/ranges/view") that iterates over the elements of another bidirectional view in reverse order (class 
template) (range adaptor object) | | [ranges::elements\_viewviews::elements](ranges/elements_view "cpp/ranges/elements view") (C++20) | takes a [`view`](ranges/view "cpp/ranges/view") consisting of tuple-like values and a number N and produces a [`view`](ranges/view "cpp/ranges/view") of N'th element of each tuple (class template) (range adaptor object) | | [ranges::keys\_viewviews::keys](ranges/keys_view "cpp/ranges/keys view") (C++20) | takes a [`view`](ranges/view "cpp/ranges/view") consisting of pair-like values and produces a [`view`](ranges/view "cpp/ranges/view") of the first elements of each pair (class template) (range adaptor object) | | [ranges::values\_viewviews::values](ranges/values_view "cpp/ranges/values view") (C++20) | takes a [`view`](ranges/view "cpp/ranges/view") consisting of pair-like values and produces a [`view`](ranges/view "cpp/ranges/view") of the second elements of each pair (class template) (range adaptor object) | | [ranges::zip\_viewviews::zip](ranges/zip_view "cpp/ranges/zip view") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of tuples of references to corresponding elements of the adapted views (class template) (customization point object) | | [ranges::zip\_transform\_viewviews::zip\_transform](ranges/zip_transform_view "cpp/ranges/zip transform view") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of tuples of results of application of a transformation function to corresponding elements of the adapted views (class template) (customization point object) | | [ranges::adjacent\_viewviews::adjacent](ranges/adjacent_view "cpp/ranges/adjacent view") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of tuples of references to adjacent elements of the adapted view (class template) (range adaptor object) | | [ranges::adjacent\_transform\_viewviews::adjacent\_transform](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_transform_view&action=edit&redlink=1 "cpp/ranges/adjacent transform view (page does not exist)") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of tuples of results of application of a transformation function to adjacent elements of the adapted view (class template) (range adaptor object) | | [ranges::join\_with\_viewviews::join\_with](ranges/join_with_view "cpp/ranges/join with view") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of the sequence obtained from flattening a view of ranges, with the delimiter in between elements (class template) (range adaptor object) | | [ranges::slide\_viewviews::slide](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/slide_view&action=edit&redlink=1 "cpp/ranges/slide view (page does not exist)") (C++23) | a [`view`](ranges/view "cpp/ranges/view") whose Mth element is a [`view`](ranges/view "cpp/ranges/view") over the Mth through (M + N - 1)th elements of another [`view`](ranges/view "cpp/ranges/view") (class template) (range adaptor object) | | [ranges::chunk\_viewviews::chunk](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/chunk_view&action=edit&redlink=1 "cpp/ranges/chunk view (page does not exist)") (C++23) | a range of [`view`s](ranges/view "cpp/ranges/view") that are `N`-sized non-overlapping successive chunks of the elements of another [`view`](ranges/view "cpp/ranges/view") (class template) (range adaptor object) | | [ranges::chunk\_by\_viewviews::chunk\_by](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/chunk_by_view&action=edit&redlink=1 "cpp/ranges/chunk by view (page 
does not exist)") (C++23) | splits the [`view`](ranges/view "cpp/ranges/view") into subranges between each pair of adjacent elements for which the given predicate returns `false` (class template) (range adaptor object) | | [ranges::as\_const\_viewviews::as\_const](ranges/as_const_view "cpp/ranges/as const view") (C++23) | converts a [`view`](ranges/view "cpp/ranges/view") into a [`constant_range`](ranges/constant_range "cpp/ranges/constant range") (class template) (range adaptor object) | | [ranges::as\_rvalue\_viewviews::as\_rvalue](ranges/as_rvalue_view "cpp/ranges/as rvalue view") (C++23) | a [`view`](ranges/view "cpp/ranges/view") of a sequence that casts each element to an rvalue (class template) (range adaptor object) | | [ranges::stride\_viewviews::stride](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/stride_view&action=edit&redlink=1 "cpp/ranges/stride view (page does not exist)") (C++23) | a [`view`](ranges/view "cpp/ranges/view") consisting of elements of another [`view`](ranges/view "cpp/ranges/view"), advancing over N elements at a time (class template) (range adaptor object) | Some range adaptors wrap their elements or function objects with the [*copyable wrapper*](ranges/copyable_wrapper "cpp/ranges/copyable wrapper"). #### Range adaptor closure objects *Range adaptor closure objects* are objects whose type is the same as one of the following objects (ignoring cv-qualification): * unary range adaptor objects, | | | | --- | --- | | * objects of user-defined types other than unary range adaptor objects who meet the requirements of implementing a range adaptor closure object, | (since C++23) | * the results of binding trailing arguments by range adaptor objects, and * the results of chaining two range adaptor closure objects by `operator|`. Range adaptor closure objects take one [`viewable_range`](ranges/viewable_range "cpp/ranges/viewable range") (until C++23) [`range`](ranges/range "cpp/ranges/range") (since C++23) as its only argument and returns a [`view`](ranges/view "cpp/ranges/view") (until C++23). They are callable via the pipe operator: if `C` is a range adaptor closure object and `R` is a [`viewable_range`](ranges/viewable_range "cpp/ranges/viewable range") (until C++23) [`range`](ranges/range "cpp/ranges/range") (since C++23), these two expressions are equivalent (both well-formed or ill-formed): ``` C(R) R | C ``` This call forwards the bound arguments (if any) to the associated range adaptor object. The bound arguments (if any) are identically treated as lvalue or rvalue and cv-qualified to `C`. Two range adaptor closure objects can be chained by `operator|` to produce another range adaptor closure object: if `C` and `D` are range adaptor closure objects, then `C | D` is also a range adaptor closure object if it is valid. The bound arguments of `C | D` is determined as follows: * there is a subobject in the result object of the same type (cv-qualification discarded) for every subobject in both operands that is a bound argument, * such a bound argument is direct-non-list-initialized with the source subobject in its containing operand, where the source is identically treated as lvalue or rvalue and cv-qualified to the operand, * the result is valid if and only if the initialization of all bound arguments are valid. 
The effect and validity of the `operator()` of the result is determined as follows: given a [`viewable_range`](ranges/viewable_range "cpp/ranges/viewable range") (until C++23) [`range`](ranges/range "cpp/ranges/range") (since C++23) `R`, these two expressions are equivalent (both well-formed or ill-formed):

```
R | C | D // (R | C) | D
R | (C | D)
```

Notes: `operator()` is unsupported for volatile-qualified or const-volatile-qualified versions of range adaptor closure object types.

Let `t` be an object of type `T`; then `t` is a range adaptor closure object if all of the following requirements are met (since C++23):

* `t` is a unary function object that takes one [`range`](ranges/range "cpp/ranges/range") argument.
* `T` has exactly one public base class `ranges::range_adaptor_closure<T>`, and `T` has no base classes of type `ranges::range_adaptor_closure<U>` for any other type `U`.
* `T` does not satisfy [`range`](ranges/range "cpp/ranges/range").

#### Range adaptor objects

*Range adaptor objects* are customization point objects that accept a [`viewable_range`](ranges/viewable_range "cpp/ranges/viewable range") as their first argument and return a [`view`](ranges/view "cpp/ranges/view"). Some range adaptor objects are unary, i.e. they take one [`viewable_range`](ranges/viewable_range "cpp/ranges/viewable range") as their only argument. Other range adaptor objects take a [`viewable_range`](ranges/viewable_range "cpp/ranges/viewable range") and other trailing arguments.

If a range adaptor object takes more than one argument, it also supports partial application: let

* `a` be such a range adaptor object, and
* `args...` be arguments (generally suitable for trailing arguments),

then the expression `a(args...)` has the following properties:

* it is valid if and only if for every argument `e` in `args...` such that `E` is `decltype((e))`, `[std::is\_constructible\_v](http://en.cppreference.com/w/cpp/types/is_constructible)<[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<E>, E>` is `true`,
* when the call is valid, its result object stores a subobject of type `[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<E>` direct-non-list-initialized with `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<E>(e)`, for every argument `e` in `args...` (in other words, range adaptor objects bind arguments by value), and
* the result object is a [range adaptor closure object](#Range_adaptor_closure_objects).

Like other customization point objects, let

* `a` be an object of the cv-unqualified version of the type of any range adaptor object, and
* `args...` be any group of arguments that satisfies the constraints of the `operator()` of the type of `a`,

then calls to

* `a(args...)`,
* `[std::as\_const](http://en.cppreference.com/w/cpp/utility/as_const)(a)(args...)`,
* `std::move(a)(args...)`, and
* `std::move([std::as\_const](http://en.cppreference.com/w/cpp/utility/as_const)(a))(args...)`

are all equivalent. The result object of each of these expressions is either a [`view`](ranges/view "cpp/ranges/view") object or a range adaptor closure object.

Notes: `operator()` is unsupported for volatile-qualified or const-volatile-qualified versions of range adaptor object types. Arrays and functions are converted to pointers while binding. A short sketch of partial application and closure chaining is given after the defect reports at the end of this page.

### Helper concepts

The following exposition-only concepts are used for several types, but they are not part of the interface of the standard library.
| | | | | --- | --- | --- | | ``` template<class R> concept __SimpleView = // exposition only ranges::view<R> && ranges::range<const R> && std::same_as<std::ranges::iterator_t<R>, std::ranges::iterator_t<const R>> && std::same_as<std::ranges::sentinel_t<R>, std::ranges::sentinel_t<const R>>; ``` | | | ### Notes | [Feature-test](utility/feature_test "cpp/utility/feature test") macro | Value | Std | | --- | --- | --- | | [`__cpp_lib_ranges`](feature_test#Library_features "cpp/feature test") | `201911L` | (C++20) | | [`__cpp_lib_ranges`](feature_test#Library_features "cpp/feature test") | `202106L` | (C++20) | | [`__cpp_lib_ranges`](feature_test#Library_features "cpp/feature test") | `202110L` | (C++20) | | [`__cpp_lib_ranges`](feature_test#Library_features "cpp/feature test") | `202202L` | (C++23) | | [`__cpp_lib_ranges_as_const`](feature_test#Library_features "cpp/feature test") | `202207L` | (C++23) | | [`__cpp_lib_ranges_as_rvalue`](feature_test#Library_features "cpp/feature test") | `202207L` | (C++23) | | [`__cpp_lib_ranges_cartesian_product`](feature_test#Library_features "cpp/feature test") | `202207L` | (C++23) | | [`__cpp_lib_ranges_chunk`](feature_test#Library_features "cpp/feature test") | `202202L` | (C++23) | | [`__cpp_lib_ranges_chunk_by`](feature_test#Library_features "cpp/feature test") | `202202L` | (C++23) | | [`__cpp_lib_ranges_join_with`](feature_test#Library_features "cpp/feature test") | `202202L` | (C++23) | | [`__cpp_lib_ranges_repeat`](feature_test#Library_features "cpp/feature test") | `202207L` | (C++23) | | [`__cpp_lib_ranges_slide`](feature_test#Library_features "cpp/feature test") | `202202L` | (C++23) | | [`__cpp_lib_ranges_stride`](feature_test#Library_features "cpp/feature test") | `202207L` | (C++23) | | [`__cpp_lib_ranges_to_container`](feature_test#Library_features "cpp/feature test") | `202202L` | (C++23) | | [`__cpp_lib_ranges_zip`](feature_test#Library_features "cpp/feature test") | `202110L` | (C++23) | ### Example ``` #include <ranges> #include <iostream> int main() { auto const ints = {0,1,2,3,4,5}; auto even = [](int i) { return 0 == i % 2; }; auto square = [](int i) { return i * i; }; // "pipe" syntax of composing the views: for (int i : ints | std::views::filter(even) | std::views::transform(square)) { std::cout << i << ' '; } std::cout << '\n'; // a traditional "functional" composing syntax: for (int i : std::views::transform(std::views::filter(ints, even), square)) { std::cout << i << ' '; } } ``` Output: ``` 0 4 16 0 4 16 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3509](https://cplusplus.github.io/LWG/issue3509) | C++20 | it was unclear how range adaptor objects bound trailing arguments | they are bound by value |
cpp Localization library Localization library
====================

The locale facility includes internationalization support for character classification and string collation, numeric, monetary, and date/time formatting and parsing, and message retrieval. Locale settings control the behavior of stream I/O, regular expression library, and other components of the C++ standard library.

### Locales

| Defined in header `[<locale>](header/locale "cpp/header/locale")` | | --- | | Locales and facets | | [locale](locale/locale "cpp/locale/locale") | set of polymorphic facets that encapsulate cultural differences (class) | | [use\_facet](locale/use_facet "cpp/locale/use facet") | obtains a facet from a locale (function template) | | [has\_facet](locale/has_facet "cpp/locale/has facet") | checks if a locale implements a specific facet (function template) | | Character classification | | [isspace(std::locale)](locale/isspace "cpp/locale/isspace") | checks if a character is classified as whitespace by a locale (function template) | | [isblank(std::locale)](locale/isblank "cpp/locale/isblank") (C++11) | checks if a character is classified as a blank character by a locale (function template) | | [iscntrl(std::locale)](locale/iscntrl "cpp/locale/iscntrl") | checks if a character is classified as a control character by a locale (function template) | | [isupper(std::locale)](locale/isupper "cpp/locale/isupper") | checks if a character is classified as uppercase by a locale (function template) | | [islower(std::locale)](locale/islower "cpp/locale/islower") | checks if a character is classified as lowercase by a locale (function template) | | [isalpha(std::locale)](locale/isalpha "cpp/locale/isalpha") | checks if a character is classified as alphabetic by a locale (function template) | | [isdigit(std::locale)](locale/isdigit "cpp/locale/isdigit") | checks if a character is classified as a digit by a locale (function template) | | [ispunct(std::locale)](locale/ispunct "cpp/locale/ispunct") | checks if a character is classified as punctuation by a locale (function template) | | [isxdigit(std::locale)](locale/isxdigit "cpp/locale/isxdigit") | checks if a character is classified as a hexadecimal digit by a locale (function template) | | [isalnum(std::locale)](locale/isalnum "cpp/locale/isalnum") | checks if a character is classified as alphanumeric by a locale (function template) | | [isprint(std::locale)](locale/isprint "cpp/locale/isprint") | checks if a character is classified as printable by a locale (function template) | | [isgraph(std::locale)](locale/isgraph "cpp/locale/isgraph") | checks if a character is classified as graphical by a locale (function template) | | Character conversions | | [toupper(std::locale)](locale/toupper "cpp/locale/toupper") | converts a character to uppercase using the ctype facet of a locale (function template) | | [tolower(std::locale)](locale/tolower "cpp/locale/tolower") | converts a character to lowercase using the ctype facet of a locale (function template) | | String and stream conversions | | [wstring\_convert](locale/wstring_convert "cpp/locale/wstring convert") (C++11)(deprecated in C++17) | performs conversions between a wide string and a byte string (class template) | | [wbuffer\_convert](locale/wbuffer_convert "cpp/locale/wbuffer convert") (C++11)(deprecated in C++17) | performs conversion between a byte stream buffer and a wide stream buffer (class template) | | Facet category base classes | | [ctype\_base](locale/ctype_base "cpp/locale/ctype base") | defines 
character classification categories (class) | | [codecvt\_base](locale/codecvt_base "cpp/locale/codecvt base") | defines character conversion errors (class) | | [messages\_base](locale/messages_base "cpp/locale/messages base") | defines messages catalog type (class) | | [time\_base](locale/time_base "cpp/locale/time base") | defines date format constants (class) | | [money\_base](locale/money_base "cpp/locale/money base") | defines monetary formatting patterns (class) | | Facet categories | | [ctype](locale/ctype "cpp/locale/ctype") | defines character classification tables (class template) | | [ctype<char>](locale/ctype_char "cpp/locale/ctype char") | specialization of `[std::ctype](locale/ctype "cpp/locale/ctype")` for type `char` (class template specialization) | | [codecvt](locale/codecvt "cpp/locale/codecvt") | converts between character encodings, including UTF-8, UTF-16, UTF-32 (class template) | | [collate](locale/collate "cpp/locale/collate") | defines lexicographical comparison and hashing of strings (class template) | | [messages](locale/messages "cpp/locale/messages") | implements retrieval of strings from message catalogs (class template) | | [time\_get](locale/time_get "cpp/locale/time get") | parses time/date values from an input character sequence into `struct [std::tm](http://en.cppreference.com/w/cpp/chrono/c/tm)` (class template) | | [time\_put](locale/time_put "cpp/locale/time put") | formats contents of `struct [std::tm](http://en.cppreference.com/w/cpp/chrono/c/tm)` for output as character sequence (class template) | | [num\_get](locale/num_get "cpp/locale/num get") | parses numeric values from an input character sequence (class template) | | [num\_put](locale/num_put "cpp/locale/num put") | formats numeric values for output as character sequence (class template) | | [numpunct](locale/numpunct "cpp/locale/numpunct") | defines numeric punctuation rules (class template) | | [money\_get](locale/money_get "cpp/locale/money get") | parses and constructs a monetary value from an input character sequence (class template) | | [money\_put](locale/money_put "cpp/locale/money put") | formats a monetary value for output as a character sequence (class template) | | [moneypunct](locale/moneypunct "cpp/locale/moneypunct") | defines monetary formatting parameters used by `[std::money\_get](locale/money_get "cpp/locale/money get")` and `[std::money\_put](locale/money_put "cpp/locale/money put")` (class template) | | Locale-specific facet categories | | [ctype\_byname](locale/ctype_byname "cpp/locale/ctype byname") | represents the system-supplied `[std::ctype](locale/ctype "cpp/locale/ctype")` for the named locale (class template) | | [codecvt\_byname](locale/codecvt_byname "cpp/locale/codecvt byname") | represents the system-supplied `[std::codecvt](locale/codecvt "cpp/locale/codecvt")` for the named locale (class template) | | [messages\_byname](locale/messages_byname "cpp/locale/messages byname") | represents the system-supplied `[std::messages](locale/messages "cpp/locale/messages")` for the named locale (class template) | | [collate\_byname](locale/collate_byname "cpp/locale/collate byname") | represents the system-supplied `[std::collate](locale/collate "cpp/locale/collate")` for the named locale (class template) | | [time\_get\_byname](locale/time_get_byname "cpp/locale/time get byname") | represents the system-supplied `[std::time\_get](locale/time_get "cpp/locale/time get")` for the named locale (class template) | | [time\_put\_byname](locale/time_put_byname "cpp/locale/time 
put byname") | represents the system-supplied `[std::time\_put](locale/time_put "cpp/locale/time put")` for the named locale (class template) | | [numpunct\_byname](locale/numpunct_byname "cpp/locale/numpunct byname") | represents the system-supplied `[std::numpunct](locale/numpunct "cpp/locale/numpunct")` for the named locale (class template) | | [moneypunct\_byname](locale/moneypunct_byname "cpp/locale/moneypunct byname") | represents the system-supplied `[std::moneypunct](locale/moneypunct "cpp/locale/moneypunct")` for the named locale (class template) | ### Locale-independent unicode conversion facets | Defined in header `[<codecvt>](header/codecvt "cpp/header/codecvt")` | | --- | | [codecvt\_utf8](locale/codecvt_utf8 "cpp/locale/codecvt utf8") (C++11)(deprecated in C++17) | converts between UTF-8 and UCS2/UCS4 (class template) | | [codecvt\_utf16](locale/codecvt_utf16 "cpp/locale/codecvt utf16") (C++11)(deprecated in C++17) | converts between UTF-16 and UCS2/UCS4 (class template) | | [codecvt\_utf8\_utf16](locale/codecvt_utf8_utf16 "cpp/locale/codecvt utf8 utf16") (C++11)(deprecated in C++17) | converts between UTF-8 and UTF-16 (class template) | | [codecvt\_mode](locale/codecvt_mode "cpp/locale/codecvt mode") (C++11)(deprecated in C++17) | tags to alter behavior of the standard codecvt facets (enum) | ### C library locales | Defined in header `[<clocale>](header/clocale "cpp/header/clocale")` | | --- | | [setlocale](locale/setlocale "cpp/locale/setlocale") | gets and sets the current C locale (function) | | [LC\_ALLLC\_COLLATELC\_CTYPELC\_MONETARYLC\_NUMERICLC\_TIME](locale/lc_categories "cpp/locale/LC categories") | locale categories for `[std::setlocale](locale/setlocale "cpp/locale/setlocale")` (macro constant) | | [localeconv](locale/localeconv "cpp/locale/localeconv") | queries numeric and monetary formatting details of the current locale (function) | | [lconv](locale/lconv "cpp/locale/lconv") | formatting details, returned by `[std::localeconv](locale/localeconv "cpp/locale/localeconv")` (class) | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/locale "c/locale") for Localization support | cpp Iterator library Iterator library ================ The iterator library provides definitions for five (until C++17)six (since C++17) kinds of iterators as well as iterator traits, adaptors, and utility functions. ### Iterator categories There are five (until C++17)six (since C++17) kinds of iterators: [LegacyInputIterator](named_req/inputiterator "cpp/named req/InputIterator"), [LegacyOutputIterator](named_req/outputiterator "cpp/named req/OutputIterator"), [LegacyForwardIterator](named_req/forwarditerator "cpp/named req/ForwardIterator"), [LegacyBidirectionalIterator](named_req/bidirectionaliterator "cpp/named req/BidirectionalIterator"), [LegacyRandomAccessIterator](named_req/randomaccessiterator "cpp/named req/RandomAccessIterator"), and [LegacyContiguousIterator](named_req/contiguousiterator "cpp/named req/ContiguousIterator") (since C++17). Instead of being defined by specific types, each category of iterator is defined by the operations that can be performed on it. 
This definition means that any type that supports the necessary operations can be used as an iterator -- for example, a pointer supports all of the operations required by [LegacyRandomAccessIterator](named_req/randomaccessiterator "cpp/named req/RandomAccessIterator"), so a pointer can be used anywhere a [LegacyRandomAccessIterator](named_req/randomaccessiterator "cpp/named req/RandomAccessIterator") is expected. All of the iterator categories (except [LegacyOutputIterator](named_req/outputiterator "cpp/named req/OutputIterator")) can be organized into a hierarchy, where more powerful iterator categories (e.g. [LegacyRandomAccessIterator](named_req/randomaccessiterator "cpp/named req/RandomAccessIterator")) support the operations of less powerful categories (e.g. [LegacyInputIterator](named_req/inputiterator "cpp/named req/InputIterator")). If an iterator falls into one of these categories and also satisfies the requirements of [LegacyOutputIterator](named_req/outputiterator "cpp/named req/OutputIterator"), then it is called a *mutable* iterator and supports *both* input and output. Non-mutable iterators are called *constant* iterators.

| Iterator category | Defined operations |
| --- | --- |
| [LegacyInputIterator](named_req/inputiterator "cpp/named req/InputIterator") | read, increment (without multiple passes) |
| [LegacyForwardIterator](named_req/forwarditerator "cpp/named req/ForwardIterator") | the above, plus increment (with multiple passes) |
| [LegacyBidirectionalIterator](named_req/bidirectionaliterator "cpp/named req/BidirectionalIterator") | the above, plus decrement |
| [LegacyRandomAccessIterator](named_req/randomaccessiterator "cpp/named req/RandomAccessIterator") | the above, plus random access |
| [LegacyContiguousIterator](named_req/contiguousiterator "cpp/named req/ContiguousIterator") | the above, plus contiguous storage |
| [LegacyOutputIterator](named_req/outputiterator "cpp/named req/OutputIterator") | write, increment (without multiple passes) |

Iterators that fall into one of the above categories and also meet the requirements of [LegacyOutputIterator](named_req/outputiterator "cpp/named req/OutputIterator") are called mutable iterators.

Note: the [LegacyContiguousIterator](named_req/contiguousiterator "cpp/named req/ContiguousIterator") category was only formally specified in C++17, but the iterators of `[std::vector](container/vector "cpp/container/vector")`, `[std::basic\_string](string/basic_string "cpp/string/basic string")`, `[std::array](container/array "cpp/container/array")`, and `[std::valarray](numeric/valarray "cpp/numeric/valarray")`, as well as pointers into C arrays, are often treated as a separate category in pre-C++17 code.

### C++20 iterator concepts

C++20 introduces a new system of iterators based on [concepts](language/constraints "cpp/language/constraints") that are different from C++17 iterators. While the basic taxonomy remains similar, the requirements for individual iterator categories are somewhat different.
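Before the reference tables below, a minimal sketch of how these concepts are typically used: a function template constrained with `std::input_iterator` and `std::sentinel_for` accepts any conforming iterator/sentinel pair, regardless of which stronger category the iterator models. The helper name `print_all` is hypothetical.

```
#include <iostream>
#include <iterator>
#include <list>
#include <vector>

// Accepts any C++20 input iterator together with a matching sentinel.
template<std::input_iterator I, std::sentinel_for<I> S>
void print_all(I first, S last)
{
    for (; first != last; ++first)
        std::cout << *first << ' ';
    std::cout << '\n';
}

int main()
{
    std::vector<int> v{1, 2, 3};   // iterators model contiguous_iterator
    std::list<int>   l{4, 5, 6};   // iterators model bidirectional_iterator
    print_all(v.begin(), v.end()); // both satisfy input_iterator, so both calls compile
    print_all(l.begin(), l.end());
}
```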
| Defined in namespace `std` | | --- | | [indirectly\_readable](iterator/indirectly_readable "cpp/iterator/indirectly readable") (C++20) | specifies that a type is indirectly readable by applying operator `*` (concept) | | [indirectly\_writable](iterator/indirectly_writable "cpp/iterator/indirectly writable") (C++20) | specifies that a value can be written to an iterator's referenced object (concept) | | [weakly\_incrementable](iterator/weakly_incrementable "cpp/iterator/weakly incrementable") (C++20) | specifies that a [`semiregular`](concepts/semiregular "cpp/concepts/semiregular") type can be incremented with pre- and post-increment operators (concept) | | [incrementable](iterator/incrementable "cpp/iterator/incrementable") (C++20) | specifies that the increment operation on a [`weakly_incrementable`](iterator/weakly_incrementable "cpp/iterator/weakly incrementable") type is equality-preserving and that the type is [`equality_comparable`](concepts/equality_comparable "cpp/concepts/equality comparable") (concept) | | [input\_or\_output\_iterator](iterator/input_or_output_iterator "cpp/iterator/input or output iterator") (C++20) | specifies that objects of a type can be incremented and dereferenced (concept) | | [sentinel\_for](iterator/sentinel_for "cpp/iterator/sentinel for") (C++20) | specifies a type is a sentinel for an [`input_or_output_iterator`](iterator/input_or_output_iterator "cpp/iterator/input or output iterator") type (concept) | | [sized\_sentinel\_for](iterator/sized_sentinel_for "cpp/iterator/sized sentinel for") (C++20) | specifies that the `-` operator can be applied to an iterator and a sentinel to calculate their difference in constant time (concept) | | [input\_iterator](iterator/input_iterator "cpp/iterator/input iterator") (C++20) | specifies that a type is an input iterator, that is, its referenced values can be read and it can be both pre- and post-incremented (concept) | | [output\_iterator](iterator/output_iterator "cpp/iterator/output iterator") (C++20) | specifies that a type is an output iterator for a given value type, that is, values of that type can be written to it and it can be both pre- and post-incremented (concept) | | [forward\_iterator](iterator/forward_iterator "cpp/iterator/forward iterator") (C++20) | specifies that an [`input_iterator`](iterator/input_iterator "cpp/iterator/input iterator") is a forward iterator, supporting equality comparison and multi-pass (concept) | | [bidirectional\_iterator](iterator/bidirectional_iterator "cpp/iterator/bidirectional iterator") (C++20) | specifies that a [`forward_iterator`](iterator/forward_iterator "cpp/iterator/forward iterator") is a bidirectional iterator, supporting movement backwards (concept) | | [random\_access\_iterator](iterator/random_access_iterator "cpp/iterator/random access iterator") (C++20) | specifies that a [`bidirectional_iterator`](iterator/bidirectional_iterator "cpp/iterator/bidirectional iterator") is a random-access iterator, supporting advancement in constant time and subscripting (concept) | | [contiguous\_iterator](iterator/contiguous_iterator "cpp/iterator/contiguous iterator") (C++20) | specifies that a [`random_access_iterator`](iterator/random_access_iterator "cpp/iterator/random access iterator") is a contiguous iterator, referring to elements that are contiguous in memory (concept) | ### Iterator associated types | Defined in namespace `std` | | --- | | [incrementable\_traits](iterator/incrementable_traits "cpp/iterator/incrementable traits") (C++20) | computes the 
difference type of a [`weakly_incrementable`](iterator/weakly_incrementable "cpp/iterator/weakly incrementable") type (class template) | | [indirectly\_readable\_traits](iterator/indirectly_readable_traits "cpp/iterator/indirectly readable traits") (C++20) | computes the value type of an [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") type (class template) | | [iter\_value\_t, iter\_reference\_t, iter\_const\_reference\_t, iter\_difference\_t, iter\_rvalue\_reference\_t, iter\_common\_reference\_t](iterator/iter_t "cpp/iterator/iter t") (C++20)(C++20)(C++23)(C++20)(C++20)(C++20) | computes the associated types of an iterator (alias template) | ### Iterator primitives | | | | --- | --- | | [iterator\_traits](iterator/iterator_traits "cpp/iterator/iterator traits") | provides uniform interface to the properties of an iterator (class template) | | [input\_iterator\_tag, output\_iterator\_tag, forward\_iterator\_tag, bidirectional\_iterator\_tag, random\_access\_iterator\_tag, contiguous\_iterator\_tag](iterator/iterator_tags "cpp/iterator/iterator tags") (C++20) | empty class types used to indicate iterator categories (class) | | [iterator](iterator/iterator "cpp/iterator/iterator") (deprecated in C++17) | base class to ease the definition of required types for simple iterators (class template) | ### Iterator customization points | Defined in namespace `std::ranges` | | --- | | [iter\_move](iterator/ranges/iter_move "cpp/iterator/ranges/iter move") (C++20) | casts the result of dereferencing an object to its associated rvalue reference type (customization point object) | | [iter\_swap](iterator/ranges/iter_swap "cpp/iterator/ranges/iter swap") (C++20) | swaps the values referenced by two dereferenceable objects (customization point object) | ### Algorithm concepts and utilities C++20 also provides a set of concepts and related utility templates designed to ease constraining common algorithm operations.
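As a rough, hedged sketch of that idea (the name `copy_all` is invented for this illustration, not a library facility), an algorithm can state its requirements directly in its signature using the concepts tabulated below:

```
#include <iterator>
#include <vector>

// A copy-like algorithm constrained with the iterator and algorithm concepts.
template<std::input_iterator I, std::sentinel_for<I> S, std::weakly_incrementable O>
    requires std::indirectly_copyable<I, O>
O copy_all(I first, S last, O out)
{
    for (; first != last; ++first, (void)++out)
        *out = *first;   // indirectly_copyable guarantees this assignment is valid
    return out;
}

int main()
{
    std::vector<int> src{1, 2, 3};
    std::vector<int> dst(3);
    copy_all(src.begin(), src.end(), dst.begin());
}
```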
| Defined in header `[<iterator>](header/iterator "cpp/header/iterator")` | | --- | | Defined in namespace `std` | | Indirect callable concepts | | [indirectly\_unary\_invocableindirectly\_regular\_unary\_invocable](iterator/indirectly_unary_invocable "cpp/iterator/indirectly unary invocable") (C++20)(C++20) | specifies that a callable type can be invoked with the result of dereferencing an [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") type (concept) | | [indirect\_unary\_predicate](iterator/indirect_unary_predicate "cpp/iterator/indirect unary predicate") (C++20) | specifies that a callable type, when invoked with the result of dereferencing an [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") type, satisfies [`predicate`](concepts/predicate "cpp/concepts/predicate") (concept) | | [indirect\_binary\_predicate](iterator/indirect_binary_predicate "cpp/iterator/indirect binary predicate") (C++20) | specifies that a callable type, when invoked with the result of dereferencing two [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") types, satisfies [`predicate`](concepts/predicate "cpp/concepts/predicate") (concept) | | [indirect\_equivalence\_relation](iterator/indirect_equivalence_relation "cpp/iterator/indirect equivalence relation") (C++20) | specifies that a callable type, when invoked with the result of dereferencing two [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") types, satisfies [`equivalence_relation`](concepts/equivalence_relation "cpp/concepts/equivalence relation") (concept) | | [indirect\_strict\_weak\_order](iterator/indirect_strict_weak_order "cpp/iterator/indirect strict weak order") (C++20) | specifies that a callable type, when invoked with the result of dereferencing two [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") types, satisfies [`strict_weak_order`](concepts/strict_weak_order "cpp/concepts/strict weak order") (concept) | | Common algorithm requirements | | [indirectly\_movable](iterator/indirectly_movable "cpp/iterator/indirectly movable") (C++20) | specifies that values may be moved from an [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") type to an [`indirectly_writable`](iterator/indirectly_writable "cpp/iterator/indirectly writable") type (concept) | | [indirectly\_movable\_storable](iterator/indirectly_movable_storable "cpp/iterator/indirectly movable storable") (C++20) | specifies that values may be moved from an [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") type to an [`indirectly_writable`](iterator/indirectly_writable "cpp/iterator/indirectly writable") type and that the move may be performed via an intermediate object (concept) | | [indirectly\_copyable](iterator/indirectly_copyable "cpp/iterator/indirectly copyable") (C++20) | specifies that values may be copied from an [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") type to an [`indirectly_writable`](iterator/indirectly_writable "cpp/iterator/indirectly writable") type (concept) | | [indirectly\_copyable\_storable](iterator/indirectly_copyable_storable "cpp/iterator/indirectly copyable storable") (C++20) | specifies that values may be copied from an [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") type to an 
[`indirectly_writable`](iterator/indirectly_writable "cpp/iterator/indirectly writable") type and that the copy may be performed via an intermediate object (concept) | | [indirectly\_swappable](iterator/indirectly_swappable "cpp/iterator/indirectly swappable") (C++20) | specifies that the values referenced by two [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") types can be swapped (concept) | | [indirectly\_comparable](iterator/indirectly_comparable "cpp/iterator/indirectly comparable") (C++20) | specifies that the values referenced by two [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") types can be compared (concept) | | [permutable](iterator/permutable "cpp/iterator/permutable") (C++20) | specifies the common requirements of algorithms that reorder elements in place (concept) | | [mergeable](iterator/mergeable "cpp/iterator/mergeable") (C++20) | specifies the requirements of algorithms that merge sorted sequences into an output sequence by copying elements (concept) | | [sortable](iterator/sortable "cpp/iterator/sortable") (C++20) | specifies the common requirements of algorithms that permute sequences into ordered sequences (concept) | | Utilities | | [indirect\_result\_t](iterator/indirect_result_t "cpp/iterator/indirect result t") (C++20) | computes the result of invoking a callable object on the result of dereferencing some set of [`indirectly_readable`](iterator/indirectly_readable "cpp/iterator/indirectly readable") types (alias template) | | [projected](iterator/projected "cpp/iterator/projected") (C++20) | helper template for specifying the constraints on algorithms that accept projections (class template) | ### Iterator adaptors | | | | --- | --- | | [reverse\_iterator](iterator/reverse_iterator "cpp/iterator/reverse iterator") | iterator adaptor for reverse-order traversal (class template) | | [make\_reverse\_iterator](iterator/make_reverse_iterator "cpp/iterator/make reverse iterator") (C++14) | creates a `[std::reverse\_iterator](iterator/reverse_iterator "cpp/iterator/reverse iterator")` of type inferred from the argument (function template) | | [move\_iterator](iterator/move_iterator "cpp/iterator/move iterator") (C++11) | iterator adaptor which dereferences to an rvalue reference (class template) | | [move\_sentinel](iterator/move_sentinel "cpp/iterator/move sentinel") (C++20) | sentinel adaptor for use with `[std::move\_iterator](iterator/move_iterator "cpp/iterator/move iterator")` (class template) | | [make\_move\_iterator](iterator/make_move_iterator "cpp/iterator/make move iterator") (C++11) | creates a `[std::move\_iterator](iterator/move_iterator "cpp/iterator/move iterator")` of type inferred from the argument (function template) | | [common\_iterator](iterator/common_iterator "cpp/iterator/common iterator") (C++20) | adapts an iterator type and its sentinel into a common iterator type (class template) | | [default\_sentinel\_t](iterator/default_sentinel_t "cpp/iterator/default sentinel t") (C++20) | default sentinel for use with iterators that know the bound of their range (class) | | [counted\_iterator](iterator/counted_iterator "cpp/iterator/counted iterator") (C++20) | iterator adaptor that tracks the distance to the end of the range (class template) | | [unreachable\_sentinel\_t](iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t") (C++20) | sentinel that always compares unequal to any [`weakly_incrementable`](iterator/weakly_incrementable "cpp/iterator/weakly 
incrementable") type (class) | | [back\_insert\_iterator](iterator/back_insert_iterator "cpp/iterator/back insert iterator") | iterator adaptor for insertion at the end of a container (class template) | | [back\_inserter](iterator/back_inserter "cpp/iterator/back inserter") | creates a `[std::back\_insert\_iterator](iterator/back_insert_iterator "cpp/iterator/back insert iterator")` of type inferred from the argument (function template) | | [front\_insert\_iterator](iterator/front_insert_iterator "cpp/iterator/front insert iterator") | iterator adaptor for insertion at the front of a container (class template) | | [front\_inserter](iterator/front_inserter "cpp/iterator/front inserter") | creates a `[std::front\_insert\_iterator](iterator/front_insert_iterator "cpp/iterator/front insert iterator")` of type inferred from the argument (function template) | | [insert\_iterator](iterator/insert_iterator "cpp/iterator/insert iterator") | iterator adaptor for insertion into a container (class template) | | [inserter](iterator/inserter "cpp/iterator/inserter") | creates a `[std::insert\_iterator](iterator/insert_iterator "cpp/iterator/insert iterator")` of type inferred from the argument (function template) | ### Stream iterators | | | | --- | --- | | [istream\_iterator](iterator/istream_iterator "cpp/iterator/istream iterator") | input iterator that reads from `[std::basic\_istream](io/basic_istream "cpp/io/basic istream")` (class template) | | [ostream\_iterator](iterator/ostream_iterator "cpp/iterator/ostream iterator") | output iterator that writes to `[std::basic\_ostream](io/basic_ostream "cpp/io/basic ostream")` (class template) | | [istreambuf\_iterator](iterator/istreambuf_iterator "cpp/iterator/istreambuf iterator") | input iterator that reads from `[std::basic\_streambuf](io/basic_streambuf "cpp/io/basic streambuf")` (class template) | | [ostreambuf\_iterator](iterator/ostreambuf_iterator "cpp/iterator/ostreambuf iterator") | output iterator that writes to `[std::basic\_streambuf](io/basic_streambuf "cpp/io/basic streambuf")` (class template) | ### Iterator operations | Defined in header `[<iterator>](header/iterator "cpp/header/iterator")` | | --- | | [advance](iterator/advance "cpp/iterator/advance") | advances an iterator by given distance (function template) | | [distance](iterator/distance "cpp/iterator/distance") | returns the distance between two iterators (function template) | | [next](iterator/next "cpp/iterator/next") (C++11) | increment an iterator (function template) | | [prev](iterator/prev "cpp/iterator/prev") (C++11) | decrement an iterator (function template) | | [ranges::advance](iterator/ranges/advance "cpp/iterator/ranges/advance") (C++20) | advances an iterator by given distance or to a given bound (niebloid) | | [ranges::distance](iterator/ranges/distance "cpp/iterator/ranges/distance") (C++20) | returns the distance between an iterator and a sentinel, or between the beginning and end of a range (niebloid) | | [ranges::next](iterator/ranges/next "cpp/iterator/ranges/next") (C++20) | increment an iterator by a given distance or to a bound (niebloid) | | [ranges::prev](iterator/ranges/prev "cpp/iterator/ranges/prev") (C++20) | decrement an iterator by a given distance or to a bound (niebloid) | ### Range access These non-member functions provide a generic interface for containers, plain arrays, and `[std::initializer\_list](utility/initializer_list "cpp/utility/initializer list")`. 
| Defined in header `[<array>](header/array "cpp/header/array")` | | --- | | Defined in header `[<deque>](header/deque "cpp/header/deque")` | | Defined in header `[<forward\_list>](header/forward_list "cpp/header/forward list")` | | Defined in header `[<iterator>](header/iterator "cpp/header/iterator")` | | Defined in header `[<list>](header/list "cpp/header/list")` | | Defined in header `[<map>](header/map "cpp/header/map")` | | Defined in header `[<regex>](header/regex "cpp/header/regex")` | | Defined in header `[<set>](header/set "cpp/header/set")` | | Defined in header `[<span>](header/span "cpp/header/span")` | | Defined in header `[<string>](header/string "cpp/header/string")` | | Defined in header `[<string\_view>](header/string_view "cpp/header/string view")` | | Defined in header `[<unordered\_map>](header/unordered_map "cpp/header/unordered map")` | | Defined in header `[<unordered\_set>](header/unordered_set "cpp/header/unordered set")` | | Defined in header `[<vector>](header/vector "cpp/header/vector")` | | Defined in namespace `std` | | [begincbegin](iterator/begin "cpp/iterator/begin") (C++11)(C++14) | returns an iterator to the beginning of a container or array (function template) | | [endcend](iterator/end "cpp/iterator/end") (C++11)(C++14) | returns an iterator to the end of a container or array (function template) | | [rbegincrbegin](iterator/rbegin "cpp/iterator/rbegin") (C++14) | returns a reverse iterator to the beginning of a container or array (function template) | | [rendcrend](iterator/rend "cpp/iterator/rend") (C++14) | returns a reverse end iterator for a container or array (function template) | | [sizessize](iterator/size "cpp/iterator/size") (C++17)(C++20) | returns the size of a container or array (function template) | | [empty](iterator/empty "cpp/iterator/empty") (C++17) | checks whether the container is empty (function template) | | [data](iterator/data "cpp/iterator/data") (C++17) | obtains the pointer to the underlying array (function template) |
cpp Concepts library (since C++20) Concepts library (since C++20) ============================== The concepts library provides definitions of fundamental library concepts that can be used to perform compile-time validation of template arguments and perform function dispatch based on properties of types. These concepts provide a foundation for equational reasoning in programs. Most concepts in the standard library impose both syntactic and semantic requirements. It is said that a standard concept is *satisfied* if its syntactic requirements are met, and is *modeled* if it is satisfied and its semantic requirements (if any) are also met. In general, only the syntactic requirements can be checked by the compiler. If the validity or meaning of a program depends whether a sequence of template arguments models a concept, and the concept is satisfied but not modeled, or if a semantic requirement is not met at the point of use, the program is ill-formed, [no diagnostic required](language/ndr "cpp/language/ndr"). | Defined in namespace `std` | | --- | | Core language concepts | | Defined in header `[<concepts>](header/concepts "cpp/header/concepts")` | | [same\_as](concepts/same_as "cpp/concepts/same as") (C++20) | specifies that a type is the same as another type (concept) | | [derived\_from](concepts/derived_from "cpp/concepts/derived from") (C++20) | specifies that a type is derived from another type (concept) | | [convertible\_to](concepts/convertible_to "cpp/concepts/convertible to") (C++20) | specifies that a type is implicitly convertible to another type (concept) | | [common\_reference\_with](concepts/common_reference_with "cpp/concepts/common reference with") (C++20) | specifies that two types share a common reference type (concept) | | [common\_with](concepts/common_with "cpp/concepts/common with") (C++20) | specifies that two types share a common type (concept) | | [integral](concepts/integral "cpp/concepts/integral") (C++20) | specifies that a type is an integral type (concept) | | [signed\_integral](concepts/signed_integral "cpp/concepts/signed integral") (C++20) | specifies that a type is an integral type that is signed (concept) | | [unsigned\_integral](concepts/unsigned_integral "cpp/concepts/unsigned integral") (C++20) | specifies that a type is an integral type that is unsigned (concept) | | [floating\_point](concepts/floating_point "cpp/concepts/floating point") (C++20) | specifies that a type is a floating-point type (concept) | | [assignable\_from](concepts/assignable_from "cpp/concepts/assignable from") (C++20) | specifies that a type is assignable from another type (concept) | | [swappableswappable\_with](concepts/swappable "cpp/concepts/swappable") (C++20) | specifies that a type can be swapped or that two types can be swapped with each other (concept) | | [destructible](concepts/destructible "cpp/concepts/destructible") (C++20) | specifies that an object of the type can be destroyed (concept) | | [constructible\_from](concepts/constructible_from "cpp/concepts/constructible from") (C++20) | specifies that a variable of the type can be constructed from or bound to a set of argument types (concept) | | [default\_initializable](concepts/default_initializable "cpp/concepts/default initializable") (C++20) | specifies that an object of a type can be default constructed (concept) | | [move\_constructible](concepts/move_constructible "cpp/concepts/move constructible") (C++20) | specifies that an object of a type can be move constructed (concept) | | 
[copy\_constructible](concepts/copy_constructible "cpp/concepts/copy constructible") (C++20) | specifies that an object of a type can be copy constructed and move constructed (concept) | | Comparison concepts | | Defined in header `[<concepts>](header/concepts "cpp/header/concepts")` | | [*boolean-testable*](concepts/boolean-testable "cpp/concepts/boolean-testable") (C++20) | specifies that a type can be used in Boolean contexts (exposition-only concept) | | [equality\_comparableequality\_comparable\_with](concepts/equality_comparable "cpp/concepts/equality comparable") (C++20) | specifies that operator `==` is an equivalence relation (concept) | | [totally\_orderedtotally\_ordered\_with](concepts/totally_ordered "cpp/concepts/totally ordered") (C++20) | specifies that the comparison operators on the type yield a total order (concept) | | Defined in header `[<compare>](header/compare "cpp/header/compare")` | | [three\_way\_comparablethree\_way\_comparable\_with](utility/compare/three_way_comparable "cpp/utility/compare/three way comparable") (C++20) | specifies that operator `<=>` produces consistent result on given types (concept) | | Object concepts | | Defined in header `[<concepts>](header/concepts "cpp/header/concepts")` | | [movable](concepts/movable "cpp/concepts/movable") (C++20) | specifies that an object of a type can be moved and swapped (concept) | | [copyable](concepts/copyable "cpp/concepts/copyable") (C++20) | specifies that an object of a type can be copied, moved, and swapped (concept) | | [semiregular](concepts/semiregular "cpp/concepts/semiregular") (C++20) | specifies that an object of a type can be copied, moved, swapped, and default constructed (concept) | | [regular](concepts/regular "cpp/concepts/regular") (C++20) | specifies that a type is regular, that is, it is both [`semiregular`](concepts/semiregular "cpp/concepts/semiregular") and [`equality_comparable`](concepts/equality_comparable "cpp/concepts/equality comparable") (concept) | | Callable concepts | | Defined in header `[<concepts>](header/concepts "cpp/header/concepts")` | | [invocableregular\_invocable](concepts/invocable "cpp/concepts/invocable") (C++20) | specifies that a callable type can be invoked with a given set of argument types (concept) | | [predicate](concepts/predicate "cpp/concepts/predicate") (C++20) | specifies that a callable type is a Boolean predicate (concept) | | [relation](concepts/relation "cpp/concepts/relation") (C++20) | specifies that a callable type is a binary relation (concept) | | [equivalence\_relation](concepts/equivalence_relation "cpp/concepts/equivalence relation") (C++20) | specifies that a [`relation`](concepts/relation "cpp/concepts/relation") imposes an equivalence relation (concept) | | [strict\_weak\_order](concepts/strict_weak_order "cpp/concepts/strict weak order") (C++20) | specifies that a [`relation`](concepts/relation "cpp/concepts/relation") imposes a strict weak ordering (concept) | Additional concepts can be found in [the iterators library](iterator#C.2B.2B20_iterator_concepts "cpp/iterator"), [the algorithms library](iterator#Algorithm_concepts_and_utilities "cpp/iterator"), and [the ranges library](ranges#Range_concepts "cpp/ranges"). ### See also * [Named Requirements](named_req "cpp/named req") cpp Comments Comments ======== Comments serve as a sort of in-code documentation. When inserted into a program, they are effectively ignored by the compiler; they are solely intended to be used as notes by the humans that read source code. 
Although specific documentation is not part of the C++ standard, several utilities exist that parse comments with different documentation formats.

### Syntax

| | | |
| --- | --- | --- |
| `/*` comment `*/` | (1) | |
| `//` comment | (2) | |

1) Often known as "C-style" or "multi-line" comments.
2) Often known as "C++-style" or "single-line" comments.

All comments are removed from the program at [translation phase 3](language/translation_phases "cpp/language/translation phases") by replacing each comment with a single whitespace character.

### C-style

C-style comments are usually used to comment large blocks of text, however, they can be used to comment single lines. To insert a C-style comment, simply surround text with `/*` and `*/`; this will cause the contents of the comment to be ignored by the compiler. Although it is not part of the C++ standard, `/**` and `*/` are often used to indicate documentation blocks; this is legal because the second asterisk is simply treated as part of the comment. C-style comments cannot be nested.

### C++-style

C++-style comments are usually used to comment single lines, however, multiple C++-style comments can be placed together to form multi-line comments. C++-style comments tell the compiler to ignore all content between `//` and a new line.

### Notes

Because comments [are removed](language/translation_phases "cpp/language/translation phases") before the preprocessor stage, a macro cannot be used to form a comment and an unterminated C-style comment doesn't spill over from an #include'd file. Besides commenting out, other mechanisms used for source code exclusion are:

```
#if 0
    std::cout << "this will not be executed or even compiled\n";
#endif
```

and:

```
if(false)
{
    std::cout << "this will not be executed\n";
}
```

### Example

```
#include <iostream>

/* C-style comments can contain
   multiple lines */
/* or just one */

/**************
 * you can insert any *, but
 * you can't make comments nested
 */

// C++-style comments can comment one line
// or, they can
// be strung together

int main()
{
    // comments are removed before preprocessing,
    // so ABC is "1", not "1//2134", and "1 hello world"
    // will be printed
#define ABC 1//2134
    std::cout << ABC << " hello world\n";

    // The below code won't be run
    // return 1;

    // The below code will be run
    return 0;
}
```

Output:

```
1 hello world
```

### See also

| |
| --- |
| [C documentation](https://en.cppreference.com/w/c/comment "c/comment") for comment |

cpp Dynamic memory management

Dynamic memory management
=========================

### Smart pointers

Smart pointers enable automatic, exception-safe, object lifetime management.
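As a brief, hedged illustration (the `Widget` type is invented for the example), ownership and lifetime follow the smart pointer rather than explicit `delete` calls:

```
#include <iostream>
#include <memory>

struct Widget
{
    ~Widget() { std::cout << "Widget destroyed\n"; }
};

int main()
{
    auto u = std::make_unique<Widget>();                  // sole owner
    std::shared_ptr<Widget> s = std::make_shared<Widget>();
    std::weak_ptr<Widget> w = s;                          // non-owning observer
    s.reset();                                            // last shared owner gone: "Widget destroyed"
    std::cout << std::boolalpha << w.expired() << '\n';   // true
}                                                         // u leaves scope: "Widget destroyed"
```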
| Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | Pointer categories | | [unique\_ptr](memory/unique_ptr "cpp/memory/unique ptr") (C++11) | smart pointer with unique object ownership semantics (class template) | | [shared\_ptr](memory/shared_ptr "cpp/memory/shared ptr") (C++11) | smart pointer with shared object ownership semantics (class template) | | [weak\_ptr](memory/weak_ptr "cpp/memory/weak ptr") (C++11) | weak reference to an object managed by `[std::shared\_ptr](memory/shared_ptr "cpp/memory/shared ptr")` (class template) | | [auto\_ptr](memory/auto_ptr "cpp/memory/auto ptr") (deprecated in C++11)(removed in C++17) | smart pointer with strict object ownership semantics (class template) | | Helper classes | | [owner\_less](memory/owner_less "cpp/memory/owner less") (C++11) | provides mixed-type owner-based ordering of shared and weak pointers (class template) | | [enable\_shared\_from\_this](memory/enable_shared_from_this "cpp/memory/enable shared from this") (C++11) | allows an object to create a `shared_ptr` referring to itself (class template) | | [bad\_weak\_ptr](memory/bad_weak_ptr "cpp/memory/bad weak ptr") (C++11) | exception thrown when accessing a `weak_ptr` which refers to already destroyed object (class) | | [default\_delete](memory/default_delete "cpp/memory/default delete") (C++11) | default deleter for `[unique\_ptr](memory/unique_ptr "cpp/memory/unique ptr")` (class template) | | Smart pointer adaptors | | [out\_ptr\_t](memory/out_ptr_t "cpp/memory/out ptr t") (C++23) | interoperates with foreign pointer setters and resets a smart pointer on destruction (class template) | | [out\_ptr](memory/out_ptr_t/out_ptr "cpp/memory/out ptr t/out ptr") (C++23) | creates an `out_ptr_t` with an associated smart pointer and resetting arguments (function template) | | [inout\_ptr\_t](memory/inout_ptr_t "cpp/memory/inout ptr t") (C++23) | interoperates with foreign pointer setters, obtains the initial pointer value from a smart pointer, and resets it on destruction (class template) | | [inout\_ptr](memory/inout_ptr_t/inout_ptr "cpp/memory/inout ptr t/inout ptr") (C++23) | creates an `inout_ptr_t` with an associated smart pointer and resetting arguments (function template) | ### Allocators Allocators are class templates encapsulating memory allocation strategy. This allows generic containers to decouple memory management from the data itself. 
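A minimal sketch of the allocate/construct/destroy/deallocate protocol as seen through `std::allocator_traits`; the choice of `std::string` as the element type is arbitrary:

```
#include <memory>
#include <string>

int main()
{
    std::allocator<std::string> alloc;
    using traits = std::allocator_traits<decltype(alloc)>;

    std::string* p = traits::allocate(alloc, 1);      // raw storage for one string
    traits::construct(alloc, p, "hello allocator");   // construct the object in place
    traits::destroy(alloc, p);                        // run the destructor
    traits::deallocate(alloc, p, 1);                  // release the storage
}
```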
| Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | [allocator](memory/allocator "cpp/memory/allocator") | the default allocator (class template) | | [allocator\_traits](memory/allocator_traits "cpp/memory/allocator traits") (C++11) | provides information about allocator types (class template) | | [allocation\_result](memory/allocation_result "cpp/memory/allocation result") (C++23) | records the address and the actual size of storage allocated by `allocate_at_least` (class template) | | [allocate\_at\_least](memory/allocate_at_least "cpp/memory/allocate at least") (C++23) | allocates storage at least as large as the requested size via an allocator (function template) | | [allocator\_arg](memory/allocator_arg "cpp/memory/allocator arg") (C++11) | an object of type `[std::allocator\_arg\_t](memory/allocator_arg_t "cpp/memory/allocator arg t")` used to select allocator-aware constructors (constant) | | [uses\_allocator](memory/uses_allocator "cpp/memory/uses allocator") (C++11) | checks if the specified type supports uses-allocator construction (class template) | | [uses\_allocator\_construction\_args](memory/uses_allocator_construction_args "cpp/memory/uses allocator construction args") (C++20) | prepares the argument list matching the flavor of uses-allocator construction required by the given type (function template) | | [make\_obj\_using\_allocator](memory/make_obj_using_allocator "cpp/memory/make obj using allocator") (C++20) | creates an object of the given type by means of uses-allocator construction (function template) | | [uninitialized\_construct\_using\_allocator](memory/uninitialized_construct_using_allocator "cpp/memory/uninitialized construct using allocator") (C++20) | creates an object of the given type at specified memory location by means of uses-allocator construction (function template) | | Defined in header `[<scoped\_allocator>](header/scoped_allocator "cpp/header/scoped allocator")` | | [scoped\_allocator\_adaptor](memory/scoped_allocator_adaptor "cpp/memory/scoped allocator adaptor") (C++11) | implements multi-level allocator for multi-level containers (class template) | | Defined in header `[<memory\_resource>](header/memory_resource "cpp/header/memory resource")` | | Defined in namespace `std::pmr` | | [polymorphic\_allocator](memory/polymorphic_allocator "cpp/memory/polymorphic allocator") (C++17) | an allocator that supports run-time polymorphism based on the `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` it is constructed with (class template) | ### Memory resources (since C++17) Memory resources implement memory allocation strategies that can be used by `[std::pmr::polymorphic\_allocator](memory/polymorphic_allocator "cpp/memory/polymorphic allocator")`. 
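For illustration, a small, hedged sketch (the buffer size is chosen arbitrarily) of plugging a memory resource into a `std::pmr` container via `std::pmr::polymorphic_allocator`:

```
#include <memory_resource>
#include <vector>

int main()
{
    // A monotonic buffer resource hands out memory from a local buffer and
    // releases everything at once when the resource is destroyed.
    char buffer[1024];
    std::pmr::monotonic_buffer_resource pool{buffer, sizeof buffer};

    std::pmr::vector<int> v{&pool};   // uses std::pmr::polymorphic_allocator<int>
    for (int i = 0; i < 10; ++i)
        v.push_back(i);
}
```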
| Defined in header `[<memory\_resource>](header/memory_resource "cpp/header/memory resource")` | | --- | | Defined in namespace `std::pmr` | | [memory\_resource](memory/memory_resource "cpp/memory/memory resource") (C++17) | an abstract interface for classes that encapsulate memory resources (class) | | [new\_delete\_resource](memory/new_delete_resource "cpp/memory/new delete resource") (C++17) | returns a static program-wide `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` that uses the global `[operator new](memory/new/operator_new "cpp/memory/new/operator new")` and `[operator delete](memory/new/operator_delete "cpp/memory/new/operator delete")` to allocate and deallocate memory (function) | | [null\_memory\_resource](memory/null_memory_resource "cpp/memory/null memory resource") (C++17) | returns a static `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` that performs no allocation (function) | | [get\_default\_resource](memory/get_default_resource "cpp/memory/get default resource") (C++17) | gets the default `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` (function) | | [set\_default\_resource](memory/set_default_resource "cpp/memory/set default resource") (C++17) | sets the default `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` (function) | | [pool\_options](memory/pool_options "cpp/memory/pool options") (C++17) | a set of constructor options for pool resources (class) | | [synchronized\_pool\_resource](memory/synchronized_pool_resource "cpp/memory/synchronized pool resource") (C++17) | a thread-safe `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` for managing allocations in pools of different block sizes (class) | | [unsynchronized\_pool\_resource](memory/unsynchronized_pool_resource "cpp/memory/unsynchronized pool resource") (C++17) | a thread-unsafe `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` for managing allocations in pools of different block sizes (class) | | [monotonic\_buffer\_resource](memory/monotonic_buffer_resource "cpp/memory/monotonic buffer resource") (C++17) | a special-purpose `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` that releases the allocated memory only when the resource is destroyed (class) | ### Uninitialized storage Several utilities are provided to create and access raw storage. 
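As a hedged sketch that combines a local raw buffer with the uninitialized memory algorithms listed further below (the element count of three is arbitrary):

```
#include <memory>
#include <string>

int main()
{
    // Raw, suitably aligned storage for three strings; no constructors have run yet.
    alignas(std::string) unsigned char raw[3 * sizeof(std::string)];
    std::string* first = reinterpret_cast<std::string*>(raw);

    std::uninitialized_fill_n(first, 3, std::string("example")); // placement-constructs three copies
    std::destroy_n(first, 3);                                    // runs the three destructors (C++17)
}
```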
| Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | [raw\_storage\_iterator](memory/raw_storage_iterator "cpp/memory/raw storage iterator") (deprecated in C++17)(removed in C++20) | an iterator that allows standard algorithms to store results in uninitialized memory (class template) | | [get\_temporary\_buffer](memory/get_temporary_buffer "cpp/memory/get temporary buffer") (deprecated in C++17)(removed in C++20) | obtains uninitialized storage (function template) | | [return\_temporary\_buffer](memory/return_temporary_buffer "cpp/memory/return temporary buffer") (deprecated in C++17)(removed in C++20) | frees uninitialized storage (function template) | ### Uninitialized memory algorithms | Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | [uninitialized\_copy](memory/uninitialized_copy "cpp/memory/uninitialized copy") | copies a range of objects to an uninitialized area of memory (function template) | | [uninitialized\_copy\_n](memory/uninitialized_copy_n "cpp/memory/uninitialized copy n") (C++11) | copies a number of objects to an uninitialized area of memory (function template) | | [uninitialized\_fill](memory/uninitialized_fill "cpp/memory/uninitialized fill") | copies an object to an uninitialized area of memory, defined by a range (function template) | | [uninitialized\_fill\_n](memory/uninitialized_fill_n "cpp/memory/uninitialized fill n") | copies an object to an uninitialized area of memory, defined by a start and a count (function template) | | [uninitialized\_move](memory/uninitialized_move "cpp/memory/uninitialized move") (C++17) | moves a range of objects to an uninitialized area of memory (function template) | | [uninitialized\_move\_n](memory/uninitialized_move_n "cpp/memory/uninitialized move n") (C++17) | moves a number of objects to an uninitialized area of memory (function template) | | [uninitialized\_default\_construct](memory/uninitialized_default_construct "cpp/memory/uninitialized default construct") (C++17) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a range (function template) | | [uninitialized\_default\_construct\_n](memory/uninitialized_default_construct_n "cpp/memory/uninitialized default construct n") (C++17) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a start and a count (function template) | | [uninitialized\_value\_construct](memory/uninitialized_value_construct "cpp/memory/uninitialized value construct") (C++17) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a range (function template) | | [uninitialized\_value\_construct\_n](memory/uninitialized_value_construct_n "cpp/memory/uninitialized value construct n") (C++17) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a start and a count (function template) | | [destroy\_at](memory/destroy_at "cpp/memory/destroy at") (C++17) | destroys an object at a given address (function template) | | [destroy](memory/destroy "cpp/memory/destroy") (C++17) | destroys a range of objects (function template) | | [destroy\_n](memory/destroy_n "cpp/memory/destroy n") (C++17) | destroys a number of objects in a 
range (function template) | | [construct\_at](memory/construct_at "cpp/memory/construct at") (C++20) | creates an object at a given address (function template) | ### Constrained uninitialized memory algorithms (since C++20) C++20 provides [constrained](language/constraints "cpp/language/constraints") uninitialized memory algorithms that accept range arguments or iterator-sentinel pairs. | Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | Defined in namespace `std::ranges` | | [*no-throw-input-iteratorno-throw-forward-iteratorno-throw-sentinel-forno-throw-input-rangeno-throw-forward-range*](memory/ranges/nothrow_concepts "cpp/memory/ranges/nothrow concepts") (C++20) | Specifies some operations on iterators, sentinels and ranges are non-throwing (exposition-only concept) | | [ranges::uninitialized\_copy](memory/ranges/uninitialized_copy "cpp/memory/ranges/uninitialized copy") (C++20) | copies a range of objects to an uninitialized area of memory (niebloid) | | [ranges::uninitialized\_copy\_n](memory/ranges/uninitialized_copy_n "cpp/memory/ranges/uninitialized copy n") (C++20) | copies a number of objects to an uninitialized area of memory (niebloid) | | [ranges::uninitialized\_fill](memory/ranges/uninitialized_fill "cpp/memory/ranges/uninitialized fill") (C++20) | copies an object to an uninitialized area of memory, defined by a range (niebloid) | | [ranges::uninitialized\_fill\_n](memory/ranges/uninitialized_fill_n "cpp/memory/ranges/uninitialized fill n") (C++20) | copies an object to an uninitialized area of memory, defined by a start and a count (niebloid) | | [ranges::uninitialized\_move](memory/ranges/uninitialized_move "cpp/memory/ranges/uninitialized move") (C++20) | moves a range of objects to an uninitialized area of memory (niebloid) | | [ranges::uninitialized\_move\_n](memory/ranges/uninitialized_move_n "cpp/memory/ranges/uninitialized move n") (C++20) | moves a number of objects to an uninitialized area of memory (niebloid) | | [ranges::uninitialized\_default\_construct](memory/ranges/uninitialized_default_construct "cpp/memory/ranges/uninitialized default construct") (C++20) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a range (niebloid) | | [ranges::uninitialized\_default\_construct\_n](memory/ranges/uninitialized_default_construct_n "cpp/memory/ranges/uninitialized default construct n") (C++20) | constructs objects by [default-initialization](language/default_initialization "cpp/language/default initialization") in an uninitialized area of memory, defined by a start and count (niebloid) | | [ranges::uninitialized\_value\_construct](memory/ranges/uninitialized_value_construct "cpp/memory/ranges/uninitialized value construct") (C++20) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a range (niebloid) | | [ranges::uninitialized\_value\_construct\_n](memory/ranges/uninitialized_value_construct_n "cpp/memory/ranges/uninitialized value construct n") (C++20) | constructs objects by [value-initialization](language/value_initialization "cpp/language/value initialization") in an uninitialized area of memory, defined by a start and a count (niebloid) | | [ranges::destroy\_at](memory/ranges/destroy_at "cpp/memory/ranges/destroy at") (C++20) | destroys an object at a given address (niebloid) | | 
[ranges::destroy](memory/ranges/destroy "cpp/memory/ranges/destroy") (C++20) | destroys a range of objects (niebloid) | | [ranges::destroy\_n](memory/ranges/destroy_n "cpp/memory/ranges/destroy n") (C++20) | destroys a number of objects in a range (niebloid) | | [ranges::construct\_at](memory/ranges/construct_at "cpp/memory/ranges/construct at") (C++20) | creates an object at a given address (niebloid) | ### Garbage collector support (until C++23) | Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | [declare\_reachable](memory/gc/declare_reachable "cpp/memory/gc/declare reachable") (C++11)(removed in C++23) | declares that an object can not be recycled (function) | | [undeclare\_reachable](memory/gc/undeclare_reachable "cpp/memory/gc/undeclare reachable") (C++11)(removed in C++23) | declares that an object can be recycled (function template) | | [declare\_no\_pointers](memory/gc/declare_no_pointers "cpp/memory/gc/declare no pointers") (C++11)(removed in C++23) | declares that a memory area does not contain traceable pointers (function) | | [undeclare\_no\_pointers](memory/gc/undeclare_no_pointers "cpp/memory/gc/undeclare no pointers") (C++11)(removed in C++23) | cancels the effect of `[std::declare\_no\_pointers](memory/gc/declare_no_pointers "cpp/memory/gc/declare no pointers")` (function) | | [pointer\_safety](memory/gc/pointer_safety "cpp/memory/gc/pointer safety") (C++11)(removed in C++23) | lists pointer safety models (enum) | | [get\_pointer\_safety](memory/gc/get_pointer_safety "cpp/memory/gc/get pointer safety") (C++11)(removed in C++23) | returns the current pointer safety model (function) | ### Miscellaneous | Defined in header `[<memory>](header/memory "cpp/header/memory")` | | --- | | [pointer\_traits](memory/pointer_traits "cpp/memory/pointer traits") (C++11) | provides information about pointer-like types (class template) | | [to\_address](memory/to_address "cpp/memory/to address") (C++20) | obtains a raw pointer from a pointer-like type (function template) | | [addressof](memory/addressof "cpp/memory/addressof") (C++11) | obtains actual address of an object, even if the *&* operator is overloaded (function template) | | [align](memory/align "cpp/memory/align") (C++11) | aligns a pointer in a buffer (function) | | [assume\_aligned](memory/assume_aligned "cpp/memory/assume aligned") (C++20) | informs the compiler that a pointer is aligned (function template) | ### [Low level memory management](memory/new "cpp/memory/new") Includes e.g. `[operator new](memory/new/operator_new "cpp/memory/new/operator new")`, `[operator delete](memory/new/operator_delete "cpp/memory/new/operator delete")`, `[std::set\_new\_handler](memory/new/set_new_handler "cpp/memory/new/set new handler")`. | Defined in header `[<new>](header/new "cpp/header/new")` | | --- | ### [C-style memory management](memory/c "cpp/memory/c") Includes e.g. `[std::malloc](memory/c/malloc "cpp/memory/c/malloc")`, `[std::free](memory/c/free "cpp/memory/c/free")`. | Defined in header `[<cstdlib>](header/cstdlib "cpp/header/cstdlib")` | | --- |
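A short, hedged sketch tying the C-style allocation functions above to the C++20 object-creation utilities listed earlier on this page (error handling kept minimal):

```
#include <cstdlib>
#include <memory>
#include <string>

int main()
{
    void* raw = std::malloc(sizeof(std::string));   // raw bytes from the C allocator
    if (!raw)
        return 1;

    // C++20: create, then destroy, a std::string inside that storage.
    std::string* s = std::construct_at(static_cast<std::string*>(raw), "hello");
    std::destroy_at(s);

    std::free(raw);
}
```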
cpp Type support (basic types, RTTI) Type support (basic types, RTTI) ================================ See also [type system overview](language/type "cpp/language/type") and [fundamental types defined by the language](language/types "cpp/language/types"). ### Additional basic types and macros | Defined in header `[<cstddef>](header/cstddef "cpp/header/cstddef")` | | --- | | [size\_t](types/size_t "cpp/types/size t") | unsigned integer type returned by the [`sizeof`](language/sizeof "cpp/language/sizeof") operator (typedef) | | [ptrdiff\_t](types/ptrdiff_t "cpp/types/ptrdiff t") | signed integer type returned when subtracting two pointers (typedef) | | [nullptr\_t](types/nullptr_t "cpp/types/nullptr t") (C++11) | the type of the null pointer literal [`nullptr`](language/nullptr "cpp/language/nullptr") (typedef) | | [NULL](types/null "cpp/types/NULL") | implementation-defined null pointer constant (macro constant) | | [max\_align\_t](types/max_align_t "cpp/types/max align t") (C++11) | trivial type with alignment requirement as great as any other scalar type (typedef) | | [offsetof](types/offsetof "cpp/types/offsetof") | byte offset from the beginning of a standard-layout type to specified member (function macro) | | [byte](types/byte "cpp/types/byte") (C++17) | the byte type (enum) | | Defined in header `[<cstdbool>](header/cstdbool "cpp/header/cstdbool")` (deprecated) (until C++20) | | --- | | Defined in header `[<stdbool.h>](header/cstdbool "cpp/header/cstdbool")` | | \_\_bool\_true\_false\_are\_defined (C++11) | C compatibility macro constant, expands to integer literal `1` (macro constant) | | Defined in header `[<cstdalign>](header/cstdalign "cpp/header/cstdalign")` (deprecated) (until C++20) | | Defined in header `[<stdalign.h>](header/cstdalign "cpp/header/cstdalign")` | | \_\_alignas\_is\_defined (C++11) | C compatibility macro constant, expands to integer literal `1` (macro constant) | ### [Fixed width integer types](types/integer "cpp/types/integer") (since C++11) ### Numeric limits | Defined in header `[<limits>](header/limits "cpp/header/limits")` | | --- | | [numeric\_limits](types/numeric_limits "cpp/types/numeric limits") | provides an interface to query properties of all fundamental numeric types. (class template) | #### [C numeric limits interface](types/climits "cpp/types/climits") ### Runtime type identification | Defined in header `[<typeinfo>](header/typeinfo "cpp/header/typeinfo")` | | --- | | [type\_info](types/type_info "cpp/types/type info") | contains some type's information, generated by the implementation. This is the class returned by the [`typeid`](language/typeid "cpp/language/typeid") operator. (class) | | [bad\_typeid](types/bad_typeid "cpp/types/bad typeid") | exception that is thrown if an argument in a [typeid expression](language/typeid "cpp/language/typeid") is null (class) | | [bad\_cast](types/bad_cast "cpp/types/bad cast") | exception that is thrown by an invalid [`dynamic_cast`](language/dynamic_cast "cpp/language/dynamic cast") expression, i.e. 
a cast of reference type fails (class) | | Defined in header `[<typeindex>](header/typeindex "cpp/header/typeindex")` | | [type\_index](types/type_index "cpp/types/type index") (C++11) | wrapper around a `type_info` object, that can be used as index in associative and unordered associative containers (class) | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/types "c/types") for Type support library | cpp Atomic operations library Atomic operations library ========================= The atomic library provides components for fine-grained atomic operations allowing for lockless concurrent programming. Each atomic operation is indivisible with regards to any other atomic operation that involves the same object. Atomic objects are [free of data races](language/memory_model#Threads_and_data_races "cpp/language/memory model"). | Defined in header `[<atomic>](header/atomic "cpp/header/atomic")` | | --- | | Atomic types | | [atomic](atomic/atomic "cpp/atomic/atomic") (C++11) | atomic class template and specializations for bool, integral, and pointer types (class template) | | [atomic\_ref](atomic/atomic_ref "cpp/atomic/atomic ref") (C++20) | provides atomic operations on non-atomic objects (class template) | | Operations on atomic types | | [atomic\_is\_lock\_free](atomic/atomic_is_lock_free "cpp/atomic/atomic is lock free") (C++11) | checks if the atomic type's operations are lock-free (function template) | | [atomic\_storeatomic\_store\_explicit](atomic/atomic_store "cpp/atomic/atomic store") (C++11)(C++11) | atomically replaces the value of the atomic object with a non-atomic argument (function template) | | [atomic\_loadatomic\_load\_explicit](atomic/atomic_load "cpp/atomic/atomic load") (C++11)(C++11) | atomically obtains the value stored in an atomic object (function template) | | [atomic\_exchangeatomic\_exchange\_explicit](atomic/atomic_exchange "cpp/atomic/atomic exchange") (C++11)(C++11) | atomically replaces the value of the atomic object with non-atomic argument and returns the old value of the atomic (function template) | | [atomic\_compare\_exchange\_weakatomic\_compare\_exchange\_weak\_explicitatomic\_compare\_exchange\_strongatomic\_compare\_exchange\_strong\_explicit](atomic/atomic_compare_exchange "cpp/atomic/atomic compare exchange") (C++11)(C++11)(C++11)(C++11) | atomically compares the value of the atomic object with non-atomic argument and performs atomic exchange if equal or atomic load if not (function template) | | [atomic\_fetch\_addatomic\_fetch\_add\_explicit](atomic/atomic_fetch_add "cpp/atomic/atomic fetch add") (C++11)(C++11) | adds a non-atomic value to an atomic object and obtains the previous value of the atomic (function template) | | [atomic\_fetch\_subatomic\_fetch\_sub\_explicit](atomic/atomic_fetch_sub "cpp/atomic/atomic fetch sub") (C++11)(C++11) | subtracts a non-atomic value from an atomic object and obtains the previous value of the atomic (function template) | | [atomic\_fetch\_andatomic\_fetch\_and\_explicit](atomic/atomic_fetch_and "cpp/atomic/atomic fetch and") (C++11)(C++11) | replaces the atomic object with the result of bitwise AND with a non-atomic argument and obtains the previous value of the atomic (function template) | | [atomic\_fetch\_oratomic\_fetch\_or\_explicit](atomic/atomic_fetch_or "cpp/atomic/atomic fetch or") (C++11)(C++11) | replaces the atomic object with the result of bitwise OR with a non-atomic argument and obtains the previous value of the atomic (function template) | | 
[atomic\_fetch\_xoratomic\_fetch\_xor\_explicit](atomic/atomic_fetch_xor "cpp/atomic/atomic fetch xor") (C++11)(C++11) | replaces the atomic object with the result of bitwise XOR with a non-atomic argument and obtains the previous value of the atomic (function template) | | [atomic\_waitatomic\_wait\_explicit](atomic/atomic_wait "cpp/atomic/atomic wait") (C++20)(C++20) | blocks the thread until notified and the atomic value changes (function template) | | [atomic\_notify\_one](atomic/atomic_notify_one "cpp/atomic/atomic notify one") (C++20) | notifies a thread blocked in atomic\_wait (function template) | | [atomic\_notify\_all](atomic/atomic_notify_all "cpp/atomic/atomic notify all") (C++20) | notifies all threads blocked in atomic\_wait (function template) | | Flag type and operations | | [atomic\_flag](atomic/atomic_flag "cpp/atomic/atomic flag") (C++11) | the lock-free boolean atomic type (class) | | [atomic\_flag\_test\_and\_setatomic\_flag\_test\_and\_set\_explicit](atomic/atomic_flag_test_and_set "cpp/atomic/atomic flag test and set") (C++11)(C++11) | atomically sets the flag to `true` and returns its previous value (function) | | [atomic\_flag\_clearatomic\_flag\_clear\_explicit](atomic/atomic_flag_clear "cpp/atomic/atomic flag clear") (C++11)(C++11) | atomically sets the value of the flag to `false` (function) | | [atomic\_flag\_testatomic\_flag\_test\_explicit](atomic/atomic_flag_test "cpp/atomic/atomic flag test") (C++20)(C++20) | atomically returns the value of the flag (function) | | [atomic\_flag\_waitatomic\_flag\_wait\_explicit](atomic/atomic_flag_wait "cpp/atomic/atomic flag wait") (C++20)(C++20) | blocks the thread until notified and the flag changes (function) | | [atomic\_flag\_notify\_one](atomic/atomic_flag_notify_one "cpp/atomic/atomic flag notify one") (C++20) | notifies a thread blocked in atomic\_flag\_wait (function) | | [atomic\_flag\_notify\_all](atomic/atomic_flag_notify_all "cpp/atomic/atomic flag notify all") (C++20) | notifies all threads blocked in atomic\_flag\_wait (function) | | Initialization | | [atomic\_init](atomic/atomic_init "cpp/atomic/atomic init") (C++11)(deprecated in C++20) | non-atomic initialization of a default-constructed atomic object (function template) | | [ATOMIC\_VAR\_INIT](atomic/atomic_var_init "cpp/atomic/ATOMIC VAR INIT") (C++11)(deprecated in C++20) | constant initialization of an atomic variable of static storage duration (function macro) | | [ATOMIC\_FLAG\_INIT](atomic/atomic_flag_init "cpp/atomic/ATOMIC FLAG INIT") (C++11) | initializes an `[std::atomic\_flag](atomic/atomic_flag "cpp/atomic/atomic flag")` to `false` (macro constant) | | Memory synchronization ordering | | [memory\_order](atomic/memory_order "cpp/atomic/memory order") (C++11) | defines memory ordering constraints for the given atomic operation (enum) | | [kill\_dependency](atomic/kill_dependency "cpp/atomic/kill dependency") (C++11) | removes the specified object from the `[std::memory\_order\_consume](atomic/memory_order "cpp/atomic/memory order")` dependency tree (function template) | | [atomic\_thread\_fence](atomic/atomic_thread_fence "cpp/atomic/atomic thread fence") (C++11) | generic memory order-dependent fence synchronization primitive (function) | | [atomic\_signal\_fence](atomic/atomic_signal_fence "cpp/atomic/atomic signal fence") (C++11) | fence between a thread and a signal handler executed in the same thread (function) | ### C Compatibility for atomic types | Defined in header `[<stdatomic.h>](header/stdatomic.h "cpp/header/stdatomic.h")` | | --- 
| | [\_Atomic](atomic/atomic "cpp/atomic/atomic") (C++23) | compatibility macro such that `_Atomic(T)` is identical to `[std::atomic](http://en.cppreference.com/w/cpp/atomic/atomic)<T>` (function macro) | | | | | --- | --- | | Neither the `_Atomic` macro, nor any of the non-macro global namespace declarations are provided by any C++ standard library header other than `<stdatomic.h>`. | (since C++23) | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/atomic "c/atomic") for Atomic operations library | cpp Utility library Utility library =============== C++ includes a variety of utility libraries that provide functionality ranging from [bit-counting](utility/bitset "cpp/utility/bitset") to [partial function application](utility/functional/bind "cpp/utility/functional/bind"). These libraries can be broadly divided into two groups: * language support libraries, and * general-purpose libraries. Language support ----------------- Language support libraries provide classes and functions that interact closely with language features and support common language idioms. ### [Type support](types "cpp/types") Basic types (e.g. `[std::size\_t](types/size_t "cpp/types/size t")`, `[std::nullptr\_t](types/nullptr_t "cpp/types/nullptr t")`), RTTI (e.g. `[std::type\_info](types/type_info "cpp/types/type info")`), type traits (e.g. `[std::is\_integral](types/is_integral "cpp/types/is integral")`, `[std::rank](types/rank "cpp/types/rank")`). ### Constant evaluation context | Defined in header `[<type\_traits>](header/type_traits "cpp/header/type traits")` | | --- | | [is\_constant\_evaluated](types/is_constant_evaluated "cpp/types/is constant evaluated") (C++20) | detects whether the call occurs within a constant-evaluated context (function) | ### [Implementation properties](utility/feature_test "cpp/utility/feature test") | | | | --- | --- | | The header [`<version>`](header/version "cpp/header/version") supplies implementation-dependent information about the C++ standard library (such as the version number and release date). It also defines the [library feature-test macros](utility/feature_test "cpp/utility/feature test"). | (since C++20) | ### [Program utilities](utility/program "cpp/utility/program") Termination (e.g. `[std::abort](utility/program/abort "cpp/utility/program/abort")`, `[std::atexit](utility/program/atexit "cpp/utility/program/atexit")`), environment (e.g. `[std::system](utility/program/system "cpp/utility/program/system")`), signals (e.g. `[std::raise](utility/program/raise "cpp/utility/program/raise")`). ### [Dynamic memory management](memory "cpp/memory") Smart pointers (e.g. `[std::shared\_ptr](memory/shared_ptr "cpp/memory/shared ptr")`), allocators (e.g. `[std::allocator](memory/allocator "cpp/memory/allocator")` or `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")`), C-style memory management (e.g. `[std::malloc](memory/c/malloc "cpp/memory/c/malloc")`). ### [Error handling](error "cpp/error") Exceptions (e.g. `[std::exception](error/exception "cpp/error/exception")`, `[std::terminate](error/terminate "cpp/error/terminate")`), assertions (e.g. `[assert](error/assert "cpp/error/assert")`). 
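A minimal, hedged sketch of the exception and assertion facilities just mentioned (the message text is arbitrary):

```
#include <cassert>
#include <iostream>
#include <stdexcept>

int main()
{
    try
    {
        throw std::runtime_error("something went wrong");   // derives from std::exception
    }
    catch (const std::exception& e)
    {
        std::cout << e.what() << '\n';
    }
    assert(2 + 2 == 4);   // aborts only if the condition is false and NDEBUG is not defined
}
```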
### [Source code information capture](utility/source_location "cpp/utility/source location") | Defined in header `[<source\_location>](header/source_location "cpp/header/source location")` | | --- | | [source\_location](utility/source_location "cpp/utility/source location") (C++20) | A class representing information about the source code, such as file names, line numbers, and function names (class) | ### [Initializer lists](utility/initializer_list "cpp/utility/initializer list") | Defined in header `[<initializer\_list>](header/initializer_list "cpp/header/initializer list")` | | --- | | [initializer\_list](utility/initializer_list "cpp/utility/initializer list") (C++11) | creates a temporary array in [list-initialization](language/list_initialization "cpp/language/list initialization") and then references it (class template) | ### Three-way comparison | Defined in header `[<compare>](header/compare "cpp/header/compare")` | | --- | | [three\_way\_comparablethree\_way\_comparable\_with](utility/compare/three_way_comparable "cpp/utility/compare/three way comparable") (C++20) | specifies that operator `<=>` produces consistent result on given types (concept) | | [partial\_ordering](utility/compare/partial_ordering "cpp/utility/compare/partial ordering") (C++20) | the result type of 3-way comparison that supports all 6 operators, is not substitutable, and allows incomparable values (class) | | [weak\_ordering](utility/compare/weak_ordering "cpp/utility/compare/weak ordering") (C++20) | the result type of 3-way comparison that supports all 6 operators and is not substitutable (class) | | [strong\_ordering](utility/compare/strong_ordering "cpp/utility/compare/strong ordering") (C++20) | the result type of 3-way comparison that supports all 6 operators and is substitutable (class) | | [is\_eqis\_neqis\_ltis\_lteqis\_gtis\_gteq](utility/compare/named_comparison_functions "cpp/utility/compare/named comparison functions") (C++20) | named comparison functions (function) | | [compare\_three\_way](utility/compare/compare_three_way "cpp/utility/compare/compare three way") (C++20) | function object implementing `x <=> y` (class) | | [compare\_three\_way\_result](utility/compare/compare_three_way_result "cpp/utility/compare/compare three way result") (C++20) | obtains the result type of the three-way comparison operator `<=>` on given types (class template) | | [common\_comparison\_category](utility/compare/common_comparison_category "cpp/utility/compare/common comparison category") (C++20) | the strongest comparison category to which all of the given types can be converted (class template) | | [strong\_order](utility/compare/strong_order "cpp/utility/compare/strong order") (C++20) | performs 3-way comparison and produces a result of type `std::strong_ordering` (customization point object) | | [weak\_order](utility/compare/weak_order "cpp/utility/compare/weak order") (C++20) | performs 3-way comparison and produces a result of type `std::weak_ordering` (customization point object) | | [partial\_order](utility/compare/partial_order "cpp/utility/compare/partial order") (C++20) | performs 3-way comparison and produces a result of type `std::partial_ordering` (customization point object) | | [compare\_strong\_order\_fallback](utility/compare/compare_strong_order_fallback "cpp/utility/compare/compare strong order fallback") (C++20) | performs 3-way comparison and produces a result of type `std::strong_ordering`, even if `operator<=>` is unavailable (customization point object) | | 
[compare\_weak\_order\_fallback](utility/compare/compare_weak_order_fallback "cpp/utility/compare/compare weak order fallback") (C++20) | performs 3-way comparison and produces a result of type `std::weak_ordering`, even if `operator<=>` is unavailable (customization point object) | | [compare\_partial\_order\_fallback](utility/compare/compare_partial_order_fallback "cpp/utility/compare/compare partial order fallback") (C++20) | performs 3-way comparison and produces a result of type `std::partial_ordering`, even if `operator<=>` is unavailable (customization point object) | ### [Coroutine support](coroutine "cpp/coroutine") | | | | --- | --- | | Types for coroutine support, e.g. `[std::coroutine\_traits](coroutine/coroutine_traits "cpp/coroutine/coroutine traits")`, `[std::coroutine\_handle](coroutine/coroutine_handle "cpp/coroutine/coroutine handle")`. | (since C++20) | ### [Variadic functions](utility/variadic "cpp/utility/variadic") Support for functions that take an arbitrary number of parameters (via e.g. `[va\_start](utility/variadic/va_start "cpp/utility/variadic/va start")`, `[va\_arg](utility/variadic/va_arg "cpp/utility/variadic/va arg")`, `[va\_end](utility/variadic/va_end "cpp/utility/variadic/va end")`). General-purpose utilities -------------------------- ### Swap and type operations | Defined in header `[<utility>](header/utility "cpp/header/utility")` | | --- | | [swap](algorithm/swap "cpp/algorithm/swap") | swaps the values of two objects (function template) | | [exchange](utility/exchange "cpp/utility/exchange") (C++14) | replaces the argument with a new value and returns its previous value (function template) | | [forward](utility/forward "cpp/utility/forward") (C++11) | forwards a function argument (function template) | | [forward\_like](utility/forward_like "cpp/utility/forward like") (C++23) | forwards a function argument as if casting it to the value category and constness of the expression of specified type template argument (function template) | | [move](utility/move "cpp/utility/move") (C++11) | obtains an rvalue reference (function template) | | [move\_if\_noexcept](utility/move_if_noexcept "cpp/utility/move if noexcept") (C++11) | obtains an rvalue reference if the move constructor does not throw (function template) | | [as\_const](utility/as_const "cpp/utility/as const") (C++17) | obtains a reference to const to its argument (function template) | | [declval](utility/declval "cpp/utility/declval") (C++11) | obtains a reference to its argument for use in unevaluated context (function template) | | [to\_underlying](utility/to_underlying "cpp/utility/to underlying") (C++23) | converts an enumeration to its underlying type (function template) | | Defined in header `[<concepts>](header/concepts "cpp/header/concepts")` | | [ranges::swap](utility/ranges/swap "cpp/utility/ranges/swap") (C++20) | swaps the values of two objects (customization point object) | ### Integer comparison functions | Defined in header `[<utility>](header/utility "cpp/header/utility")` | | --- | | [cmp\_equalcmp\_not\_equalcmp\_lesscmp\_greatercmp\_less\_equalcmp\_greater\_equal](utility/intcmp "cpp/utility/intcmp") (C++20) | compares two integer values without value change caused by conversion (function template) | | [in\_range](utility/in_range "cpp/utility/in range") (C++20) | checks if an integer value is in the range of a given integer type (function template) | ### Relational operators | Defined in header `[<utility>](header/utility "cpp/header/utility")` | | --- | | Defined in namespace 
`std::rel_ops` | | [operator!=operator>operator<=operator>=](utility/rel_ops/operator_cmp "cpp/utility/rel ops/operator cmp") (deprecated in C++20) | automatically generates comparison operators based on user-defined `operator==` and `operator<` (function template) | ### [Pairs](utility/pair "cpp/utility/pair") and [tuples](utility/tuple "cpp/utility/tuple") | Defined in header `[<utility>](header/utility "cpp/header/utility")` | | --- | | [pair](utility/pair "cpp/utility/pair") | implements binary tuple, i.e. a pair of values (class template) | | [piecewise\_construct](utility/piecewise_construct "cpp/utility/piecewise construct") (C++11) | an object of type `piecewise_construct_t` used to disambiguate functions for piecewise construction (constant) | | [integer\_sequence](utility/integer_sequence "cpp/utility/integer sequence") (C++14) | implements compile-time sequence of integers (class template) | | Defined in header `[<tuple>](header/tuple "cpp/header/tuple")` | | [tuple](utility/tuple "cpp/utility/tuple") (C++11) | implements fixed size container, which holds elements of possibly different types (class template) | | [apply](utility/apply "cpp/utility/apply") (C++17) | calls a function with a tuple of arguments (function template) | | [make\_from\_tuple](utility/make_from_tuple "cpp/utility/make from tuple") (C++17) | Construct an object with a tuple of arguments (function template) | | Defined in header `[<tuple>](header/tuple "cpp/header/tuple")` | | Defined in header `[<utility>](header/utility "cpp/header/utility")` | | Defined in header `[<array>](header/array "cpp/header/array")` | | Defined in header `[<ranges>](header/ranges "cpp/header/ranges")` | | [tuple\_size](utility/tuple_size "cpp/utility/tuple size") (C++11) | obtains the number of elements of a tuple-like type (class template) | | [tuple\_element](utility/tuple_element "cpp/utility/tuple element") (C++11) | obtains the element types of a tuple-like type (class template) | ### Sum types and type erased wrappers | Defined in header `[<optional>](header/optional "cpp/header/optional")` | | --- | | [optional](utility/optional "cpp/utility/optional") (C++17) | a wrapper that may or may not hold an object (class template) | | Defined in header `[<expected>](header/expected "cpp/header/expected")` | | [expected](utility/expected "cpp/utility/expected") (C++23) | a wrapper that contains either an expected or error value (class template) | | Defined in header `[<variant>](header/variant "cpp/header/variant")` | | [variant](utility/variant "cpp/utility/variant") (C++17) | a type-safe discriminated union (class template) | | Defined in header `[<any>](header/any "cpp/header/any")` | | [any](utility/any "cpp/utility/any") (C++17) | Objects that hold instances of any [CopyConstructible](named_req/copyconstructible "cpp/named req/CopyConstructible") type. (class) | | Defined in header `[<utility>](header/utility "cpp/header/utility")` | | [in\_place in\_place\_type in\_place\_index in\_place\_t in\_place\_type\_t in\_place\_index\_t](utility/in_place "cpp/utility/in place") (C++17) | in-place construction tag (class template) | ### [Bitset](utility/bitset "cpp/utility/bitset") | Defined in header `[<bitset>](header/bitset "cpp/header/bitset")` | | --- | | [bitset](utility/bitset "cpp/utility/bitset") | implements constant length bit array (class template) | ### [Function objects](utility/functional "cpp/utility/functional") Partial function application (e.g. 
`[std::bind](utility/functional/bind "cpp/utility/functional/bind")`) and related utilities: utilities for binding such as `[std::ref](utility/functional/ref "cpp/utility/functional/ref")` and `[std::placeholders](utility/functional/placeholders "cpp/utility/functional/placeholders")`, polymorphic function wrappers: `[std::function](utility/functional/function "cpp/utility/functional/function")`, predefined functors (e.g. `[std::plus](utility/functional/plus "cpp/utility/functional/plus")`, `[std::equal\_to](utility/functional/equal_to "cpp/utility/functional/equal to")`), pointer-to-member to function converters `[std::mem\_fn](utility/functional/mem_fn "cpp/utility/functional/mem fn")`. ### [Hash support](utility/hash "cpp/utility/hash") | Defined in header `[<functional>](header/functional "cpp/header/functional")` | | --- | | [hash](utility/hash "cpp/utility/hash") (C++11) | hash function object (class template) | ### [Date and time](chrono "cpp/chrono") Time tracking (e.g. `[std::chrono::time\_point](chrono/time_point "cpp/chrono/time point")`, `[std::chrono::duration](chrono/duration "cpp/chrono/duration")`), C-style date and time (e.g. `[std::time](chrono/c/time "cpp/chrono/c/time")`, `[std::clock](chrono/c/clock "cpp/chrono/c/clock")`). ### Elementary string conversions In addition to sophisticated locale-dependent parsers and formatters provided by the [C++ I/O](io "cpp/io") library, the [C I/O](io/c "cpp/io/c") library, [C++ string converters](string/basic_string#Numeric_conversions "cpp/string/basic string"), and [C string converters](string/byte#Conversions_to_numeric_formats "cpp/string/byte"), the header [`<charconv>`](header/charconv "cpp/header/charconv") provides light-weight, locale-independent, non-allocating, non-throwing parsers and formatters for arithmetic types. | Defined in header `[<charconv>](header/charconv "cpp/header/charconv")` | | --- | | [to\_chars](utility/to_chars "cpp/utility/to chars") (C++17) | converts an integer or floating-point value to a character sequence (function) | | [from\_chars](utility/from_chars "cpp/utility/from chars") (C++17) | converts a character sequence to an integer or floating-point value (function) | | [chars\_format](utility/chars_format "cpp/utility/chars format") (C++17) | specifies formatting for `std::to_chars` and `std::from_chars` (enum) | ### [Formatting library](utility/format "cpp/utility/format") Facilities for type-safe string formatting. 
| Defined in header `[<format>](header/format "cpp/header/format")` | | --- | | [format](utility/format/format "cpp/utility/format/format") (C++20) | stores formatted representation of the arguments in a new string (function template) | | [format\_to](utility/format/format_to "cpp/utility/format/format to") (C++20) | writes out formatted representation of its arguments through an output iterator (function template) | | [format\_to\_n](utility/format/format_to_n "cpp/utility/format/format to n") (C++20) | writes out formatted representation of its arguments through an output iterator, not exceeding specified size (function template) | | [formatted\_size](utility/format/formatted_size "cpp/utility/format/formatted size") (C++20) | determines the number of characters necessary to store the formatted representation of its arguments (function template) | | [vformat](utility/format/vformat "cpp/utility/format/vformat") (C++20) | non-template variant of `[std::format](utility/format/format "cpp/utility/format/format")` using type-erased argument representation (function) | | [vformat\_to](utility/format/vformat_to "cpp/utility/format/vformat to") (C++20) | non-template variant of `[std::format\_to](utility/format/format_to "cpp/utility/format/format to")` using type-erased argument representation (function template) | | [formatter](utility/format/formatter "cpp/utility/format/formatter") (C++20) | class template that defines formatting rules for a given type (class template) | | [format\_error](utility/format/format_error "cpp/utility/format/format error") (C++20) | exception type thrown on formatting errors (class) | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/utility "c/utility") for Utility library |
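A minimal sketch of the `<charconv>` and `<format>` facilities described above, assuming a C++20 standard library that ships both; the buffer size and variable names are illustrative only:

```
#include <charconv>
#include <format>
#include <iostream>
#include <string>
#include <system_error>

int main()
{
    // std::to_chars: locale-independent, non-allocating integer formatting.
    char buf[16];
    auto [ptr, ec] = std::to_chars(buf, buf + sizeof buf, 42);
    if (ec == std::errc{})
        std::cout << std::string(buf, ptr) << '\n'; // prints "42"

    // std::from_chars: parse the text back into an int.
    int parsed = 0;
    std::from_chars(buf, ptr, parsed);

    // std::format: type-safe formatting into a new string.
    std::cout << std::format("parsed {} as {}\n", std::string(buf, ptr), parsed);
}
```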
cpp Date and time utilities Date and time utilities ======================= C++ includes support for two types of time manipulation: * The `chrono` library, a flexible collection of types that track time with varying degrees of precision (e.g. `[std::chrono::time\_point](chrono/time_point "cpp/chrono/time point")`). * C-style date and time library (e.g. `[std::time](chrono/c/time "cpp/chrono/c/time")`) ### [`std::chrono` library](header/chrono "cpp/header/chrono") The `chrono` library defines three main types as well as utility functions and common typedefs. * clocks * time points * durations #### Clocks A clock consists of a starting point (or epoch) and a tick rate. For example, a clock may have an epoch of January 1, 1970 and tick every second. C++ defines several clock types: | Defined in header `[<chrono>](header/chrono "cpp/header/chrono")` | | --- | | Defined in namespace `std::chrono` | | [system\_clock](chrono/system_clock "cpp/chrono/system clock") (C++11) | wall clock time from the system-wide realtime clock (class) | | [steady\_clock](chrono/steady_clock "cpp/chrono/steady clock") (C++11) | monotonic clock that will never be adjusted (class) | | [high\_resolution\_clock](chrono/high_resolution_clock "cpp/chrono/high resolution clock") (C++11) | the clock with the shortest tick period available (class) | | [is\_clockis\_clock\_v](chrono/is_clock "cpp/chrono/is clock") (C++20) | determines if a type is a [Clock](named_req/clock "cpp/named req/Clock") (class template) (variable template) | | [utc\_clock](chrono/utc_clock "cpp/chrono/utc clock") (C++20) | [Clock](named_req/clock "cpp/named req/Clock") for Coordinated Universal Time (UTC) (class) | | [tai\_clock](chrono/tai_clock "cpp/chrono/tai clock") (C++20) | [Clock](named_req/clock "cpp/named req/Clock") for International Atomic Time (TAI) (class) | | [gps\_clock](chrono/gps_clock "cpp/chrono/gps clock") (C++20) | [Clock](named_req/clock "cpp/named req/Clock") for GPS time (class) | | [file\_clock](chrono/file_clock "cpp/chrono/file clock") (C++20) | [Clock](named_req/clock "cpp/named req/Clock") used for [file time](filesystem/file_time_type "cpp/filesystem/file time type") (typedef) | | [local\_t](chrono/local_t "cpp/chrono/local t") (C++20) | pseudo-clock representing local time (class) | #### Time point A time point is a duration of time that has passed since the epoch of a specific clock. | Defined in header `[<chrono>](header/chrono "cpp/header/chrono")` | | --- | | Defined in namespace `std::chrono` | | [time\_point](chrono/time_point "cpp/chrono/time point") (C++11) | a point in time (class template) | | [clock\_time\_conversion](chrono/clock_time_conversion "cpp/chrono/clock time conversion") (C++20) | traits class defining how to convert time points of one clock to another (class template) | | [clock\_cast](chrono/clock_cast "cpp/chrono/clock cast") (C++20) | convert time points of one clock to another (function template) | #### Duration A duration consists of a span of time, defined as some number of ticks of some time unit. For example, "42 seconds" could be represented by a duration consisting of 42 ticks of a 1-second time unit. | Defined in header `[<chrono>](header/chrono "cpp/header/chrono")` | | --- | | Defined in namespace `std::chrono` | | [duration](chrono/duration "cpp/chrono/duration") (C++11) | a time interval (class template) | #### Time of day `hh_mm_ss` splits a duration representing time elapsed since midnight into hours, minutes, seconds, and fractional seconds, as applicable. 
It is primarily a formatting tool. | Defined in header `[<chrono>](header/chrono "cpp/header/chrono")` | | --- | | Defined in namespace `std::chrono` | | [hh\_mm\_ss](chrono/hh_mm_ss "cpp/chrono/hh mm ss") (C++20) | represents a time of day (class template) | | [is\_amis\_pmmake12make24](chrono/hour_fun "cpp/chrono/hour fun") (C++20) | translates between a 12h/24h format time of day (function) | #### Calendar | Defined in header `[<chrono>](header/chrono "cpp/header/chrono")` | | --- | | Defined in namespace `std::chrono` | | [last\_spec](chrono/last_spec "cpp/chrono/last spec") (C++20) | tag class indicating the *last* day or weekday in a month (class) | | [day](chrono/day "cpp/chrono/day") (C++20) | represents a day of a month (class) | | [month](chrono/month "cpp/chrono/month") (C++20) | represents a month of a year (class) | | [year](chrono/year "cpp/chrono/year") (C++20) | represents a year in the Gregorian calendar (class) | | [weekday](chrono/weekday "cpp/chrono/weekday") (C++20) | represents a day of the week in the Gregorian calendar (class) | | [weekday\_indexed](chrono/weekday_indexed "cpp/chrono/weekday indexed") (C++20) | represents the n-th [`weekday`](chrono/weekday "cpp/chrono/weekday") of a month (class) | | [weekday\_last](chrono/weekday_last "cpp/chrono/weekday last") (C++20) | represents the last [`weekday`](chrono/weekday "cpp/chrono/weekday") of a month (class) | | [month\_day](chrono/month_day "cpp/chrono/month day") (C++20) | represents a specific [`day`](chrono/day "cpp/chrono/day") of a specific [`month`](chrono/month "cpp/chrono/month") (class) | | [month\_day\_last](chrono/month_day_last "cpp/chrono/month day last") (C++20) | represents the last day of a specific [`month`](chrono/month "cpp/chrono/month") (class) | | [month\_weekday](chrono/month_weekday "cpp/chrono/month weekday") (C++20) | represents the n-th [`weekday`](chrono/weekday "cpp/chrono/weekday") of a specific [`month`](chrono/month "cpp/chrono/month") (class) | | [month\_weekday\_last](chrono/month_weekday_last "cpp/chrono/month weekday last") (C++20) | represents the last [`weekday`](chrono/weekday "cpp/chrono/weekday") of a specific [`month`](chrono/month "cpp/chrono/month") (class) | | [year\_month](chrono/year_month "cpp/chrono/year month") (C++20) | represents a specific [`month`](chrono/month "cpp/chrono/month") of a specific [`year`](chrono/year "cpp/chrono/year") (class) | | [year\_month\_day](chrono/year_month_day "cpp/chrono/year month day") (C++20) | represents a specific [`year`](chrono/year "cpp/chrono/year"), [`month`](chrono/month "cpp/chrono/month"), and [`day`](chrono/day "cpp/chrono/day") (class) | | [year\_month\_day\_last](chrono/year_month_day_last "cpp/chrono/year month day last") (C++20) | represents the last day of a specific [`year`](chrono/year "cpp/chrono/year") and [`month`](chrono/month "cpp/chrono/month") (class) | | [year\_month\_weekday](chrono/year_month_weekday "cpp/chrono/year month weekday") (C++20) | represents the n-th [`weekday`](chrono/weekday "cpp/chrono/weekday") of a specific [`year`](chrono/year "cpp/chrono/year") and [`month`](chrono/month "cpp/chrono/month") (class) | | [year\_month\_weekday\_last](chrono/year_month_weekday_last "cpp/chrono/year month weekday last") (C++20) | represents the last [`weekday`](chrono/weekday "cpp/chrono/weekday") of a specific [`year`](chrono/year "cpp/chrono/year") and [`month`](chrono/month "cpp/chrono/month") (class) | | [operator/](chrono/operator_slash "cpp/chrono/operator slash") (C++20) | conventional syntax for 
Gregorian calendar date creation (function) | #### Time zone | Defined in header `[<chrono>](header/chrono "cpp/header/chrono")` | | --- | | Defined in namespace `std::chrono` | | [tzdb](chrono/tzdb "cpp/chrono/tzdb") (C++20) | describes a copy of the IANA time zone database (class) | | [tzdb\_list](chrono/tzdb_list "cpp/chrono/tzdb list") (C++20) | represents a linked list of [`tzdb`](chrono/tzdb "cpp/chrono/tzdb") (class) | | [get\_tzdbget\_tzdb\_listreload\_tzdbremote\_version](chrono/tzdb_functions "cpp/chrono/tzdb functions") (C++20) | accesses and controls the global time zone database information (function) | | [locate\_zone](chrono/locate_zone "cpp/chrono/locate zone") (C++20) | locates a [`time_zone`](chrono/time_zone "cpp/chrono/time zone") based on its name (function) | | [current\_zone](chrono/current_zone "cpp/chrono/current zone") (C++20) | returns the current [`time_zone`](chrono/time_zone "cpp/chrono/time zone") (function) | | [time\_zone](chrono/time_zone "cpp/chrono/time zone") (C++20) | represents a time zone (class) | | [sys\_info](chrono/sys_info "cpp/chrono/sys info") (C++20) | represents information about a time zone at a particular time point (class) | | [local\_info](chrono/local_info "cpp/chrono/local info") (C++20) | represents information about a local time to UNIX time conversion (class) | | [choose](chrono/choose "cpp/chrono/choose") (C++20) | selects how an ambiguous local time should be resolved (enum) | | [zoned\_traits](chrono/zoned_traits "cpp/chrono/zoned traits") (C++20) | traits class for time zone pointers used by [`zoned_time`](chrono/zoned_time "cpp/chrono/zoned time") (class template) | | [zoned\_time](chrono/zoned_time "cpp/chrono/zoned time") (C++20) | represents a time zone and a time point (class) | | [leap\_second](chrono/leap_second "cpp/chrono/leap second") (C++20) | contains information about a leap second insertion (class) | | [leap\_second\_info](chrono/utc_clock/leap_second_info "cpp/chrono/utc clock/leap second info") (C++20) | leap second insertion information (class) | | [get\_leap\_second\_info](chrono/utc_clock/get_leap_second_info "cpp/chrono/utc clock/get leap second info") (C++20) | obtains leap second insertion information from a `utc_time` object (function template) | | [time\_zone\_link](chrono/time_zone_link "cpp/chrono/time zone link") (C++20) | represents an alternative name for a time zone (class) | | [nonexistent\_local\_time](chrono/nonexistent_local_time "cpp/chrono/nonexistent local time") (C++20) | exception thrown to report that a local time is nonexistent (class) | | [ambiguous\_local\_time](chrono/ambiguous_local_time "cpp/chrono/ambiguous local time") (C++20) | exception thrown to report that a local time is ambiguous (class) | #### `chrono` I/O | Defined in header `[<chrono>](header/chrono "cpp/header/chrono")` | | --- | | Defined in namespace `std::chrono` | | [parse](chrono/parse "cpp/chrono/parse") (C++20) | parses a `chrono` object from a stream (function template) | ### Notes | [Feature-test](utility/feature_test "cpp/utility/feature test") macro | Comment | | --- | --- | | [`__cpp_lib_chrono`](feature_test#Library_features "cpp/feature test") | for `std::chrono` library | ### [C-style date and time library](chrono/c "cpp/chrono/c") Also provided are the C-style date and time functions, such as `[std::time\_t](chrono/c/time_t "cpp/chrono/c/time t")`, `[std::difftime](chrono/c/difftime "cpp/chrono/c/difftime")`, and `[CLOCKS\_PER\_SEC](chrono/c/clocks_per_sec "cpp/chrono/c/CLOCKS PER SEC")`. 
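The calendar types above compose through `operator/`; the following is a minimal sketch, assuming a standard library with C++20 calendar support (the chosen date is illustrative only):

```
#include <chrono>
#include <iostream>

int main()
{
    using namespace std::chrono;

    // operator/ provides the conventional year/month/day composition syntax.
    year_month_day ymd = 2023y / January / 15d;

    // year_month_day converts implicitly to a sys_days time point
    // (a count of days since the system_clock epoch), from which a
    // weekday can be constructed.
    sys_days tp = ymd;
    weekday wd{tp};

    std::cout << static_cast<int>(ymd.year()) << '-'
              << static_cast<unsigned>(ymd.month()) << '-'
              << static_cast<unsigned>(ymd.day()) << '\n';
    std::cout << (wd == Sunday ? "Sunday" : "not Sunday") << '\n';
}
```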
### Example This example displays information about the execution time of a function call:

```
#include <iostream>
#include <chrono>

long fibonacci(unsigned n)
{
    if (n < 2) return n;
    return fibonacci(n-1) + fibonacci(n-2);
}

int main()
{
    auto start = std::chrono::steady_clock::now();
    std::cout << "f(42) = " << fibonacci(42) << '\n';
    auto end = std::chrono::steady_clock::now();
    std::chrono::duration<double> elapsed_seconds = end-start;
    std::cout << "elapsed time: " << elapsed_seconds.count() << "s\n";
}
```

Possible output:

```
f(42) = 267914296
elapsed time: 1.88232s
```

cpp C++ language C++ language ============ This is a reference of the core C++ language constructs. | | | | --- | --- | --- | | **[Basic concepts](language/basic_concepts "cpp/language/basic concepts")**. [Comments](comment "cpp/comment") [ASCII chart](language/ascii "cpp/language/ascii") [Punctuation](language/punctuators "cpp/language/punctuators") [Names and identifiers](language/identifiers "cpp/language/identifiers") [Types](language/type "cpp/language/type") – [Fundamental types](language/types "cpp/language/types") [Object](language/object "cpp/language/object") – [Scope](language/scope "cpp/language/scope") – [Lifetime](language/lifetime "cpp/language/lifetime") [Definitions and ODR](language/definition "cpp/language/definition") [Name lookup](language/lookup "cpp/language/lookup") [qualified](language/qualified_lookup "cpp/language/qualified lookup") – [unqualified](language/unqualified_lookup "cpp/language/unqualified lookup") ([ADL](language/adl "cpp/language/adl")) [As-if rule](language/as_if "cpp/language/as if") [Undefined behavior](language/ub "cpp/language/ub") [Memory model and data races](language/memory_model "cpp/language/memory model") [Character sets and encodings](language/charset "cpp/language/charset") [Phases of translation](language/translation_phases "cpp/language/translation phases") [The `main` function](language/main_function "cpp/language/main function") [Modules](language/modules "cpp/language/modules") (C++20). **[Keywords](keyword "cpp/keyword")**. **[Preprocessor](preprocessor "cpp/preprocessor")**. [`#if` - `#ifdef` - `#ifndef` - `#elif`](preprocessor/conditional "cpp/preprocessor/conditional") [`#elifdef` - `#elifndef`](preprocessor/conditional "cpp/preprocessor/conditional") (C++23) [`#define` - `#` - `##`](preprocessor/replace "cpp/preprocessor/replace") [`#include`](preprocessor/include "cpp/preprocessor/include") - [`#pragma`](preprocessor/impl "cpp/preprocessor/impl") [`#line`](preprocessor/line "cpp/preprocessor/line") - [`#error` - `#warning`](preprocessor/error "cpp/preprocessor/error") (C++23). **[Expressions](language/expressions "cpp/language/expressions")**.
[Value categories](language/value_category "cpp/language/value category") [Evaluation order and sequencing](language/eval_order "cpp/language/eval order") [Constant expressions](language/constant_expression "cpp/language/constant expression") [Operators](language/expressions#Operators "cpp/language/expressions") [assignment](language/operator_assignment "cpp/language/operator assignment") – [arithmetic](language/operator_arithmetic "cpp/language/operator arithmetic") [increment and decrement](language/operator_incdec "cpp/language/operator incdec") [logical](language/operator_logical "cpp/language/operator logical") – [comparison](language/operator_comparison "cpp/language/operator comparison") [member access and indirection](language/operator_member_access "cpp/language/operator member access") [call, comma, ternary](language/operator_other "cpp/language/operator other") [`sizeof`](language/sizeof "cpp/language/sizeof") – [`alignof`](language/alignof "cpp/language/alignof") (C++11) [`new`](language/new "cpp/language/new") – [`delete`](language/delete "cpp/language/delete") – [`typeid`](language/typeid "cpp/language/typeid") [Operator overloading](language/operators "cpp/language/operators") [Default comparisons](language/default_comparisons "cpp/language/default comparisons") (C++20) [Operator precedence](language/operator_precedence "cpp/language/operator precedence") [Conversions](language/expressions#Conversions "cpp/language/expressions") [implicit](language/implicit_cast "cpp/language/implicit cast") – [explicit](language/explicit_cast "cpp/language/explicit cast") – [user-defined](language/cast_operator "cpp/language/cast operator") [`static_cast`](language/static_cast "cpp/language/static cast") – [`dynamic_cast`](language/dynamic_cast "cpp/language/dynamic cast") [`const_cast`](language/const_cast "cpp/language/const cast") – [`reinterpret_cast`](language/reinterpret_cast "cpp/language/reinterpret cast") [Literals](language/expressions#Literals "cpp/language/expressions") [boolean](language/bool_literal "cpp/language/bool literal") – [integer](language/integer_literal "cpp/language/integer literal") – [floating](language/floating_literal "cpp/language/floating literal") [character](language/character_literal "cpp/language/character literal") – [string](language/string_literal "cpp/language/string literal") [`nullptr`](language/nullptr "cpp/language/nullptr") (C++11) [user-defined](language/user_literal "cpp/language/user literal") (C++11). | **[Declarations](language/declarations "cpp/language/declarations")**. 
[Namespace declaration](language/namespace "cpp/language/namespace") [Namespace alias](language/namespace_alias "cpp/language/namespace alias") [References](language/reference "cpp/language/reference") – [Pointers](language/pointer "cpp/language/pointer") – [Arrays](language/array "cpp/language/array") [Structured bindings](language/structured_binding "cpp/language/structured binding") (C++17) [Enumerations and enumerators](language/enum "cpp/language/enum") [Storage duration and linkage](language/storage_duration "cpp/language/storage duration") [Translation-unit-local](language/tu_local "cpp/language/tu local") (C++20) [Language linkage](language/language_linkage "cpp/language/language linkage") [`inline` specifier](language/inline "cpp/language/inline") [Inline assembly](language/asm "cpp/language/asm") [`const`/`volatile`](language/cv "cpp/language/cv") – [`constexpr`](language/constexpr "cpp/language/constexpr") (C++11) [`consteval`](language/consteval "cpp/language/consteval") (C++20) – [`constinit`](language/constinit "cpp/language/constinit") (C++20) [`decltype`](language/decltype "cpp/language/decltype") (C++11) – [`auto`](language/auto "cpp/language/auto") (C++11) [`typedef`](language/typedef "cpp/language/typedef") – [Type alias](language/type_alias "cpp/language/type alias") (C++11) [Elaborated type specifiers](language/elaborated_type_specifier "cpp/language/elaborated type specifier") [Attributes](language/attributes "cpp/language/attributes") (C++11) – [`alignas`](language/alignas "cpp/language/alignas") (C++11) [`static_assert`](language/static_assert "cpp/language/static assert") (C++11). **[Initialization](language/initialization "cpp/language/initialization")**. [Default initialization](language/default_initialization "cpp/language/default initialization") [Value initialization](language/value_initialization "cpp/language/value initialization") [Copy initialization](language/copy_initialization "cpp/language/copy initialization") [Direct initialization](language/direct_initialization "cpp/language/direct initialization") [Aggregate initialization](language/aggregate_initialization "cpp/language/aggregate initialization") [List initialization](language/list_initialization "cpp/language/list initialization") (C++11) [Reference initialization](language/reference_initialization "cpp/language/reference initialization") [Static non-local initialization](language/initialization#Non-local_variables "cpp/language/initialization") [zero](language/zero_initialization "cpp/language/zero initialization") – [constant](language/constant_initialization "cpp/language/constant initialization") [Dynamic non-local initialization](language/initialization#Non-local_variables "cpp/language/initialization") [ordered](language/initialization#Non-local_variables "cpp/language/initialization") – [unordered](language/initialization#Non-local_variables "cpp/language/initialization") [Copy elision](language/copy_elision "cpp/language/copy elision"). **[Functions](language/functions "cpp/language/functions")**. 
[Function declaration](language/function "cpp/language/function") [Default arguments](language/default_arguments "cpp/language/default arguments") [Variadic arguments](language/variadic_arguments "cpp/language/variadic arguments") [Lambda expression](language/lambda "cpp/language/lambda") (C++11) [Argument-dependent lookup](language/adl "cpp/language/adl") [Overload resolution](language/overload_resolution "cpp/language/overload resolution") [Operator overloading](language/operators "cpp/language/operators") [Address of an overload set](language/overloaded_address "cpp/language/overloaded address") [Coroutines](language/coroutines "cpp/language/coroutines") (C++20). **[Statements](language/statements "cpp/language/statements")**. [`if`](language/if "cpp/language/if") – [`switch`](language/switch "cpp/language/switch") [`for`](language/for "cpp/language/for") – [range-`for`](language/range-for "cpp/language/range-for") (C++11) [`while`](language/while "cpp/language/while") – [`do`-`while`](language/do "cpp/language/do") [`continue`](language/continue "cpp/language/continue") – [`break`](language/break "cpp/language/break") – [`goto`](language/goto "cpp/language/goto") – [`return`](language/return "cpp/language/return") [`synchronized` and `atomic`](language/transactional_memory "cpp/language/transactional memory") (TM TS). | **[Classes](language/classes "cpp/language/classes")**. [Class types](language/class "cpp/language/class") – [Union types](language/union "cpp/language/union") [injected-class-name](language/injected-class-name "cpp/language/injected-class-name") [Data members](language/data_members "cpp/language/data members") – [Bit fields](language/bit_field "cpp/language/bit field") [Member functions](language/member_functions "cpp/language/member functions") – [The `this` pointer](language/this "cpp/language/this") [Static members](language/static "cpp/language/static") – [Nested classes](language/nested_types "cpp/language/nested types") [Derived class](language/derived_class "cpp/language/derived class") – [`using`-declaration](language/using_declaration "cpp/language/using declaration") [Empty base optimization](language/ebo "cpp/language/ebo") [Virtual function](language/virtual "cpp/language/virtual") – [Abstract class](language/abstract_class "cpp/language/abstract class") [`override`](language/override "cpp/language/override") (C++11) – [`final`](language/final "cpp/language/final") (C++11) [Member access](language/access "cpp/language/access") – [`friend`](language/friend "cpp/language/friend") [Constructors and member initializer lists](language/constructor "cpp/language/constructor") [Default constructor](language/default_constructor "cpp/language/default constructor") – [Destructor](language/destructor "cpp/language/destructor") [Copy constructor](language/copy_constructor "cpp/language/copy constructor") – [Copy assignment](language/copy_assignment "cpp/language/copy assignment") [Move constructor](language/move_constructor "cpp/language/move constructor") (C++11) [Move assignment](language/move_assignment "cpp/language/move assignment") (C++11) [Converting constructor](language/converting_constructor "cpp/language/converting constructor") – [`explicit` specifier](language/explicit "cpp/language/explicit"). **[Templates](language/templates "cpp/language/templates")**. 
[Template parameters and arguments](language/template_parameters "cpp/language/template parameters") [Class template](language/class_template "cpp/language/class template") – [Function template](language/function_template "cpp/language/function template") [Variable template](language/variable_template "cpp/language/variable template") (C++14) [Class member template](language/member_template "cpp/language/member template") [Template argument deduction](language/template_argument_deduction "cpp/language/template argument deduction") [Class template argument deduction](language/class_template_argument_deduction "cpp/language/class template argument deduction") (C++17) [Explicit specialization](language/template_specialization "cpp/language/template specialization") – [Partial specialization](language/partial_specialization "cpp/language/partial specialization") [Parameter packs](language/parameter_pack "cpp/language/parameter pack") (C++11) – [`sizeof...`](language/sizeof... "cpp/language/sizeof...") (C++11) [Fold-expressions](language/fold "cpp/language/fold") (C++17) [Dependent names](language/dependent_name "cpp/language/dependent name") – [SFINAE](language/sfinae "cpp/language/sfinae") [Constraints and concepts](language/constraints "cpp/language/constraints") (C++20) [Requires expression](language/requires "cpp/language/requires") (C++20). **[Exceptions](language/exceptions "cpp/language/exceptions")**. [`throw`-expression](language/throw "cpp/language/throw") [`try`-`catch` block](language/try_catch "cpp/language/try catch") [function-`try`-block](language/function-try-block "cpp/language/function-try-block") [`noexcept` specifier](language/noexcept_spec "cpp/language/noexcept spec") (C++11) [`noexcept` operator](language/noexcept "cpp/language/noexcept") (C++11) [Dynamic exception specification](language/except_spec "cpp/language/except spec") (until C++17). **Miscellaneous**. [History of C++](language/history "cpp/language/history") [Extending the namespace std](language/extending_std "cpp/language/extending std") [Acronyms](language/acronyms "cpp/language/acronyms"). **Idioms**. [Resource acquisition is initialization](language/raii "cpp/language/raii") [Rule of three/five/zero](language/rule_of_three "cpp/language/rule of three") [Pointer to implementation](language/pimpl "cpp/language/pimpl") [Zero-overhead principle](language/zero-overhead_principle "cpp/language/Zero-overhead principle"). | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/language "c/language") for C language constructs |
cpp C++17 C++17 ===== Following features were merged into C++17: * From the [File System TS](https://en.cppreference.com/w/cpp/experimental/fs "cpp/experimental/fs"): the [filesystem library](filesystem "cpp/filesystem"). * From the [Library fundamentals v1 TS](https://en.cppreference.com/w/cpp/experimental/lib_extensions "cpp/experimental/lib extensions"): features, including `[std::any](utility/any "cpp/utility/any")`, `[std::optional](utility/optional "cpp/utility/optional")`, `[std::string\_view](string/basic_string_view "cpp/string/basic string view")`, `[std::apply](utility/apply "cpp/utility/apply")`, [polymorphic allocators](memory#Allocators "cpp/memory"), [searchers](utility/functional#Searchers "cpp/utility/functional"). * From [Library fundamentals v2 TS](https://en.cppreference.com/w/cpp/experimental/lib_extensions_2 "cpp/experimental/lib extensions 2"): `[std::void\_t](types/void_t "cpp/types/void t")`, `[std::conjunction](types/conjunction "cpp/types/conjunction")`, `[std::disjunction](types/disjunction "cpp/types/disjunction")`, `[std::negation](types/negation "cpp/types/negation")`, `[std::not\_fn](utility/functional/not_fn "cpp/utility/functional/not fn")`, `[std::gcd](numeric/gcd "cpp/numeric/gcd")`, `[std::lcm](numeric/lcm "cpp/numeric/lcm")`. * From [Parallelism v1 TS](https://en.cppreference.com/w/cpp/experimental/parallelism "cpp/experimental/parallelism"): features, including [execution policies](algorithm/execution_policy_tag_t "cpp/algorithm/execution policy tag t"), `[std::reduce](algorithm/reduce "cpp/algorithm/reduce")`, `[std::inclusive\_scan](algorithm/inclusive_scan "cpp/algorithm/inclusive scan")`, `[std::exclusive\_scan](algorithm/exclusive_scan "cpp/algorithm/exclusive scan")`, but removing `exception_list`. * From [Mathematical special functions IS](https://en.cppreference.com/w/cpp/experimental/special_functions "cpp/experimental/special functions"): [mathematical special functions](numeric/special_functions "cpp/numeric/special functions"). * From C11: `[std::aligned\_alloc](memory/c/aligned_alloc "cpp/memory/c/aligned alloc")`, `[std::timespec\_get](chrono/c/timespec_get "cpp/chrono/c/timespec get")`. Obsolete --------- ### Removed `[std::auto\_ptr](memory/auto_ptr "cpp/memory/auto ptr")`, [deprecated function objects](utility/functional#Old_binders_and_adaptors "cpp/utility/functional"), `[std::random\_shuffle](algorithm/random_shuffle "cpp/algorithm/random shuffle")`, `[std::unexpected](error/unexpected "cpp/error/unexpected")`, the [obsolete iostreams aliases](io/ios_base#Deprecated_member_types "cpp/io/ios base"), [trigraphs](language/operator_alternative#Trigraphs_.28removed_in_C.2B.2B17.29 "cpp/language/operator alternative"), the [`register`](keyword/register "cpp/keyword/register") keyword, [`bool` increment](language/operator_incdec "cpp/language/operator incdec"), [dynamic exception specification](language/except_spec "cpp/language/except spec"). 
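Before the deprecations below, a minimal sketch (C++17) exercising a few of the library facilities merged from the TSs listed above, namely `std::optional`, `std::string_view`, and `std::gcd`/`std::lcm`; the helper `to_positive` is purely illustrative:

```
#include <iostream>
#include <numeric>
#include <optional>
#include <string_view>

// std::optional (from Library fundamentals v1) as a "maybe a value" return type.
std::optional<int> to_positive(int n)
{
    if (n > 0)
        return n;
    return std::nullopt;
}

int main()
{
    // std::string_view: a non-owning view over character data.
    std::string_view sv = "merged from the library TSs";
    std::cout << sv.substr(0, 6) << '\n';                          // "merged"

    // std::gcd / std::lcm (from Library fundamentals v2).
    std::cout << std::gcd(12, 18) << ' ' << std::lcm(4, 6) << '\n'; // "6 12"

    if (auto v = to_positive(7))
        std::cout << *v << '\n';                                    // "7"
}
```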
### Deprecated `[std::iterator](iterator/iterator "cpp/iterator/iterator")`, `[std::raw\_storage\_iterator](memory/raw_storage_iterator "cpp/memory/raw storage iterator")`, `[std::get\_temporary\_buffer](memory/get_temporary_buffer "cpp/memory/get temporary buffer")`, `[std::is\_literal\_type](types/is_literal_type "cpp/types/is literal type")`, `[std::result\_of](types/result_of "cpp/types/result of")`, all of [`<codecvt>`](header/codecvt "cpp/header/codecvt") New language features ---------------------- * [fold-expressions](language/fold "cpp/language/fold") * [class template argument deduction](language/class_template_argument_deduction "cpp/language/class template argument deduction") * non-type [template parameters](language/template_parameters "cpp/language/template parameters") declared with `auto` * compile-time [`if constexpr`](language/if "cpp/language/if") * [inline variables](language/inline "cpp/language/inline") * [structured bindings](language/structured_binding "cpp/language/structured binding") * initializers for [`if`](language/if "cpp/language/if") and [`switch`](language/switch "cpp/language/switch") * [`u8` character literal](language/character_literal "cpp/language/character literal") * simplified [nested namespaces](language/namespace "cpp/language/namespace") * [`using`-declaration](language/namespace "cpp/language/namespace") declaring multiple names * made [`noexcept`](language/noexcept_spec "cpp/language/noexcept spec") part of type system * new [order of evaluation](language/eval_order "cpp/language/eval order") rules * guaranteed [copy elision](language/copy_elision "cpp/language/copy elision") * [temporary materialization](language/implicit_conversion#Temporary_materialization "cpp/language/implicit conversion") * [lambda capture of `*this`](language/lambda#Lambda_capture "cpp/language/lambda") * [constexpr lambda](language/lambda "cpp/language/lambda") * [attribute namespaces](language/attributes "cpp/language/attributes") don't have to repeat * new [attributes](language/attributes "cpp/language/attributes"): + `[[[fallthrough](language/attributes/fallthrough "cpp/language/attributes/fallthrough")]]` + `[[[maybe\_unused](language/attributes/maybe_unused "cpp/language/attributes/maybe unused")]]` + `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` * [`__has_include`](preprocessor/include "cpp/preprocessor/include") New headers ------------ * [`<any>`](header/any "cpp/header/any") * [`<charconv>`](header/charconv "cpp/header/charconv") * [`<execution>`](header/execution "cpp/header/execution") * [`<filesystem>`](header/filesystem "cpp/header/filesystem") * [`<memory_resource>`](header/memory_resource "cpp/header/memory resource") * [`<optional>`](header/optional "cpp/header/optional") * [`<string_view>`](header/string_view "cpp/header/string view") * [`<variant>`](header/variant "cpp/header/variant") New library features --------------------- ### Utility types * `[std::tuple](utility/tuple "cpp/utility/tuple")`: + `[std::apply](utility/apply "cpp/utility/apply")` + `[std::make\_from\_tuple](utility/make_from_tuple "cpp/utility/make from tuple")` + [deduction guides](utility/tuple/deduction_guides "cpp/utility/tuple/deduction guides") * `[std::any](utility/any "cpp/utility/any")` * `[std::optional](utility/optional "cpp/utility/optional")` * `[std::variant](utility/variant "cpp/utility/variant")` * [searchers](utility/functional#Searchers "cpp/utility/functional") * `[std::as\_const](utility/as_const "cpp/utility/as const")` * 
`[std::not\_fn](utility/functional/not_fn "cpp/utility/functional/not fn")` ### Memory management * uninitialized memory algorithms + `[std::destroy\_at](memory/destroy_at "cpp/memory/destroy at")` + `[std::destroy](memory/destroy "cpp/memory/destroy")` + `[std::destroy\_n](memory/destroy_n "cpp/memory/destroy n")` + `[std::uninitialized\_move](memory/uninitialized_move "cpp/memory/uninitialized move")` + `[std::uninitialized\_value\_construct](memory/uninitialized_value_construct "cpp/memory/uninitialized value construct")` * [`weak_from_this`](memory/enable_shared_from_this/weak_from_this "cpp/memory/enable shared from this/weak from this") * `[std::pmr::memory\_resource](memory/memory_resource "cpp/memory/memory resource")` and `[std::polymorphic\_allocator](memory/polymorphic_allocator "cpp/memory/polymorphic allocator")` * `[std::aligned\_alloc](memory/c/aligned_alloc "cpp/memory/c/aligned alloc")` * transparent `[std::owner\_less](memory/owner_less "cpp/memory/owner less")` * array support for `[std::shared\_ptr](memory/shared_ptr "cpp/memory/shared ptr")` * [allocation functions](memory/new/operator_new "cpp/memory/new/operator new") with explicit alignment ### Compile-time programming * [`std::byte`](types/byte "cpp/types/byte") * `[std::conjunction](types/conjunction "cpp/types/conjunction")`/`[std::disjunction](types/disjunction "cpp/types/disjunction")`/`[std::negation](types/negation "cpp/types/negation")` * [type trait](types "cpp/types") variable templates (`*xxx*_+v`) * `[std::is\_swappable](types/is_swappable "cpp/types/is swappable")` * [`is_invocable`](types/is_invocable "cpp/types/is invocable") * [`is_aggregate`](types/is_aggregate "cpp/types/is aggregate") * `[std::has\_unique\_object\_representations](types/has_unique_object_representations "cpp/types/has unique object representations")` ### Algorithms * `[std::clamp](algorithm/clamp "cpp/algorithm/clamp")` * parallel algorithms and [execution policies](algorithm/execution_policy_tag_t "cpp/algorithm/execution policy tag t") * `[std::reduce](algorithm/reduce "cpp/algorithm/reduce")` * `[std::inclusive\_scan](algorithm/inclusive_scan "cpp/algorithm/inclusive scan")` * `[std::exclusive\_scan](algorithm/exclusive_scan "cpp/algorithm/exclusive scan")` * `[std::gcd](numeric/gcd "cpp/numeric/gcd")` * `[std::lcm](numeric/lcm "cpp/numeric/lcm")` ### Iterators and containers * map/set [`extract`](container/map/extract "cpp/container/map/extract") and map/set [`merge`](container/map/merge "cpp/container/map/merge") * map/unordered\_map [`try_emplace`](container/map/try_emplace "cpp/container/map/try emplace") and [`insert_or_assign`](container/map/insert_or_assign "cpp/container/map/insert or assign") * contiguous iterators ([LegacyContiguousIterator](named_req/contiguousiterator "cpp/named req/ContiguousIterator")) * non-member `[std::size](iterator/size "cpp/iterator/size")`/`[std::empty](iterator/empty "cpp/iterator/empty")`/`[std::data](iterator/data "cpp/iterator/data")` ### Numerics * [mathematical special functions](numeric/special_functions "cpp/numeric/special functions") * 3D `[std::hypot](numeric/math/hypot "cpp/numeric/math/hypot")` ### Other * [cache line interface](thread/hardware_destructive_interference_size "cpp/thread/hardware destructive interference size") * `[std::launder](utility/launder "cpp/utility/launder")` * `[std::uncaught\_exceptions](error/uncaught_exception "cpp/error/uncaught exception")` * [`std::to_chars`](utility/to_chars "cpp/utility/to chars")/[`std::from_chars`](utility/from_chars 
"cpp/utility/from chars") * [`std::atomic<T>::is_always_lock_free`](atomic/atomic/is_always_lock_free "cpp/atomic/atomic/is always lock free") * [`std::scoped_lock`](thread/scoped_lock "cpp/thread/scoped lock") * `[std::timespec\_get](chrono/c/timespec_get "cpp/chrono/c/timespec get")` * rounding functions for `[std::chrono::duration](chrono/duration "cpp/chrono/duration")` and `[std::chrono::time\_point](chrono/time_point "cpp/chrono/time point")` Defect reports --------------- | Defect Reports fixed in C++17 (290 core, 363 library) | | --- | | * [CWG#92](https://wg21.cmeerw.net/cwg/issue92) * [CWG#150](https://wg21.cmeerw.net/cwg/issue150) * [CWG#212](https://wg21.cmeerw.net/cwg/issue212) * [CWG#238](https://wg21.cmeerw.net/cwg/issue238) * [CWG#242](https://wg21.cmeerw.net/cwg/issue242) * [CWG#253](https://wg21.cmeerw.net/cwg/issue253) * [CWG#314](https://wg21.cmeerw.net/cwg/issue314) * [CWG#330](https://wg21.cmeerw.net/cwg/issue330) * [CWG#343](https://wg21.cmeerw.net/cwg/issue343) * [CWG#393](https://wg21.cmeerw.net/cwg/issue393) * [CWG#426](https://wg21.cmeerw.net/cwg/issue426) * [CWG#591](https://wg21.cmeerw.net/cwg/issue591) * [CWG#609](https://wg21.cmeerw.net/cwg/issue609) * [CWG#636](https://wg21.cmeerw.net/cwg/issue636) * [CWG#727](https://wg21.cmeerw.net/cwg/issue727) * [CWG#987](https://wg21.cmeerw.net/cwg/issue987) * [CWG#1021](https://wg21.cmeerw.net/cwg/issue1021) * [CWG#1116](https://wg21.cmeerw.net/cwg/issue1116) * [CWG#1247](https://wg21.cmeerw.net/cwg/issue1247) * [CWG#1274](https://wg21.cmeerw.net/cwg/issue1274) * [CWG#1284](https://wg21.cmeerw.net/cwg/issue1284) * [CWG#1292](https://wg21.cmeerw.net/cwg/issue1292) * [CWG#1309](https://wg21.cmeerw.net/cwg/issue1309) * [CWG#1315](https://wg21.cmeerw.net/cwg/issue1315) * [CWG#1338](https://wg21.cmeerw.net/cwg/issue1338) * [CWG#1343](https://wg21.cmeerw.net/cwg/issue1343) * [CWG#1351](https://wg21.cmeerw.net/cwg/issue1351) * [CWG#1356](https://wg21.cmeerw.net/cwg/issue1356) * [CWG#1391](https://wg21.cmeerw.net/cwg/issue1391) * [CWG#1395](https://wg21.cmeerw.net/cwg/issue1395) * [CWG#1397](https://wg21.cmeerw.net/cwg/issue1397) * [CWG#1446](https://wg21.cmeerw.net/cwg/issue1446) * [CWG#1465](https://wg21.cmeerw.net/cwg/issue1465) * [CWG#1467](https://wg21.cmeerw.net/cwg/issue1467) * [CWG#1484](https://wg21.cmeerw.net/cwg/issue1484) * [CWG#1490](https://wg21.cmeerw.net/cwg/issue1490) * [CWG#1492](https://wg21.cmeerw.net/cwg/issue1492) * [CWG#1496](https://wg21.cmeerw.net/cwg/issue1496) * [CWG#1518](https://wg21.cmeerw.net/cwg/issue1518) * [CWG#1552](https://wg21.cmeerw.net/cwg/issue1552) * [CWG#1558](https://wg21.cmeerw.net/cwg/issue1558) * [CWG#1561](https://wg21.cmeerw.net/cwg/issue1561) * [CWG#1571](https://wg21.cmeerw.net/cwg/issue1571) * [CWG#1572](https://wg21.cmeerw.net/cwg/issue1572) * [CWG#1573](https://wg21.cmeerw.net/cwg/issue1573) * [CWG#1589](https://wg21.cmeerw.net/cwg/issue1589) * [CWG#1591](https://wg21.cmeerw.net/cwg/issue1591) * [CWG#1596](https://wg21.cmeerw.net/cwg/issue1596) * [CWG#1600](https://wg21.cmeerw.net/cwg/issue1600) * [CWG#1603](https://wg21.cmeerw.net/cwg/issue1603) * [CWG#1614](https://wg21.cmeerw.net/cwg/issue1614) * [CWG#1615](https://wg21.cmeerw.net/cwg/issue1615) * [CWG#1622](https://wg21.cmeerw.net/cwg/issue1622) * [CWG#1630](https://wg21.cmeerw.net/cwg/issue1630) * [CWG#1631](https://wg21.cmeerw.net/cwg/issue1631) * [CWG#1633](https://wg21.cmeerw.net/cwg/issue1633) * [CWG#1638](https://wg21.cmeerw.net/cwg/issue1638) * [CWG#1639](https://wg21.cmeerw.net/cwg/issue1639) * 
[CWG#1645](https://wg21.cmeerw.net/cwg/issue1645) * [CWG#1652](https://wg21.cmeerw.net/cwg/issue1652) * [CWG#1653](https://wg21.cmeerw.net/cwg/issue1653) * [CWG#1657](https://wg21.cmeerw.net/cwg/issue1657) * [CWG#1672](https://wg21.cmeerw.net/cwg/issue1672) * [CWG#1677](https://wg21.cmeerw.net/cwg/issue1677) * [CWG#1683](https://wg21.cmeerw.net/cwg/issue1683) * [CWG#1686](https://wg21.cmeerw.net/cwg/issue1686) * [CWG#1694](https://wg21.cmeerw.net/cwg/issue1694) * [CWG#1696](https://wg21.cmeerw.net/cwg/issue1696) * [CWG#1705](https://wg21.cmeerw.net/cwg/issue1705) * [CWG#1708](https://wg21.cmeerw.net/cwg/issue1708) * [CWG#1710](https://wg21.cmeerw.net/cwg/issue1710) * [CWG#1712](https://wg21.cmeerw.net/cwg/issue1712) * [CWG#1715](https://wg21.cmeerw.net/cwg/issue1715) * [CWG#1719](https://wg21.cmeerw.net/cwg/issue1719) * [CWG#1722](https://wg21.cmeerw.net/cwg/issue1722) * [CWG#1734](https://wg21.cmeerw.net/cwg/issue1734) * [CWG#1736](https://wg21.cmeerw.net/cwg/issue1736) * [CWG#1744](https://wg21.cmeerw.net/cwg/issue1744) * [CWG#1748](https://wg21.cmeerw.net/cwg/issue1748) * [CWG#1750](https://wg21.cmeerw.net/cwg/issue1750) * [CWG#1751](https://wg21.cmeerw.net/cwg/issue1751) * [CWG#1752](https://wg21.cmeerw.net/cwg/issue1752) * [CWG#1753](https://wg21.cmeerw.net/cwg/issue1753) * [CWG#1756](https://wg21.cmeerw.net/cwg/issue1756) * [CWG#1757](https://wg21.cmeerw.net/cwg/issue1757) * [CWG#1758](https://wg21.cmeerw.net/cwg/issue1758) * [CWG#1766](https://wg21.cmeerw.net/cwg/issue1766) * [CWG#1774](https://wg21.cmeerw.net/cwg/issue1774) * [CWG#1776](https://wg21.cmeerw.net/cwg/issue1776) * [CWG#1777](https://wg21.cmeerw.net/cwg/issue1777) * [CWG#1779](https://wg21.cmeerw.net/cwg/issue1779) * [CWG#1780](https://wg21.cmeerw.net/cwg/issue1780) * [CWG#1782](https://wg21.cmeerw.net/cwg/issue1782) * [CWG#1784](https://wg21.cmeerw.net/cwg/issue1784) * [CWG#1788](https://wg21.cmeerw.net/cwg/issue1788) * [CWG#1791](https://wg21.cmeerw.net/cwg/issue1791) * [CWG#1793](https://wg21.cmeerw.net/cwg/issue1793) * [CWG#1794](https://wg21.cmeerw.net/cwg/issue1794) * [CWG#1795](https://wg21.cmeerw.net/cwg/issue1795) * [CWG#1796](https://wg21.cmeerw.net/cwg/issue1796) * [CWG#1797](https://wg21.cmeerw.net/cwg/issue1797) * [CWG#1799](https://wg21.cmeerw.net/cwg/issue1799) * [CWG#1800](https://wg21.cmeerw.net/cwg/issue1800) * [CWG#1802](https://wg21.cmeerw.net/cwg/issue1802) * [CWG#1804](https://wg21.cmeerw.net/cwg/issue1804) * [CWG#1805](https://wg21.cmeerw.net/cwg/issue1805) * [CWG#1806](https://wg21.cmeerw.net/cwg/issue1806) * [CWG#1807](https://wg21.cmeerw.net/cwg/issue1807) * [CWG#1809](https://wg21.cmeerw.net/cwg/issue1809) * [CWG#1810](https://wg21.cmeerw.net/cwg/issue1810) * [CWG#1811](https://wg21.cmeerw.net/cwg/issue1811) * [CWG#1812](https://wg21.cmeerw.net/cwg/issue1812) * [CWG#1813](https://wg21.cmeerw.net/cwg/issue1813) * [CWG#1814](https://wg21.cmeerw.net/cwg/issue1814) * [CWG#1815](https://wg21.cmeerw.net/cwg/issue1815) * [CWG#1816](https://wg21.cmeerw.net/cwg/issue1816) * [CWG#1819](https://wg21.cmeerw.net/cwg/issue1819) * [CWG#1823](https://wg21.cmeerw.net/cwg/issue1823) * [CWG#1824](https://wg21.cmeerw.net/cwg/issue1824) * [CWG#1825](https://wg21.cmeerw.net/cwg/issue1825) * [CWG#1830](https://wg21.cmeerw.net/cwg/issue1830) * [CWG#1832](https://wg21.cmeerw.net/cwg/issue1832) * [CWG#1834](https://wg21.cmeerw.net/cwg/issue1834) * [CWG#1838](https://wg21.cmeerw.net/cwg/issue1838) * [CWG#1843](https://wg21.cmeerw.net/cwg/issue1843) * [CWG#1846](https://wg21.cmeerw.net/cwg/issue1846) * 
[CWG#1847](https://wg21.cmeerw.net/cwg/issue1847) * [CWG#1848](https://wg21.cmeerw.net/cwg/issue1848) * [CWG#1850](https://wg21.cmeerw.net/cwg/issue1850) * [CWG#1851](https://wg21.cmeerw.net/cwg/issue1851) * [CWG#1852](https://wg21.cmeerw.net/cwg/issue1852) * [CWG#1858](https://wg21.cmeerw.net/cwg/issue1858) * [CWG#1860](https://wg21.cmeerw.net/cwg/issue1860) * [CWG#1861](https://wg21.cmeerw.net/cwg/issue1861) * [CWG#1863](https://wg21.cmeerw.net/cwg/issue1863) * [CWG#1865](https://wg21.cmeerw.net/cwg/issue1865) * [CWG#1866](https://wg21.cmeerw.net/cwg/issue1866) * [CWG#1870](https://wg21.cmeerw.net/cwg/issue1870) * [CWG#1872](https://wg21.cmeerw.net/cwg/issue1872) * [CWG#1873](https://wg21.cmeerw.net/cwg/issue1873) * [CWG#1874](https://wg21.cmeerw.net/cwg/issue1874) * [CWG#1875](https://wg21.cmeerw.net/cwg/issue1875) * [CWG#1877](https://wg21.cmeerw.net/cwg/issue1877) * [CWG#1878](https://wg21.cmeerw.net/cwg/issue1878) * [CWG#1881](https://wg21.cmeerw.net/cwg/issue1881) * [CWG#1882](https://wg21.cmeerw.net/cwg/issue1882) * [CWG#1885](https://wg21.cmeerw.net/cwg/issue1885) * [CWG#1886](https://wg21.cmeerw.net/cwg/issue1886) * [CWG#1887](https://wg21.cmeerw.net/cwg/issue1887) * [CWG#1888](https://wg21.cmeerw.net/cwg/issue1888) * [CWG#1891](https://wg21.cmeerw.net/cwg/issue1891) * [CWG#1892](https://wg21.cmeerw.net/cwg/issue1892) * [CWG#1895](https://wg21.cmeerw.net/cwg/issue1895) * [CWG#1899](https://wg21.cmeerw.net/cwg/issue1899) * [CWG#1902](https://wg21.cmeerw.net/cwg/issue1902) * [CWG#1903](https://wg21.cmeerw.net/cwg/issue1903) * [CWG#1909](https://wg21.cmeerw.net/cwg/issue1909) * [CWG#1911](https://wg21.cmeerw.net/cwg/issue1911) * [CWG#1916](https://wg21.cmeerw.net/cwg/issue1916) * [CWG#1920](https://wg21.cmeerw.net/cwg/issue1920) * [CWG#1922](https://wg21.cmeerw.net/cwg/issue1922) * [CWG#1925](https://wg21.cmeerw.net/cwg/issue1925) * [CWG#1926](https://wg21.cmeerw.net/cwg/issue1926) * [CWG#1929](https://wg21.cmeerw.net/cwg/issue1929) * [CWG#1930](https://wg21.cmeerw.net/cwg/issue1930) * [CWG#1932](https://wg21.cmeerw.net/cwg/issue1932) * [CWG#1940](https://wg21.cmeerw.net/cwg/issue1940) * [CWG#1941](https://wg21.cmeerw.net/cwg/issue1941) * [CWG#1942](https://wg21.cmeerw.net/cwg/issue1942) * [CWG#1946](https://wg21.cmeerw.net/cwg/issue1946) * [CWG#1949](https://wg21.cmeerw.net/cwg/issue1949) * [CWG#1951](https://wg21.cmeerw.net/cwg/issue1951) * [CWG#1952](https://wg21.cmeerw.net/cwg/issue1952) * [CWG#1955](https://wg21.cmeerw.net/cwg/issue1955) * [CWG#1956](https://wg21.cmeerw.net/cwg/issue1956) * [CWG#1958](https://wg21.cmeerw.net/cwg/issue1958) * [CWG#1959](https://wg21.cmeerw.net/cwg/issue1959) * [CWG#1961](https://wg21.cmeerw.net/cwg/issue1961) * [CWG#1963](https://wg21.cmeerw.net/cwg/issue1963) * [CWG#1966](https://wg21.cmeerw.net/cwg/issue1966) * [CWG#1967](https://wg21.cmeerw.net/cwg/issue1967) * [CWG#1971](https://wg21.cmeerw.net/cwg/issue1971) * [CWG#1975](https://wg21.cmeerw.net/cwg/issue1975) * [CWG#1978](https://wg21.cmeerw.net/cwg/issue1978) * [CWG#1981](https://wg21.cmeerw.net/cwg/issue1981) * [CWG#1988](https://wg21.cmeerw.net/cwg/issue1988) * [CWG#1990](https://wg21.cmeerw.net/cwg/issue1990) * [CWG#1991](https://wg21.cmeerw.net/cwg/issue1991) * [CWG#1992](https://wg21.cmeerw.net/cwg/issue1992) * [CWG#1995](https://wg21.cmeerw.net/cwg/issue1995) * [CWG#1999](https://wg21.cmeerw.net/cwg/issue1999) * [CWG#2000](https://wg21.cmeerw.net/cwg/issue2000) * [CWG#2001](https://wg21.cmeerw.net/cwg/issue2001) * [CWG#2004](https://wg21.cmeerw.net/cwg/issue2004) * 
[CWG#2006](https://wg21.cmeerw.net/cwg/issue2006) * [CWG#2008](https://wg21.cmeerw.net/cwg/issue2008) * [CWG#2010](https://wg21.cmeerw.net/cwg/issue2010) * [CWG#2011](https://wg21.cmeerw.net/cwg/issue2011) * [CWG#2012](https://wg21.cmeerw.net/cwg/issue2012) * [CWG#2015](https://wg21.cmeerw.net/cwg/issue2015) * [CWG#2016](https://wg21.cmeerw.net/cwg/issue2016) * [CWG#2017](https://wg21.cmeerw.net/cwg/issue2017) * [CWG#2019](https://wg21.cmeerw.net/cwg/issue2019) * [CWG#2022](https://wg21.cmeerw.net/cwg/issue2022) * [CWG#2024](https://wg21.cmeerw.net/cwg/issue2024) * [CWG#2026](https://wg21.cmeerw.net/cwg/issue2026) * [CWG#2027](https://wg21.cmeerw.net/cwg/issue2027) * [CWG#2031](https://wg21.cmeerw.net/cwg/issue2031) * [CWG#2032](https://wg21.cmeerw.net/cwg/issue2032) * [CWG#2033](https://wg21.cmeerw.net/cwg/issue2033) * [CWG#2038](https://wg21.cmeerw.net/cwg/issue2038) * [CWG#2039](https://wg21.cmeerw.net/cwg/issue2039) * [CWG#2040](https://wg21.cmeerw.net/cwg/issue2040) * [CWG#2041](https://wg21.cmeerw.net/cwg/issue2041) * [CWG#2044](https://wg21.cmeerw.net/cwg/issue2044) * [CWG#2046](https://wg21.cmeerw.net/cwg/issue2046) * [CWG#2047](https://wg21.cmeerw.net/cwg/issue2047) * [CWG#2052](https://wg21.cmeerw.net/cwg/issue2052) * [CWG#2061](https://wg21.cmeerw.net/cwg/issue2061) * [CWG#2063](https://wg21.cmeerw.net/cwg/issue2063) * [CWG#2064](https://wg21.cmeerw.net/cwg/issue2064) * [CWG#2066](https://wg21.cmeerw.net/cwg/issue2066) * [CWG#2068](https://wg21.cmeerw.net/cwg/issue2068) * [CWG#2069](https://wg21.cmeerw.net/cwg/issue2069) * [CWG#2071](https://wg21.cmeerw.net/cwg/issue2071) * [CWG#2075](https://wg21.cmeerw.net/cwg/issue2075) * [CWG#2076](https://wg21.cmeerw.net/cwg/issue2076) * [CWG#2079](https://wg21.cmeerw.net/cwg/issue2079) * [CWG#2082](https://wg21.cmeerw.net/cwg/issue2082) * [CWG#2084](https://wg21.cmeerw.net/cwg/issue2084) * [CWG#2085](https://wg21.cmeerw.net/cwg/issue2085) * [CWG#2091](https://wg21.cmeerw.net/cwg/issue2091) * [CWG#2093](https://wg21.cmeerw.net/cwg/issue2093) * [CWG#2094](https://wg21.cmeerw.net/cwg/issue2094) * [CWG#2095](https://wg21.cmeerw.net/cwg/issue2095) * [CWG#2096](https://wg21.cmeerw.net/cwg/issue2096) * [CWG#2098](https://wg21.cmeerw.net/cwg/issue2098) * [CWG#2099](https://wg21.cmeerw.net/cwg/issue2099) * [CWG#2100](https://wg21.cmeerw.net/cwg/issue2100) * [CWG#2101](https://wg21.cmeerw.net/cwg/issue2101) * [CWG#2104](https://wg21.cmeerw.net/cwg/issue2104) * [CWG#2106](https://wg21.cmeerw.net/cwg/issue2106) * [CWG#2107](https://wg21.cmeerw.net/cwg/issue2107) * [CWG#2109](https://wg21.cmeerw.net/cwg/issue2109) * [CWG#2113](https://wg21.cmeerw.net/cwg/issue2113) * [CWG#2120](https://wg21.cmeerw.net/cwg/issue2120) * [CWG#2122](https://wg21.cmeerw.net/cwg/issue2122) * [CWG#2124](https://wg21.cmeerw.net/cwg/issue2124) * [CWG#2129](https://wg21.cmeerw.net/cwg/issue2129) * [CWG#2130](https://wg21.cmeerw.net/cwg/issue2130) * [CWG#2137](https://wg21.cmeerw.net/cwg/issue2137) * [CWG#2140](https://wg21.cmeerw.net/cwg/issue2140) * [CWG#2141](https://wg21.cmeerw.net/cwg/issue2141) * [CWG#2143](https://wg21.cmeerw.net/cwg/issue2143) * [CWG#2145](https://wg21.cmeerw.net/cwg/issue2145) * [CWG#2146](https://wg21.cmeerw.net/cwg/issue2146) * [CWG#2147](https://wg21.cmeerw.net/cwg/issue2147) * [CWG#2153](https://wg21.cmeerw.net/cwg/issue2153) * [CWG#2154](https://wg21.cmeerw.net/cwg/issue2154) * [CWG#2155](https://wg21.cmeerw.net/cwg/issue2155) * [CWG#2156](https://wg21.cmeerw.net/cwg/issue2156) * [CWG#2157](https://wg21.cmeerw.net/cwg/issue2157) * 
[CWG#2163](https://wg21.cmeerw.net/cwg/issue2163) * [CWG#2167](https://wg21.cmeerw.net/cwg/issue2167) * [CWG#2171](https://wg21.cmeerw.net/cwg/issue2171) * [CWG#2174](https://wg21.cmeerw.net/cwg/issue2174) * [CWG#2175](https://wg21.cmeerw.net/cwg/issue2175) * [CWG#2176](https://wg21.cmeerw.net/cwg/issue2176) * [CWG#2180](https://wg21.cmeerw.net/cwg/issue2180) * [CWG#2184](https://wg21.cmeerw.net/cwg/issue2184) * [CWG#2191](https://wg21.cmeerw.net/cwg/issue2191) * [CWG#2196](https://wg21.cmeerw.net/cwg/issue2196) * [CWG#2198](https://wg21.cmeerw.net/cwg/issue2198) * [CWG#2201](https://wg21.cmeerw.net/cwg/issue2201) * [CWG#2205](https://wg21.cmeerw.net/cwg/issue2205) * [CWG#2206](https://wg21.cmeerw.net/cwg/issue2206) * [CWG#2211](https://wg21.cmeerw.net/cwg/issue2211) * [CWG#2214](https://wg21.cmeerw.net/cwg/issue2214) * [CWG#2218](https://wg21.cmeerw.net/cwg/issue2218) * [CWG#2220](https://wg21.cmeerw.net/cwg/issue2220) * [CWG#2224](https://wg21.cmeerw.net/cwg/issue2224) * [CWG#2247](https://wg21.cmeerw.net/cwg/issue2247) * [CWG#2248](https://wg21.cmeerw.net/cwg/issue2248) * [CWG#2251](https://wg21.cmeerw.net/cwg/issue2251) * [CWG#2259](https://wg21.cmeerw.net/cwg/issue2259) * [CWG#2262](https://wg21.cmeerw.net/cwg/issue2262) * [CWG#2268](https://wg21.cmeerw.net/cwg/issue2268) * [CWG#2271](https://wg21.cmeerw.net/cwg/issue2271) * [CWG#2272](https://wg21.cmeerw.net/cwg/issue2272) * [CWG#2276](https://wg21.cmeerw.net/cwg/issue2276) * [#LWG839](http://wg21.link/lwg839) * [#LWG1041](http://wg21.link/lwg1041) * [#LWG1150](http://wg21.link/lwg1150) * [#LWG1169](http://wg21.link/lwg1169) * [#LWG1201](http://wg21.link/lwg1201) * [#LWG1526](http://wg21.link/lwg1526) * [#LWG2016](http://wg21.link/lwg2016) * [#LWG2051](http://wg21.link/lwg2051) * [#LWG2059](http://wg21.link/lwg2059) * [#LWG2062](http://wg21.link/lwg2062) * [#LWG2063](http://wg21.link/lwg2063) * [#LWG2072](http://wg21.link/lwg2072) * [#LWG2076](http://wg21.link/lwg2076) * [#LWG2101](http://wg21.link/lwg2101) * [#LWG2106](http://wg21.link/lwg2106) * [#LWG2108](http://wg21.link/lwg2108) * [#LWG2111](http://wg21.link/lwg2111) * [#LWG2118](http://wg21.link/lwg2118) * [#LWG2119](http://wg21.link/lwg2119) * [#LWG2127](http://wg21.link/lwg2127) * [#LWG2129](http://wg21.link/lwg2129) * [#LWG2133](http://wg21.link/lwg2133) * [#LWG2156](http://wg21.link/lwg2156) * [#LWG2160](http://wg21.link/lwg2160) * [#LWG2166](http://wg21.link/lwg2166) * [#LWG2168](http://wg21.link/lwg2168) * [#LWG2170](http://wg21.link/lwg2170) * [#LWG2179](http://wg21.link/lwg2179) * [#LWG2181](http://wg21.link/lwg2181) * [#LWG2192](http://wg21.link/lwg2192) * [#LWG2208](http://wg21.link/lwg2208) * [#LWG2212](http://wg21.link/lwg2212) * [#LWG2217](http://wg21.link/lwg2217) * [#LWG2218](http://wg21.link/lwg2218) * [#LWG2219](http://wg21.link/lwg2219) * [#LWG2221](http://wg21.link/lwg2221) * [#LWG2223](http://wg21.link/lwg2223) * [#LWG2224](http://wg21.link/lwg2224) * [#LWG2228](http://wg21.link/lwg2228) * [#LWG2230](http://wg21.link/lwg2230) * [#LWG2232](http://wg21.link/lwg2232) * [#LWG2233](http://wg21.link/lwg2233) * [#LWG2234](http://wg21.link/lwg2234) * [#LWG2239](http://wg21.link/lwg2239) * [#LWG2241](http://wg21.link/lwg2241) * [#LWG2244](http://wg21.link/lwg2244) * [#LWG2245](http://wg21.link/lwg2245) * [#LWG2250](http://wg21.link/lwg2250) * [#LWG2259](http://wg21.link/lwg2259) * [#LWG2260](http://wg21.link/lwg2260) * [#LWG2261](http://wg21.link/lwg2261) * [#LWG2266](http://wg21.link/lwg2266) * [#LWG2273](http://wg21.link/lwg2273) * 
[#LWG2274](http://wg21.link/lwg2274) * [#LWG2276](http://wg21.link/lwg2276) * [#LWG2294](http://wg21.link/lwg2294) * [#LWG2296](http://wg21.link/lwg2296) * [#LWG2309](http://wg21.link/lwg2309) * [#LWG2310](http://wg21.link/lwg2310) * [#LWG2312](http://wg21.link/lwg2312) * [#LWG2325](http://wg21.link/lwg2325) * [#LWG2328](http://wg21.link/lwg2328) * [#LWG2336](http://wg21.link/lwg2336) * [#LWG2340](http://wg21.link/lwg2340) * [#LWG2343](http://wg21.link/lwg2343) * [#LWG2353](http://wg21.link/lwg2353) * [#LWG2354](http://wg21.link/lwg2354) * [#LWG2361](http://wg21.link/lwg2361) * [#LWG2364](http://wg21.link/lwg2364) * [#LWG2365](http://wg21.link/lwg2365) * [#LWG2367](http://wg21.link/lwg2367) * [#LWG2368](http://wg21.link/lwg2368) * [#LWG2369](http://wg21.link/lwg2369) * [#LWG2370](http://wg21.link/lwg2370) * [#LWG2376](http://wg21.link/lwg2376) * [#LWG2377](http://wg21.link/lwg2377) * [#LWG2378](http://wg21.link/lwg2378) * [#LWG2380](http://wg21.link/lwg2380) * [#LWG2384](http://wg21.link/lwg2384) * [#LWG2385](http://wg21.link/lwg2385) * [#LWG2387](http://wg21.link/lwg2387) * [#LWG2391](http://wg21.link/lwg2391) * [#LWG2393](http://wg21.link/lwg2393) * [#LWG2394](http://wg21.link/lwg2394) * [#LWG2396](http://wg21.link/lwg2396) * [#LWG2397](http://wg21.link/lwg2397) * [#LWG2399](http://wg21.link/lwg2399) * [#LWG2400](http://wg21.link/lwg2400) * [#LWG2401](http://wg21.link/lwg2401) * [#LWG2403](http://wg21.link/lwg2403) * [#LWG2404](http://wg21.link/lwg2404) * [#LWG2406](http://wg21.link/lwg2406) * [#LWG2407](http://wg21.link/lwg2407) * [#LWG2408](http://wg21.link/lwg2408) * [#LWG2411](http://wg21.link/lwg2411) * [#LWG2415](http://wg21.link/lwg2415) * [#LWG2419](http://wg21.link/lwg2419) * [#LWG2420](http://wg21.link/lwg2420) * [#LWG2422](http://wg21.link/lwg2422) * [#LWG2424](http://wg21.link/lwg2424) * [#LWG2425](http://wg21.link/lwg2425) * [#LWG2426](http://wg21.link/lwg2426) * [#LWG2427](http://wg21.link/lwg2427) * [#LWG2428](http://wg21.link/lwg2428) * [#LWG2433](http://wg21.link/lwg2433) * [#LWG2434](http://wg21.link/lwg2434) * [#LWG2435](http://wg21.link/lwg2435) * [#LWG2436](http://wg21.link/lwg2436) * [#LWG2437](http://wg21.link/lwg2437) * [#LWG2438](http://wg21.link/lwg2438) * [#LWG2439](http://wg21.link/lwg2439) * [#LWG2440](http://wg21.link/lwg2440) * [#LWG2441](http://wg21.link/lwg2441) * [#LWG2442](http://wg21.link/lwg2442) * [#LWG2443](http://wg21.link/lwg2443) * [#LWG2445](http://wg21.link/lwg2445) * [#LWG2447](http://wg21.link/lwg2447) * [#LWG2448](http://wg21.link/lwg2448) * [#LWG2450](http://wg21.link/lwg2450) * [#LWG2454](http://wg21.link/lwg2454) * [#LWG2455](http://wg21.link/lwg2455) * [#LWG2456](http://wg21.link/lwg2456) * [#LWG2458](http://wg21.link/lwg2458) * [#LWG2459](http://wg21.link/lwg2459) * [#LWG2460](http://wg21.link/lwg2460) * [#LWG2462](http://wg21.link/lwg2462) * [#LWG2464](http://wg21.link/lwg2464) * [#LWG2465](http://wg21.link/lwg2465) * [#LWG2466](http://wg21.link/lwg2466) * [#LWG2467](http://wg21.link/lwg2467) * [#LWG2468](http://wg21.link/lwg2468) * [#LWG2469](http://wg21.link/lwg2469) * [#LWG2470](http://wg21.link/lwg2470) * [#LWG2473](http://wg21.link/lwg2473) * [#LWG2475](http://wg21.link/lwg2475) * [#LWG2476](http://wg21.link/lwg2476) * [#LWG2477](http://wg21.link/lwg2477) * [#LWG2482](http://wg21.link/lwg2482) * [#LWG2483](http://wg21.link/lwg2483) * [#LWG2484](http://wg21.link/lwg2484) * [#LWG2485](http://wg21.link/lwg2485) * [#LWG2486](http://wg21.link/lwg2486) * [#LWG2487](http://wg21.link/lwg2487) * [#LWG2488](http://wg21.link/lwg2488) * 
[#LWG2489](http://wg21.link/lwg2489) * [#LWG2492](http://wg21.link/lwg2492) * [#LWG2495](http://wg21.link/lwg2495) * [#LWG2501](http://wg21.link/lwg2501) * [#LWG2502](http://wg21.link/lwg2502) * [#LWG2503](http://wg21.link/lwg2503) * [#LWG2505](http://wg21.link/lwg2505) * [#LWG2510](http://wg21.link/lwg2510) * [#LWG2514](http://wg21.link/lwg2514) * [#LWG2519](http://wg21.link/lwg2519) * [#LWG2520](http://wg21.link/lwg2520) * [#LWG2523](http://wg21.link/lwg2523) * [#LWG2529](http://wg21.link/lwg2529) * [#LWG2531](http://wg21.link/lwg2531) * [#LWG2534](http://wg21.link/lwg2534) * [#LWG2536](http://wg21.link/lwg2536) * [#LWG2537](http://wg21.link/lwg2537) * [#LWG2540](http://wg21.link/lwg2540) * [#LWG2542](http://wg21.link/lwg2542) * [#LWG2543](http://wg21.link/lwg2543) * [#LWG2544](http://wg21.link/lwg2544) * [#LWG2545](http://wg21.link/lwg2545) * [#LWG2548](http://wg21.link/lwg2548) * [#LWG2549](http://wg21.link/lwg2549) * [#LWG2550](http://wg21.link/lwg2550) * [#LWG2554](http://wg21.link/lwg2554) * [#LWG2556](http://wg21.link/lwg2556) * [#LWG2557](http://wg21.link/lwg2557) * [#LWG2559](http://wg21.link/lwg2559) * [#LWG2560](http://wg21.link/lwg2560) * [#LWG2562](http://wg21.link/lwg2562) * [#LWG2565](http://wg21.link/lwg2565) * [#LWG2566](http://wg21.link/lwg2566) * [#LWG2567](http://wg21.link/lwg2567) * [#LWG2569](http://wg21.link/lwg2569) * [#LWG2571](http://wg21.link/lwg2571) * [#LWG2572](http://wg21.link/lwg2572) * [#LWG2576](http://wg21.link/lwg2576) * [#LWG2577](http://wg21.link/lwg2577) * [#LWG2578](http://wg21.link/lwg2578) * [#LWG2579](http://wg21.link/lwg2579) * [#LWG2581](http://wg21.link/lwg2581) * [#LWG2582](http://wg21.link/lwg2582) * [#LWG2583](http://wg21.link/lwg2583) * [#LWG2584](http://wg21.link/lwg2584) * [#LWG2585](http://wg21.link/lwg2585) * [#LWG2586](http://wg21.link/lwg2586) * [#LWG2587](http://wg21.link/lwg2587) * [#LWG2589](http://wg21.link/lwg2589) * [#LWG2590](http://wg21.link/lwg2590) * [#LWG2591](http://wg21.link/lwg2591) * [#LWG2596](http://wg21.link/lwg2596) * [#LWG2598](http://wg21.link/lwg2598) * [#LWG2663](http://wg21.link/lwg2663) * [#LWG2664](http://wg21.link/lwg2664) * [#LWG2665](http://wg21.link/lwg2665) * [#LWG2667](http://wg21.link/lwg2667) * [#LWG2669](http://wg21.link/lwg2669) * [#LWG2670](http://wg21.link/lwg2670) * [#LWG2671](http://wg21.link/lwg2671) * [#LWG2672](http://wg21.link/lwg2672) * [#LWG2673](http://wg21.link/lwg2673) * [#LWG2674](http://wg21.link/lwg2674) * [#LWG2676](http://wg21.link/lwg2676) * [#LWG2677](http://wg21.link/lwg2677) * [#LWG2678](http://wg21.link/lwg2678) * [#LWG2679](http://wg21.link/lwg2679) * [#LWG2680](http://wg21.link/lwg2680) * [#LWG2681](http://wg21.link/lwg2681) * [#LWG2683](http://wg21.link/lwg2683) * [#LWG2684](http://wg21.link/lwg2684) * [#LWG2685](http://wg21.link/lwg2685) * [#LWG2686](http://wg21.link/lwg2686) * [#LWG2687](http://wg21.link/lwg2687) * [#LWG2688](http://wg21.link/lwg2688) * [#LWG2689](http://wg21.link/lwg2689) * [#LWG2694](http://wg21.link/lwg2694) * [#LWG2696](http://wg21.link/lwg2696) * [#LWG2698](http://wg21.link/lwg2698) * [#LWG2699](http://wg21.link/lwg2699) * [#LWG2704](http://wg21.link/lwg2704) * [#LWG2706](http://wg21.link/lwg2706) * [#LWG2707](http://wg21.link/lwg2707) * [#LWG2709](http://wg21.link/lwg2709) * [#LWG2710](http://wg21.link/lwg2710) * [#LWG2711](http://wg21.link/lwg2711) * [#LWG2712](http://wg21.link/lwg2712) * [#LWG2715](http://wg21.link/lwg2715) * [#LWG2716](http://wg21.link/lwg2716) * [#LWG2718](http://wg21.link/lwg2718) * [#LWG2719](http://wg21.link/lwg2719) * 
[#LWG2720](http://wg21.link/lwg2720) * [#LWG2721](http://wg21.link/lwg2721) * [#LWG2722](http://wg21.link/lwg2722) * [#LWG2723](http://wg21.link/lwg2723) * [#LWG2724](http://wg21.link/lwg2724) * [#LWG2725](http://wg21.link/lwg2725) * [#LWG2726](http://wg21.link/lwg2726) * [#LWG2727](http://wg21.link/lwg2727) * [#LWG2728](http://wg21.link/lwg2728) * [#LWG2729](http://wg21.link/lwg2729) * [#LWG2732](http://wg21.link/lwg2732) * [#LWG2734](http://wg21.link/lwg2734) * [#LWG2735](http://wg21.link/lwg2735) * [#LWG2736](http://wg21.link/lwg2736) * [#LWG2738](http://wg21.link/lwg2738) * [#LWG2739](http://wg21.link/lwg2739) * [#LWG2740](http://wg21.link/lwg2740) * [#LWG2742](http://wg21.link/lwg2742) * [#LWG2744](http://wg21.link/lwg2744) * [#LWG2747](http://wg21.link/lwg2747) * [#LWG2748](http://wg21.link/lwg2748) * [#LWG2749](http://wg21.link/lwg2749) * [#LWG2752](http://wg21.link/lwg2752) * [#LWG2753](http://wg21.link/lwg2753) * [#LWG2754](http://wg21.link/lwg2754) * [#LWG2755](http://wg21.link/lwg2755) * [#LWG2756](http://wg21.link/lwg2756) * [#LWG2757](http://wg21.link/lwg2757) * [#LWG2758](http://wg21.link/lwg2758) * [#LWG2759](http://wg21.link/lwg2759) * [#LWG2760](http://wg21.link/lwg2760) * [#LWG2763](http://wg21.link/lwg2763) * [#LWG2765](http://wg21.link/lwg2765) * [#LWG2767](http://wg21.link/lwg2767) * [#LWG2768](http://wg21.link/lwg2768) * [#LWG2769](http://wg21.link/lwg2769) * [#LWG2770](http://wg21.link/lwg2770) * [#LWG2771](http://wg21.link/lwg2771) * [#LWG2773](http://wg21.link/lwg2773) * [#LWG2776](http://wg21.link/lwg2776) * [#LWG2777](http://wg21.link/lwg2777) * [#LWG2778](http://wg21.link/lwg2778) * [#LWG2781](http://wg21.link/lwg2781) * [#LWG2782](http://wg21.link/lwg2782) * [#LWG2784](http://wg21.link/lwg2784) * [#LWG2785](http://wg21.link/lwg2785) * [#LWG2786](http://wg21.link/lwg2786) * [#LWG2787](http://wg21.link/lwg2787) * [#LWG2788](http://wg21.link/lwg2788) * [#LWG2789](http://wg21.link/lwg2789) * [#LWG2790](http://wg21.link/lwg2790) * [#LWG2791](http://wg21.link/lwg2791) * [#LWG2793](http://wg21.link/lwg2793) * [#LWG2794](http://wg21.link/lwg2794) * [#LWG2795](http://wg21.link/lwg2795) * [#LWG2796](http://wg21.link/lwg2796) * [#LWG2798](http://wg21.link/lwg2798) * [#LWG2799](http://wg21.link/lwg2799) * [#LWG2801](http://wg21.link/lwg2801) * [#LWG2802](http://wg21.link/lwg2802) * [#LWG2803](http://wg21.link/lwg2803) * [#LWG2804](http://wg21.link/lwg2804) * [#LWG2805](http://wg21.link/lwg2805) * [#LWG2806](http://wg21.link/lwg2806) * [#LWG2807](http://wg21.link/lwg2807) * [#LWG2809](http://wg21.link/lwg2809) * [#LWG2810](http://wg21.link/lwg2810) * [#LWG2812](http://wg21.link/lwg2812) * [#LWG2817](http://wg21.link/lwg2817) * [#LWG2824](http://wg21.link/lwg2824) * [#LWG2825](http://wg21.link/lwg2825) * [#LWG2826](http://wg21.link/lwg2826) * [#LWG2830](http://wg21.link/lwg2830) * [#LWG2834](http://wg21.link/lwg2834) * [#LWG2835](http://wg21.link/lwg2835) * [#LWG2837](http://wg21.link/lwg2837) * [#LWG2838](http://wg21.link/lwg2838) * [#LWG2842](http://wg21.link/lwg2842) * [#LWG2850](http://wg21.link/lwg2850) * [#LWG2853](http://wg21.link/lwg2853) * [#LWG2855](http://wg21.link/lwg2855) * [#LWG2857](http://wg21.link/lwg2857) * [#LWG2861](http://wg21.link/lwg2861) * [#LWG2862](http://wg21.link/lwg2862) * [#LWG2863](http://wg21.link/lwg2863) * [#LWG2864](http://wg21.link/lwg2864) * [#LWG2866](http://wg21.link/lwg2866) * [#LWG2867](http://wg21.link/lwg2867) * [#LWG2868](http://wg21.link/lwg2868) * [#LWG2869](http://wg21.link/lwg2869) * [#LWG2872](http://wg21.link/lwg2872) * 
[#LWG2873](http://wg21.link/lwg2873) * [#LWG2874](http://wg21.link/lwg2874) * [#LWG2875](http://wg21.link/lwg2875) * [#LWG2876](http://wg21.link/lwg2876) * [#LWG2877](http://wg21.link/lwg2877) * [#LWG2878](http://wg21.link/lwg2878) * [#LWG2879](http://wg21.link/lwg2879) * [#LWG2880](http://wg21.link/lwg2880) * [#LWG2882](http://wg21.link/lwg2882) * [#LWG2887](http://wg21.link/lwg2887) * [#LWG2888](http://wg21.link/lwg2888) * [#LWG2889](http://wg21.link/lwg2889) * [#LWG2890](http://wg21.link/lwg2890) * [#LWG2895](http://wg21.link/lwg2895) * [#LWG2900](http://wg21.link/lwg2900) * [#LWG2901](http://wg21.link/lwg2901) * [#LWG2903](http://wg21.link/lwg2903) * [#LWG2904](http://wg21.link/lwg2904) * [#LWG2905](http://wg21.link/lwg2905) * [#LWG2908](http://wg21.link/lwg2908) * [#LWG2911](http://wg21.link/lwg2911) * [#LWG2912](http://wg21.link/lwg2912) * [#LWG2913](http://wg21.link/lwg2913) * [#LWG2914](http://wg21.link/lwg2914) * [#LWG2915](http://wg21.link/lwg2915) * [#LWG2917](http://wg21.link/lwg2917) * [#LWG2918](http://wg21.link/lwg2918) * [#LWG2919](http://wg21.link/lwg2919) * [#LWG2920](http://wg21.link/lwg2920) * [#LWG2921](http://wg21.link/lwg2921) * [#LWG2924](http://wg21.link/lwg2924) * [#LWG2925](http://wg21.link/lwg2925) * [#LWG2926](http://wg21.link/lwg2926) * [#LWG2927](http://wg21.link/lwg2927) * [#LWG2928](http://wg21.link/lwg2928) * [#LWG2934](http://wg21.link/lwg2934) * [#LWG2956](http://wg21.link/lwg2956) | Compiler support ----------------- Main Article: [C++17 compiler support](compiler_support#C.2B.2B17_features "cpp/compiler support"). ### C++17 core language features | C++17 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | DR: New `auto` rules for direct-list-initialization | [N3922](https://wg21.link/N3922) | 5 | 3.8 | 19.0 (2015)\* | Yes | 4.10.1 | 17.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | [`static_assert`](language/static_assert "cpp/language/static assert") with no message | [N3928](https://wg21.link/N3928) | 6 | 2.5 | 19.10\* | Yes | 4.12 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | `typename` in a template template parameter | [N4051](https://wg21.link/N4051) | 5 | 3.5 | 19.0 (2015)\* | Yes | 4.10.1 | 17.0 | | | 10.3 | 11.0 | 17.7 | Yes\* | | Removing [trigraphs](language/operator_alternative#Trigraphs_.28removed_in_C.2B.2B17.29 "cpp/language/operator alternative") | [N4086](https://wg21.link/N4086) | 5 | 3.5 | 16.0\* | Yes | 5.0 | | | | 10.3 | 11.0 | 19.1 | 11.0 | | [Nested namespace](language/namespace#Syntax "cpp/language/namespace") definition | [N4230](https://wg21.link/N4230) | 6 | 3.6 | 19.0 (Update 3)\* | Yes | 4.12 | 17.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | [Attributes](language/attributes "cpp/language/attributes") for namespaces and enumerators | [N4266](https://wg21.link/N4266) | 4.9 (partial)\*6 | 3.6 | 19.0 (2015)\* | Yes | 4.11 | 17.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | [`u8` character literals](language/character_literal "cpp/language/character literal") | [N4267](https://wg21.link/N4267) | 6 | 3.6 | 19.0 (2015)\* | Yes | 4.11 | 17.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | Allow constant evaluation for all non-type template arguments | [N4268](https://wg21.link/N4268) | 6 | 3.6 | 19.12\* | Yes | 5.0 | 19.0.1 | | | 10.3 | 11.0 | 19.1 | 11.0 | | [Fold Expressions](language/fold "cpp/language/fold") | 
[N4295](https://wg21.link/N4295) | 6 | 3.6 | 19.12\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 18.1 | 11.0 | | [Unary fold expressions](language/fold#Explanation "cpp/language/fold") and empty parameter packs | [P0036R0](https://wg21.link/P0036R0) | 6 | 3.9 | 19.12\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 19.1 | 11.0 | | Remove Deprecated Use of the [`register`](keyword/register "cpp/keyword/register") Keyword | [P0001R1](https://wg21.link/P0001R1) | 7 | 3.8 | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | Remove Deprecated `operator++(bool)` | [P0002R1](https://wg21.link/P0002R1) | 7 | 3.8 | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | Make exception specifications part of the type system | [P0012R1](https://wg21.link/P0012R1) | 7 | 4 | 19.12\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 19.1 | 11.0 | | [Aggregate classes](language/aggregate_initialization "cpp/language/aggregate initialization") with base classes | [P0017R1](https://wg21.link/P0017R1) | 7 | 3.9 | 19.14\* | Yes | 5.0 | 19.0.1 | | | 10.3 | 11.0 | 19.1 | 11.0 | | [`__has_include`](preprocessor/include "cpp/preprocessor/include") in preprocessor conditionals | [P0061R1](https://wg21.link/P0061R1) | 5 | Yes | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | DR: New specification for [inheriting constructors](language/using_declaration#Inheriting_constructors "cpp/language/using declaration") (DR1941 et al) | [P0136R1](https://wg21.link/P0136R1) | 7 | 3.9 | 19.14\* | Yes | 6.1 | | | | 10.3 | 11.0 | 19.1 | 11.0 | | Lambda capture of `*this` | [P0018R3](https://wg21.link/P0018R3) | 7 | 3.9 | 19.11\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 18.1 | 11.0 | | Direct-list-initialization of enumerations | [P0138R2](https://wg21.link/P0138R2) | 7 | 3.9 | 19.11\* | Yes | 4.14 | 18.0 | | | 10.3 | 11.0 | 19.1 | 11.0 | | [constexpr lambda expressions](language/lambda "cpp/language/lambda") | [P0170R1](https://wg21.link/P0170R1) | 7 | 5 | 19.11\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 18.1 | 11.0 | | Differing begin and end types in [range-based for](language/range-for "cpp/language/range-for") | [P0184R0](https://wg21.link/P0184R0) | 6 | 3.9 | 19.10\* | Yes | 4.12 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | `[[[fallthrough](language/attributes/fallthrough "cpp/language/attributes/fallthrough")]]` [attribute](language/attributes "cpp/language/attributes") | [P0188R1](https://wg21.link/P0188R1) | 7 | 3.9 | 19.10\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` [attribute](language/attributes "cpp/language/attributes") | [P0189R1](https://wg21.link/P0189R1) | 7 | 3.9 | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | `[[[maybe\_unused](language/attributes/maybe_unused "cpp/language/attributes/maybe unused")]]` [attribute](language/attributes "cpp/language/attributes") | [P0212R1](https://wg21.link/P0212R1) | 7 | 3.9 | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | Hexadecimal [floating-point literals](language/floating_literal "cpp/language/floating literal") | [P0245R1](https://wg21.link/P0245R1) | 3.0 | Yes | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | Using attribute namespaces without repetition | [P0028R4](https://wg21.link/P0028R4) | 7 | 3.9 | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | [Dynamic memory allocation](language/new "cpp/language/new") for over-aligned data | [P0035R4](https://wg21.link/P0035R4) | 7 | 4 | 19.12\* | 
10.0.0\* | 4.14 | 19.0 | | | 10.3 | 11.0 | 19.1 | 11.0 | | [Class template argument deduction](language/class_template_argument_deduction "cpp/language/class template argument deduction") | [P0091R3](https://wg21.link/P0091R3) | 7 | 5 | 19.14\* | Yes | 5.0 | 19.0.1 | | | 10.3 | 11.0 | 19.1 | 11.0 | | Non-type template parameters with `auto` type | [P0127R2](https://wg21.link/P0127R2) | 7 | 4 | 19.14\* | Yes | 5.0 | 19.0.1 | | | 10.3 | 11.0 | 19.1 | 11.0 | | Guaranteed [copy elision](language/copy_elision "cpp/language/copy elision") | [P0135R1](https://wg21.link/P0135R1) | 7 | 4 | 19.13\* | Yes | 5.0 | 19.0.1 | | | 10.3 | 11.0 | 19.1 | 11.0 | | Replacement of class objects containing reference members | [P0137R1](https://wg21.link/P0137R1) | 7 | 6 | 19.14\* | Yes | 5.0 | | | | 10.3 | 11.0 | 19.1 | 11.0 | | Stricter [expression evaluation order](language/eval_order "cpp/language/eval order") | [P0145R3](https://wg21.link/P0145R3) | 7 | 4 | 19.14\* | Yes | 5.0 | 19.0.1 | | | 10.3 | 11.0 | 19.1 | 11.0 | | [Structured Bindings](language/structured_binding "cpp/language/structured binding") | [P0217R3](https://wg21.link/P0217R3) | 7 | 4 | 19.11\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 18.1 | 11.0\* | | Ignore unknown [attributes](language/attributes "cpp/language/attributes") | [P0283R2](https://wg21.link/P0283R2) | Yes | 3.9 | 19.11\* | Yes | 4.13 | 18.0 | | | 10.3 | 11.0 | 17.7 | 11.0 | | [constexpr if](language/if "cpp/language/if") statements | [P0292R2](https://wg21.link/P0292R2) | 7 | 3.9 | 19.11\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 18.1 | 11.0 | | init-statements for [if](language/if "cpp/language/if") and [switch](language/switch "cpp/language/switch") | [P0305R1](https://wg21.link/P0305R1) | 7 | 3.9 | 19.11\* | Yes | 4.14 | 18.0 | | | 10.3 | 11.0 | 18.1 | 11.0 | | [Inline variables](language/inline "cpp/language/inline") | [P0386R2](https://wg21.link/P0386R2) | 7 | 3.9 | 19.12\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 18.1 | 11.0 | | Removing [dynamic exception specifications](language/except_spec "cpp/language/except spec") | [P0003R5](https://wg21.link/P0003R5) | 7 | 4 | 19.10\* | Yes | 4.14 | 19.0 | | | 10.3 | 11.0 | 19.1 | 11.0 | | Pack expansions in using-declarations | [P0195R2](https://wg21.link/P0195R2) | 7 | 4 | 19.14\* | Yes | 5.0 | 19.0 | | | 10.3 | 11.0 | 19.1 | 11.0 | | DR: Matching of template template-arguments excludes compatible templates | [P0522R0](https://wg21.link/P0522R0) | 7 | 4 | 19.12\* | Yes | 5.0 | 19.0.1 | | | 10.3 | 11.0 | 19.1 | 11.0 | | C++17 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++(ex Portland Group/PGI) | Nvidia nvcc | ### C++17 library features | C++17 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Intel Parallel STL | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | `[std::void\_t](types/void_t "cpp/types/void t")` | [N3911](https://wg21.link/N3911) | 6 | 3.6 | 19.0 (2015)\* | Yes | N/A | | 10.3 | | | [`std::uncaught_exceptions()`](error/uncaught_exception "cpp/error/uncaught exception") | [N4259](https://wg21.link/N4259) | 6 | 3.7 | 19.0 (2015)\* | Yes | N/A | | 10.3 | | | `[std::size()](iterator/size "cpp/iterator/size")`, `[std::empty()](iterator/empty "cpp/iterator/empty")` and `[std::data()](iterator/data "cpp/iterator/data")` | [N4280](https://wg21.link/N4280) | 6 | 3.6 | 
19.0 (2015)\* | Yes | N/A | | 10.3 | | | Improving `[std::pair](utility/pair "cpp/utility/pair")` and `[std::tuple](utility/tuple "cpp/utility/tuple")` | [N4387](https://wg21.link/N4387) | 6 | 4 | 19.0 (Update 2)\* | Yes | N/A | | 10.3 | | | `[std::bool\_constant](types/integral_constant "cpp/types/integral constant")` | [N4389](https://wg21.link/N4389) | 6 | 3.7 | 19.0 (2015)\* | Yes | N/A | | 10.3 | | | `[std::shared\_mutex](thread/shared_mutex "cpp/thread/shared mutex")` (untimed) | [N4508](https://wg21.link/N4508) | 6 | 3.7 | 19.0 (Update 2)\* | Yes | N/A | | 10.3 | | | [Type traits](meta "cpp/meta") variable templates | [P0006R0](https://wg21.link/P0006R0) | 7 | 3.8 | 19.0 (Update 2)\* | Yes | N/A | | 10.3 | | | [Logical operator type traits](meta#Operations_on_traits "cpp/meta") | [P0013R1](https://wg21.link/P0013R1) | 6 | 3.8 | 19.0 (Update 2)\* | Yes | N/A | | 10.3 | | | Parallel algorithms and execution policies | [P0024R2](https://wg21.link/P0024R2) | 9\* | | 19.14\* | | 18.0\* | | | | | `[std::clamp()](algorithm/clamp "cpp/algorithm/clamp")` | [P0025R0](https://wg21.link/P0025R0) | 7 | 3.9 | 19.0 (Update 3)\* | 10.0.0\* | N/A | | 10.3 | | | [Hardware interference size](thread/hardware_destructive_interference_size "cpp/thread/hardware destructive interference size") | [P0154R1](https://wg21.link/P0154R1) | 12 | 15 (partial)\* | 19.11\* | | N/A | | 10.3 | | | [(nothrow-)swappable traits](types/is_swappable "cpp/types/is swappable") | [P0185R1](https://wg21.link/P0185R1) | 7 | 3.9 | 19.0 (Update 3)\* | 10.0.0\* | N/A | | 10.3 | | | [File system library](filesystem "cpp/filesystem") | [P0218R1](https://wg21.link/P0218R1) | 8 | 7 | 19.14\* | 11.0.0\* | N/A | | 10.3 | | | `[std::string\_view](string/basic_string_view "cpp/string/basic string view")` | [N3921](https://wg21.link/N3921)[P0220R1](https://wg21.link/P0220R1) | 7 | 4 | 19.10\* | 10.0.0\* | N/A | | 10.3 | | | `[std::any](utility/any "cpp/utility/any")` | [P0220R1](https://wg21.link/P0220R1) | 7 | 4 | 19.10\* | 10.0.0\* | N/A | | 10.3 | | | `[std::optional](utility/optional "cpp/utility/optional")` | [P0220R1](https://wg21.link/P0220R1) | 7 | 4 | 19.10\* | 10.0.0\* | N/A | | 10.3 | | | [Polymorphic memory resources](header/memory_resource "cpp/header/memory resource") | [P0220R1](https://wg21.link/P0220R1) | 9 | | 19.13\* | | N/A | | 10.3 | | | [Mathematical special functions](numeric/special_functions "cpp/numeric/special functions") | [P0226R1](https://wg21.link/P0226R1) | 7 | | 19.14\* | | N/A | | 10.3 | | | Major portion of C11 standard library | [P0063R3](https://wg21.link/P0063R3) | 9 | 7 | 19.0 (2015)\*(partial)\* | 10.0.0\* | N/A | | | | | Splicing [`Maps`](container/map/merge "cpp/container/map/merge") and [`Sets`](container/set/merge "cpp/container/set/merge") | [P0083R3](https://wg21.link/P0083R3) | 7 | 8 | 19.12\* | 10.0.0\* | N/A | | | | | `[std::variant](utility/variant "cpp/utility/variant")` | [P0088R3](https://wg21.link/P0088R3) | 7 | 4 | 19.10\* | 10.0.0\* | N/A | | 10.3 | | | `[std::make\_from\_tuple()](utility/make_from_tuple "cpp/utility/make from tuple")` | [P0209R2](https://wg21.link/P0209R2) | 7 | 3.9 | 19.10\* | Yes | N/A | | 10.3 | | | `[std::has\_unique\_object\_representations](types/has_unique_object_representations "cpp/types/has unique object representations")` | [P0258R2](https://wg21.link/P0258R2) | 7 | 6 | 19.11\* | Yes | N/A | | 10.3 | | | `[std::gcd()](numeric/gcd "cpp/numeric/gcd")` and `[std::lcm()](numeric/lcm "cpp/numeric/lcm")` | [P0295R0](https://wg21.link/P0295R0) | 7 | 4 | 19.11\* | 
Yes | N/A | | 10.3 | | | `[std::not\_fn](utility/functional/not_fn "cpp/utility/functional/not fn")` | [P0005R4](https://wg21.link/P0005R4)[P0358R1](https://wg21.link/P0358R1) | 7 | 3.9 | 19.12\* | Yes | N/A | | 10.3 | | | [Elementary string conversions](utility#Elementary_string_conversions "cpp/utility")\* | [P0067R5](https://wg21.link/P0067R5) | 8 (no FP)11 | 7 (no FP)14 (no FP from\_chars) | 19.14\* (no FP)\*19.24\* | 10.0.0\* (no FP) | N/A | | 10.3 (no FP from\_chars) | | | `[std::shared\_ptr](memory/shared_ptr "cpp/memory/shared ptr")` and `[std::weak\_ptr](memory/weak_ptr "cpp/memory/weak ptr")` with array support | [P0414R2](https://wg21.link/P0414R2) | 7 | 11 | 19.12\* | 12.0.0\* | N/A | | 10.3 | | | [`std::scoped_lock`](thread/scoped_lock "cpp/thread/scoped lock") | [P0156R2](https://wg21.link/P0156R2) | 7 | 5 | 19.11\* | Yes | N/A | | 10.3 | | | [`std::byte`](types/byte "cpp/types/byte") | [P0298R3](https://wg21.link/P0298R3) | 7 | 5 | 19.11\* | Yes | N/A | | 10.3 | | | [`std::is_aggregate`](types/is_aggregate "cpp/types/is aggregate") | [LWG2911](https://cplusplus.github.io/LWG/issue2911) | 7 | 5 | 19.15\* | Yes | N/A | | 10.3 | | | DR: [`std::hash<std::filesystem::path>`](filesystem/path/hash "cpp/filesystem/path/hash") | [LWG3657](https://cplusplus.github.io/LWG/issue3657) | 12 | | 19.32\* | | N/A | | | | | C++17 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Intel Parallel STL | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | Notes: * As of 2020-11-20, the latest release of Oracle Developer Studio [is 12.6](https://www.oracle.com/application-development/technologies/developerstudio-component-matrix.html). Its [documentation](https://docs.oracle.com/cd/E77782_01/html/E77789/bkaje.html) does not mention C++17. * The Cray compiler may have supported some features earlier than 11.0; that is the version at which it became a derivative of Clang, gaining all of the attendant language feature support of the base compiler. See Cray/HPE document S-2179 [[1]](https://support.hpe.com/hpesc/public/docDisplay?docLocale=en_US&docId=a00123566en_us). *\** - version numbers marked with \* carry additional notes in the source table (displayed as hover tooltips on the interactive page). ### External links * [Working c++17 examples](https://github.com/makelinux/examples/blob/HEAD/cpp/17.cpp)
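For readers who want to see several of the tabulated features in one place, the following is an illustrative sketch (an addition, not part of the original support tables); it combines nested namespace definitions, a fold expression, `constexpr if`, structured bindings, and `std::optional`, and should compile with any of the compilers above that report full C++17 support.

```cpp
// Illustrative C++17 sample (not part of the original support tables).
// Features used: nested namespace definitions (N4230), fold expressions (N4295),
// constexpr if (P0292R2), structured bindings (P0217R3), std::optional (P0220R1).
#include <iostream>
#include <map>
#include <optional>
#include <string>
#include <type_traits>

namespace demo::util {                       // nested namespace definition

template <class... Ts>
auto sum(Ts... ts) {
    return (ts + ... + 0);                   // fold expression over the parameter pack
}

template <class T>
std::string describe(const T& value) {
    if constexpr (std::is_integral_v<T>)     // constexpr if + variable-template trait
        return "integral: " + std::to_string(value);
    else
        return "something else";
}

} // namespace demo::util

int main() {
    std::map<std::string, int> m{{"a", 1}, {"b", 2}};
    for (const auto& [key, value] : m)       // structured bindings
        std::cout << key << " = " << value << '\n';

    std::optional<int> maybe = demo::util::sum(1, 2, 3);
    if (maybe)
        std::cout << demo::util::describe(*maybe) << '\n';
}
```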
cpp Current Status Current Status ============== **Recent milestones: C++20 published, C++23 underway**. C++20 has been published, and work is now underway on C++23. Starting in 2012, the committee has transitioned to a "decoupled" model where major pieces of work can progress independently from the Standard itself and be delivered in "feature branch" TSes. Vendors can choose to implement these, and the community can gain experience with the std::experimental version of each feature. This lets us learn and adjust each feature's design based on experience before it is cast in stone when merged into the "trunk" C++ Standard itself. In the meantime, the Standard can be delivered on a more regular cadence with smaller and more predictable batches of features. This approach also helps C++ compilers to track the Standard more closely and add both the experimental and the draft-final C++ features in a more consistent order. [The current schedule is in paper P1000](https://wg21.link/p1000). You can also visit [open-std.org](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/) to get the latest C++ standards committee papers. Reading through those proposals, you can track developing trends in C++ and see how a cool idea makes its way into the standard. However, those papers **ARE NOT** the standard documents and **SHOULD NOT BE TREATED AS** such. ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/current_status "c/current status") for Current status | cpp Concurrency support library (since C++11) Concurrency support library (since C++11) ========================================= C++ includes built-in support for threads, atomic operations, mutual exclusion, condition variables, and futures. ### Threads Threads enable programs to execute across several processor cores. | Defined in header `[<thread>](header/thread "cpp/header/thread")` | | --- | | [thread](thread/thread "cpp/thread/thread") (C++11) | manages a separate thread (class) | | [jthread](thread/jthread "cpp/thread/jthread") (C++20) | `[std::thread](thread/thread "cpp/thread/thread")` with support for auto-joining and cancellation (class) | | Functions managing the current thread | | Defined in namespace `this_thread` | | [yield](thread/yield "cpp/thread/yield") (C++11) | suggests that the implementation reschedule execution of threads (function) | | [get\_id](thread/get_id "cpp/thread/get id") (C++11) | returns the thread id of the current thread (function) | | [sleep\_for](thread/sleep_for "cpp/thread/sleep for") (C++11) | stops the execution of the current thread for a specified time duration (function) | | [sleep\_until](thread/sleep_until "cpp/thread/sleep until") (C++11) | stops the execution of the current thread until a specified time point (function) | | | | | | | | | | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | | Thread cancellation The `stop_XXX` types are designed to enable thread cancellation for `std::jthread`, although they can also be used independently of `std::jthread` - for example to interrupt `std::condition_variable_any` waiting functions, or for a custom thread management implementation. In fact they do not even need to be used to "stop" anything, but can instead be used, for example, as a thread-safe trigger for a one-time function invocation. 
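Before the `<stop_token>` facilities are listed below, here is a minimal sketch (illustrative, not part of the original page) of the most common use: a `std::jthread` whose callable takes a `std::stop_token`, so that destroying the thread object both requests stop and joins.

```cpp
// Minimal sketch (illustrative): a std::jthread worker polling its std::stop_token.
// ~jthread() requests stop and then joins, so no explicit cleanup is needed.
#include <chrono>
#include <iostream>
#include <stop_token>
#include <thread>

int main() {
    std::jthread worker([](std::stop_token st) {
        while (!st.stop_requested()) {
            std::cout << "working...\n";
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
        std::cout << "stop requested, exiting\n";
    });

    std::this_thread::sleep_for(std::chrono::milliseconds(350));
    // worker.request_stop() happens implicitly in the destructor, followed by join().
}
```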
| Defined in header `[<stop\_token>](header/stop_token "cpp/header/stop token")` | | --- | | [stop\_token](thread/stop_token "cpp/thread/stop token") (C++20) | an interface for querying if a `[std::jthread](thread/jthread "cpp/thread/jthread")` cancellation request has been made (class) | | [stop\_source](thread/stop_source "cpp/thread/stop source") (C++20) | class representing a request to stop one or more `[std::jthread](thread/jthread "cpp/thread/jthread")`s (class) | | [stop\_callback](thread/stop_callback "cpp/thread/stop callback") (C++20) | an interface for registering callbacks on `[std::jthread](thread/jthread "cpp/thread/jthread")` cancellation (class template) | | (since C++20) | ### Cache size access | Defined in header `[<new>](header/new "cpp/header/new")` | | --- | | [hardware\_destructive\_interference\_sizehardware\_constructive\_interference\_size](thread/hardware_destructive_interference_size "cpp/thread/hardware destructive interference size") (C++17) | min offset to avoid false sharingmax offset to promote true sharing (constant) | ### Atomic operations These components are provided for fine-grained atomic operations allowing for lockless concurrent programming. Each atomic operation is indivisible with regards to any other atomic operation that involves the same object. Atomic objects are [free of data races](language/memory_model#Threads_and_data_races "cpp/language/memory model"). | | | | --- | --- | | Neither the `_Atomic` macro, nor any of the non-macro global namespace declarations are provided by any C++ standard library header other than `<stdatomic.h>`. | (since C++23) | | Defined in header `[<atomic>](header/atomic "cpp/header/atomic")` | | --- | | Atomic types | | [atomic](atomic/atomic "cpp/atomic/atomic") (C++11) | atomic class template and specializations for bool, integral, and pointer types (class template) | | [atomic\_ref](atomic/atomic_ref "cpp/atomic/atomic ref") (C++20) | provides atomic operations on non-atomic objects (class template) | | Operations on atomic types | | [atomic\_is\_lock\_free](atomic/atomic_is_lock_free "cpp/atomic/atomic is lock free") (C++11) | checks if the atomic type's operations are lock-free (function template) | | [atomic\_storeatomic\_store\_explicit](atomic/atomic_store "cpp/atomic/atomic store") (C++11)(C++11) | atomically replaces the value of the atomic object with a non-atomic argument (function template) | | [atomic\_loadatomic\_load\_explicit](atomic/atomic_load "cpp/atomic/atomic load") (C++11)(C++11) | atomically obtains the value stored in an atomic object (function template) | | [atomic\_exchangeatomic\_exchange\_explicit](atomic/atomic_exchange "cpp/atomic/atomic exchange") (C++11)(C++11) | atomically replaces the value of the atomic object with non-atomic argument and returns the old value of the atomic (function template) | | [atomic\_compare\_exchange\_weakatomic\_compare\_exchange\_weak\_explicitatomic\_compare\_exchange\_strongatomic\_compare\_exchange\_strong\_explicit](atomic/atomic_compare_exchange "cpp/atomic/atomic compare exchange") (C++11)(C++11)(C++11)(C++11) | atomically compares the value of the atomic object with non-atomic argument and performs atomic exchange if equal or atomic load if not (function template) | | [atomic\_fetch\_addatomic\_fetch\_add\_explicit](atomic/atomic_fetch_add "cpp/atomic/atomic fetch add") (C++11)(C++11) | adds a non-atomic value to an atomic object and obtains the previous value of the atomic (function template) | | 
[atomic\_fetch\_subatomic\_fetch\_sub\_explicit](atomic/atomic_fetch_sub "cpp/atomic/atomic fetch sub") (C++11)(C++11) | subtracts a non-atomic value from an atomic object and obtains the previous value of the atomic (function template) | | [atomic\_fetch\_andatomic\_fetch\_and\_explicit](atomic/atomic_fetch_and "cpp/atomic/atomic fetch and") (C++11)(C++11) | replaces the atomic object with the result of bitwise AND with a non-atomic argument and obtains the previous value of the atomic (function template) | | [atomic\_fetch\_oratomic\_fetch\_or\_explicit](atomic/atomic_fetch_or "cpp/atomic/atomic fetch or") (C++11)(C++11) | replaces the atomic object with the result of bitwise OR with a non-atomic argument and obtains the previous value of the atomic (function template) | | [atomic\_fetch\_xoratomic\_fetch\_xor\_explicit](atomic/atomic_fetch_xor "cpp/atomic/atomic fetch xor") (C++11)(C++11) | replaces the atomic object with the result of bitwise XOR with a non-atomic argument and obtains the previous value of the atomic (function template) | | [atomic\_waitatomic\_wait\_explicit](atomic/atomic_wait "cpp/atomic/atomic wait") (C++20)(C++20) | blocks the thread until notified and the atomic value changes (function template) | | [atomic\_notify\_one](atomic/atomic_notify_one "cpp/atomic/atomic notify one") (C++20) | notifies a thread blocked in atomic\_wait (function template) | | [atomic\_notify\_all](atomic/atomic_notify_all "cpp/atomic/atomic notify all") (C++20) | notifies all threads blocked in atomic\_wait (function template) | | Flag type and operations | | [atomic\_flag](atomic/atomic_flag "cpp/atomic/atomic flag") (C++11) | the lock-free boolean atomic type (class) | | [atomic\_flag\_test\_and\_setatomic\_flag\_test\_and\_set\_explicit](atomic/atomic_flag_test_and_set "cpp/atomic/atomic flag test and set") (C++11)(C++11) | atomically sets the flag to `true` and returns its previous value (function) | | [atomic\_flag\_clearatomic\_flag\_clear\_explicit](atomic/atomic_flag_clear "cpp/atomic/atomic flag clear") (C++11)(C++11) | atomically sets the value of the flag to `false` (function) | | [atomic\_flag\_testatomic\_flag\_test\_explicit](atomic/atomic_flag_test "cpp/atomic/atomic flag test") (C++20)(C++20) | atomically returns the value of the flag (function) | | [atomic\_flag\_waitatomic\_flag\_wait\_explicit](atomic/atomic_flag_wait "cpp/atomic/atomic flag wait") (C++20)(C++20) | blocks the thread until notified and the flag changes (function) | | [atomic\_flag\_notify\_one](atomic/atomic_flag_notify_one "cpp/atomic/atomic flag notify one") (C++20) | notifies a thread blocked in atomic\_flag\_wait (function) | | [atomic\_flag\_notify\_all](atomic/atomic_flag_notify_all "cpp/atomic/atomic flag notify all") (C++20) | notifies all threads blocked in atomic\_flag\_wait (function) | | Initialization | | [atomic\_init](atomic/atomic_init "cpp/atomic/atomic init") (C++11)(deprecated in C++20) | non-atomic initialization of a default-constructed atomic object (function template) | | [ATOMIC\_VAR\_INIT](atomic/atomic_var_init "cpp/atomic/ATOMIC VAR INIT") (C++11)(deprecated in C++20) | constant initialization of an atomic variable of static storage duration (function macro) | | [ATOMIC\_FLAG\_INIT](atomic/atomic_flag_init "cpp/atomic/ATOMIC FLAG INIT") (C++11) | initializes an `[std::atomic\_flag](atomic/atomic_flag "cpp/atomic/atomic flag")` to `false` (macro constant) | | Memory synchronization ordering | | [memory\_order](atomic/memory_order "cpp/atomic/memory order") (C++11) | defines 
memory ordering constraints for the given atomic operation (enum) | | [kill\_dependency](atomic/kill_dependency "cpp/atomic/kill dependency") (C++11) | removes the specified object from the `[std::memory\_order\_consume](atomic/memory_order "cpp/atomic/memory order")` dependency tree (function template) | | [atomic\_thread\_fence](atomic/atomic_thread_fence "cpp/atomic/atomic thread fence") (C++11) | generic memory order-dependent fence synchronization primitive (function) | | [atomic\_signal\_fence](atomic/atomic_signal_fence "cpp/atomic/atomic signal fence") (C++11) | fence between a thread and a signal handler executed in the same thread (function) | | Defined in header `[<stdatomic.h>](header/stdatomic.h "cpp/header/stdatomic.h")` | | C compatibility macros | | [\_Atomic](atomic/atomic "cpp/atomic/atomic") (C++23) | compatibility macro such that `_Atomic(T)` is identical to `[std::atomic](http://en.cppreference.com/w/cpp/atomic/atomic)<T>` (function macro) | ### Mutual exclusion Mutual exclusion algorithms prevent multiple threads from simultaneously accessing shared resources. This prevents data races and provides support for synchronization between threads. | Defined in header `[<mutex>](header/mutex "cpp/header/mutex")` | | --- | | [mutex](thread/mutex "cpp/thread/mutex") (C++11) | provides basic mutual exclusion facility (class) | | [timed\_mutex](thread/timed_mutex "cpp/thread/timed mutex") (C++11) | provides mutual exclusion facility which implements locking with a timeout (class) | | [recursive\_mutex](thread/recursive_mutex "cpp/thread/recursive mutex") (C++11) | provides mutual exclusion facility which can be locked recursively by the same thread (class) | | [recursive\_timed\_mutex](thread/recursive_timed_mutex "cpp/thread/recursive timed mutex") (C++11) | provides mutual exclusion facility which can be locked recursively by the same thread and implements locking with a timeout (class) | | Defined in header `[<shared\_mutex>](header/shared_mutex "cpp/header/shared mutex")` | | [shared\_mutex](thread/shared_mutex "cpp/thread/shared mutex") (C++17) | provides shared mutual exclusion facility (class) | | [shared\_timed\_mutex](thread/shared_timed_mutex "cpp/thread/shared timed mutex") (C++14) | provides shared mutual exclusion facility and implements locking with a timeout (class) | | Generic mutex management | | Defined in header `[<mutex>](header/mutex "cpp/header/mutex")` | | [lock\_guard](thread/lock_guard "cpp/thread/lock guard") (C++11) | implements a strictly scope-based mutex ownership wrapper (class template) | | [scoped\_lock](thread/scoped_lock "cpp/thread/scoped lock") (C++17) | deadlock-avoiding RAII wrapper for multiple mutexes (class template) | | [unique\_lock](thread/unique_lock "cpp/thread/unique lock") (C++11) | implements movable mutex ownership wrapper (class template) | | [shared\_lock](thread/shared_lock "cpp/thread/shared lock") (C++14) | implements movable shared mutex ownership wrapper (class template) | | [defer\_lock\_ttry\_to\_lock\_tadopt\_lock\_t](thread/lock_tag_t "cpp/thread/lock tag t") (C++11)(C++11)(C++11) | tag type used to specify locking strategy (class) | | [defer\_locktry\_to\_lockadopt\_lock](thread/lock_tag "cpp/thread/lock tag") (C++11)(C++11)(C++11) | tag constants used to specify locking strategy (constant) | | Generic locking algorithms | | [try\_lock](thread/try_lock "cpp/thread/try lock") (C++11) | attempts to obtain ownership of mutexes via repeated calls to `try_lock` (function template) | | [lock](thread/lock "cpp/thread/lock") 
(C++11) | locks specified mutexes, blocks if any are unavailable (function template) | | Call once | | [once\_flag](thread/once_flag "cpp/thread/once flag") (C++11) | helper object to ensure that [`call_once`](thread/call_once "cpp/thread/call once") invokes the function only once (class) | | [call\_once](thread/call_once "cpp/thread/call once") (C++11) | invokes a function only once even if called from multiple threads (function template) | ### Condition variables A condition variable is a synchronization primitive that allows multiple threads to communicate with each other. It allows some number of threads to wait (possibly with a timeout) for notification from another thread that they may proceed. A condition variable is always associated with a mutex. | Defined in header `[<condition\_variable>](header/condition_variable "cpp/header/condition variable")` | | --- | | [condition\_variable](thread/condition_variable "cpp/thread/condition variable") (C++11) | provides a condition variable associated with a `[std::unique\_lock](thread/unique_lock "cpp/thread/unique lock")` (class) | | [condition\_variable\_any](thread/condition_variable_any "cpp/thread/condition variable any") (C++11) | provides a condition variable associated with any lock type (class) | | [notify\_all\_at\_thread\_exit](thread/notify_all_at_thread_exit "cpp/thread/notify all at thread exit") (C++11) | schedules a call to `notify_all` to be invoked when this thread is completely finished (function) | | [cv\_status](thread/cv_status "cpp/thread/cv status") (C++11) | lists the possible results of timed waits on condition variables (enum) | | | | | | | | | | | | | | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | Semaphores A semaphore is a lightweight synchronization primitive used to constrain concurrent access to a shared resource. When either would suffice, a semaphore can be more efficient than a condition variable. | Defined in header `[<semaphore>](header/semaphore "cpp/header/semaphore")` | | --- | | [counting\_semaphore](thread/counting_semaphore "cpp/thread/counting semaphore") (C++20) | semaphore that models a non-negative resource count (class template) | | [binary\_semaphore](thread/counting_semaphore "cpp/thread/counting semaphore") (C++20) | semaphore that has only two states (typedef) | Latches and Barriers Latches and barriers are thread coordination mechanisms that allow any number of threads to block until an expected number of threads arrive. A latch cannot be reused, while a barrier can be used repeatedly. | Defined in header `[<latch>](header/latch "cpp/header/latch")` | | --- | | [latch](thread/latch "cpp/thread/latch") (C++20) | single-use thread barrier (class) | | Defined in header `[<barrier>](header/barrier "cpp/header/barrier")` | | [barrier](thread/barrier "cpp/thread/barrier") (C++20) | reusable thread barrier (class template) | | (since C++20) | ### Futures The standard library provides facilities to obtain values that are returned and to catch exceptions that are thrown by asynchronous tasks (i.e. functions launched in separate threads). These values are communicated in a *shared state*, in which the asynchronous task may write its return value or store an exception, and which may be examined, waited for, and otherwise manipulated by other threads that hold instances of `[std::future](thread/future "cpp/thread/future")` or `[std::shared\_future](thread/shared_future "cpp/thread/shared future")` that reference that shared state. 
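As a minimal sketch (illustrative, not part of the original page) of how the shared state is used, the following passes a value through a `std::promise`/`std::future` pair and retrieves an exception thrown by a task launched with `std::async`; the facilities themselves are listed in the table below.

```cpp
// Illustrative sketch: a value delivered through a promise/future shared state,
// and an exception propagated from a task launched with std::async.
#include <future>
#include <iostream>
#include <stdexcept>
#include <thread>

int main() {
    // promise/future: the worker thread writes into the shared state, main() reads it.
    std::promise<int> p;
    std::future<int> f = p.get_future();
    std::thread t([&p] { p.set_value(6 * 7); });
    std::cout << "promise result: " << f.get() << '\n';
    t.join();

    // async: get() either returns the task's value or rethrows its stored exception.
    std::future<int> g = std::async(std::launch::async,
                                    []() -> int { throw std::runtime_error("boom"); });
    try {
        g.get();
    } catch (const std::exception& e) {
        std::cout << "caught from async task: " << e.what() << '\n';
    }
}
```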
| Defined in header `[<future>](header/future "cpp/header/future")` | | --- | | [promise](thread/promise "cpp/thread/promise") (C++11) | stores a value for asynchronous retrieval (class template) | | [packaged\_task](thread/packaged_task "cpp/thread/packaged task") (C++11) | packages a function to store its return value for asynchronous retrieval (class template) | | [future](thread/future "cpp/thread/future") (C++11) | waits for a value that is set asynchronously (class template) | | [shared\_future](thread/shared_future "cpp/thread/shared future") (C++11) | waits for a value (possibly referenced by other futures) that is set asynchronously (class template) | | [async](thread/async "cpp/thread/async") (C++11) | runs a function asynchronously (potentially in a new thread) and returns a `[std::future](thread/future "cpp/thread/future")` that will hold the result (function template) | | [launch](thread/launch "cpp/thread/launch") (C++11) | specifies the launch policy for `[std::async](thread/async "cpp/thread/async")` (enum) | | [future\_status](thread/future_status "cpp/thread/future status") (C++11) | specifies the results of timed waits performed on `[std::future](thread/future "cpp/thread/future")` and `[std::shared\_future](thread/shared_future "cpp/thread/shared future")` (enum) | | Future errors | | [future\_error](thread/future_error "cpp/thread/future error") (C++11) | reports an error related to futures or promises (class) | | [future\_category](thread/future_category "cpp/thread/future category") (C++11) | identifies the future error category (function) | | [future\_errc](thread/future_errc "cpp/thread/future errc") (C++11) | identifies the future error codes (enum) | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/thread "c/thread") for Concurrency support library | cpp std Symbol Index std Symbol Index ================ This page tries to list all the symbols that are available from the *Standard Library* in the namespace `std::`. The symbols are written as follows: * Function names with `()`. * Templates with `<>`. Symbols from `std::`'s sub-namespaces (e.g. `chrono`) are not listed here, but the namespace names (prepended with the icon ▶) link to the corresponding pages. [Macro symbols](symbol_index/macro "cpp/symbol index/macro") and [symbols removed from the standard library](symbol_index/zombie_names "cpp/symbol index/zombie names") are listed on separate pages. 
`[\_](#.28underscore.29) [A](#A) [B](#B) [C](#C) [D](#D) [E](#E) [F](#F) [G](#G) [H](#H) [I](#I) [J](#J) [K](#K) [L](#L) [M](#M) [N](#N) [O](#O) [P](#P) [Q](#Q) [R](#R) [S](#S) [T](#T) [U](#U) [V](#V) [W](#W) [X](#X) [Y](#Y) [Z](#Z)` --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- ### [Macros](symbol_index/macro "cpp/symbol index/macro") ### \_ (underscore) [`_Exit()`](utility/program/_exit "cpp/utility/program/ Exit") (since C++11) ### A [`abs()`](numeric/math/abs "cpp/numeric/math/abs") (int) [`abs()`](numeric/math/fabs "cpp/numeric/math/fabs") (float) [`abs<>()`](numeric/complex/abs "cpp/numeric/complex/abs") (std::complex) [`abs<>()`](numeric/valarray/abs "cpp/numeric/valarray/abs") (std::valarray) [`acos()`](numeric/math/acos "cpp/numeric/math/acos") [`acos<>()`](numeric/complex/acos "cpp/numeric/complex/acos") (std::complex) (since C++11) [`acos<>()`](numeric/valarray/acos "cpp/numeric/valarray/acos") (std::valarray) [`acosf()`](numeric/math/acos "cpp/numeric/math/acos") (since C++11) [`acosh()`](numeric/math/acosh "cpp/numeric/math/acosh") (since C++11) [`acosh<>()`](numeric/complex/acosh "cpp/numeric/complex/acosh") (std::complex) (since C++11) [`acoshf()`](numeric/math/acosh "cpp/numeric/math/acosh") (since C++11) [`acoshl()`](numeric/math/acosh "cpp/numeric/math/acosh") (since C++11) [`acosl()`](numeric/math/acos "cpp/numeric/math/acos") (since C++11) [`accumulate<>()`](algorithm/accumulate "cpp/algorithm/accumulate") [`add_const<>`](types/add_cv "cpp/types/add cv") (since C++11) [`add_const_t<>`](types/add_cv "cpp/types/add cv") (since C++14) [`add_cv<>`](types/add_cv "cpp/types/add cv") (since C++11) [`add_cv_t<>`](types/add_cv "cpp/types/add cv") (since C++14) [`add_pointer<>`](types/add_pointer "cpp/types/add pointer") (since C++11) [`add_pointer_t<>`](types/add_pointer "cpp/types/add pointer") (since C++14) [`add_lvalue_reference<>`](types/add_reference "cpp/types/add reference") (since C++11) [`add_lvalue_reference_t<>`](types/add_reference "cpp/types/add reference") (since C++14) [`add_rvalue_reference<>`](types/add_reference "cpp/types/add reference") (since C++11) [`add_rvalue_reference_t<>`](types/add_reference "cpp/types/add reference") (since C++14) [`add_volatile<>`](types/add_cv "cpp/types/add cv") (since C++11) [`add_volatile_t<>`](types/add_cv "cpp/types/add cv") (since C++14) [`addressof<>()`](memory/addressof "cpp/memory/addressof") (since C++11) [`adjacent_difference<>()`](algorithm/adjacent_difference "cpp/algorithm/adjacent difference") [`adjacent_find<>()`](algorithm/adjacent_find "cpp/algorithm/adjacent find") [`adopt_lock`](thread/lock_tag "cpp/thread/lock tag") (since C++11) [`adopt_lock_t`](thread/lock_tag_t "cpp/thread/lock tag t") (since C++11) [`advance<>()`](iterator/advance "cpp/iterator/advance") [`align()`](memory/align "cpp/memory/align") (since C++11) [`align_val_t`](memory/new/align_val_t "cpp/memory/new/align val t") (since C++17) [`aligned_alloc()`](memory/c/aligned_alloc "cpp/memory/c/aligned alloc") (since C++17) [`aligned_storage<>`](types/aligned_storage "cpp/types/aligned storage") (since C++11) [`aligned_storage_t<>`](types/aligned_storage "cpp/types/aligned storage") (since C++14) [`aligned_union<>`](types/aligned_union "cpp/types/aligned union") (since C++11) [`aligned_union_t<>`](types/aligned_union "cpp/types/aligned 
union") (since C++14) [`alignment_of<>`](types/alignment_of "cpp/types/alignment of") (since C++11) [`alignment_of_v<>`](types/alignment_of "cpp/types/alignment of") (since C++17) [`all_of<>()`](algorithm/all_any_none_of "cpp/algorithm/all any none of") (since C++11) [`allocate_at_least<>()`](memory/allocate_at_least "cpp/memory/allocate at least") (since C++23) [`allocate_shared<>()`](memory/shared_ptr/allocate_shared "cpp/memory/shared ptr/allocate shared") (since C++11) [`allocate_shared_for_overwrite<>()`](memory/shared_ptr/allocate_shared "cpp/memory/shared ptr/allocate shared") (since C++20) [`allocation_result<>`](memory/allocation_result "cpp/memory/allocation result") (since C++23) [`allocator<>`](memory/allocator "cpp/memory/allocator") [`allocator_arg`](memory/allocator_arg "cpp/memory/allocator arg") (since C++11) [`allocator_arg_t`](memory/allocator_arg_t "cpp/memory/allocator arg t") (since C++11) [`allocator_traits<>`](memory/allocator_traits "cpp/memory/allocator traits") (since C++11) [`any`](utility/any "cpp/utility/any") (since C++17) [`any_cast<>()`](utility/any/any_cast "cpp/utility/any/any cast") (since C++17) [`any_of<>()`](algorithm/all_any_none_of "cpp/algorithm/all any none of") (since C++11) [`apply<>()`](utility/apply "cpp/utility/apply") (since C++17) [`arg<>()`](numeric/complex/arg "cpp/numeric/complex/arg") [`array<>`](container/array "cpp/container/array") (since C++11) [`as_bytes<>()`](container/span/as_bytes "cpp/container/span/as bytes") (since C++20) [`as_const<>()`](utility/as_const "cpp/utility/as const") (since C++17) [`as_writable_bytes<>()`](container/span/as_bytes "cpp/container/span/as bytes") (since C++20) [`asctime()`](chrono/c/asctime "cpp/chrono/c/asctime") [`asin()`](numeric/math/asin "cpp/numeric/math/asin") [`asin<>()`](numeric/complex/asin "cpp/numeric/complex/asin") (std::complex) (since C++11) [`asin<>()`](numeric/valarray/asin "cpp/numeric/valarray/asin") (std::valarray) [`asinf()`](numeric/math/asin "cpp/numeric/math/asin") (since C++11) [`asinh()`](numeric/math/asinh "cpp/numeric/math/asinh") (since C++11) [`asinh<>()`](numeric/complex/asinh "cpp/numeric/complex/asinh") (std::complex) (since C++11) [`asinhf()`](numeric/math/asinh "cpp/numeric/math/asinh") (since C++11) [`asinhl()`](numeric/math/asinh "cpp/numeric/math/asinh") (since C++11) [`asinl()`](numeric/math/asin "cpp/numeric/math/asin") (since C++11) [`assignable_from<>`](concepts/assignable_from "cpp/concepts/assignable from") (since C++20) [`assoc_laguerre()`](numeric/special_functions/assoc_laguerre "cpp/numeric/special functions/assoc laguerre") (since C++17) [`assoc_laguerref()`](numeric/special_functions/assoc_laguerre "cpp/numeric/special functions/assoc laguerre") (since C++17) [`assoc_laguerrel()`](numeric/special_functions/assoc_laguerre "cpp/numeric/special functions/assoc laguerre") (since C++17) [`assoc_legendre()`](numeric/special_functions/assoc_legendre "cpp/numeric/special functions/assoc legendre") (since C++17) [`assoc_legendref()`](numeric/special_functions/assoc_legendre "cpp/numeric/special functions/assoc legendre") (since C++17) [`assoc_legendrel()`](numeric/special_functions/assoc_legendre "cpp/numeric/special functions/assoc legendre") (since C++17) [`assume_aligned<>()`](memory/assume_aligned "cpp/memory/assume aligned") (since C++20) [`async<>()`](thread/async "cpp/thread/async") (since C++11) [`at_quick_exit()`](utility/program/at_quick_exit "cpp/utility/program/at quick exit") (since C++11) [`atan()`](numeric/math/atan "cpp/numeric/math/atan") 
[`atan<>()`](numeric/complex/atan "cpp/numeric/complex/atan") (std::complex) (since C++11) [`atan<>()`](numeric/valarray/atan "cpp/numeric/valarray/atan") (std::valarray) [`atan2()`](numeric/math/atan2 "cpp/numeric/math/atan2") [`atan2<>()`](numeric/valarray/atan2 "cpp/numeric/valarray/atan2") (std::valarray) [`atan2f()`](numeric/math/atan2 "cpp/numeric/math/atan2") (since C++11) [`atan2l()`](numeric/math/atan2 "cpp/numeric/math/atan2") (since C++11) [`atanf()`](numeric/math/atan "cpp/numeric/math/atan") (since C++11) [`atanh()`](numeric/math/atanh "cpp/numeric/math/atanh") (since C++11) [`atanh<>()`](numeric/complex/atanh "cpp/numeric/complex/atanh") (std::complex) (since C++11) [`atanhf()`](numeric/math/atanh "cpp/numeric/math/atanh") (since C++11) [`atanhl()`](numeric/math/atanh "cpp/numeric/math/atanh") (since C++11) [`atanl()`](numeric/math/atan "cpp/numeric/math/atan") (since C++11) [`atexit()`](utility/program/atexit "cpp/utility/program/atexit") [`atof()`](string/byte/atof "cpp/string/byte/atof") [`atoi()`](string/byte/atoi "cpp/string/byte/atoi") [`atol()`](string/byte/atoi "cpp/string/byte/atoi") [`atoll()`](string/byte/atoi "cpp/string/byte/atoi") (since C++11) [`atomic<>`](atomic/atomic "cpp/atomic/atomic") (since C++11) atomic<> full specializations and typedefs: [`atomic_bool`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_char`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_char16_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_char32_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_char8_t`](atomic/atomic "cpp/atomic/atomic") (since C++20) [`atomic_int`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int8_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int16_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int32_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int64_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_fast8_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_fast16_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_fast32_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_fast64_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_least8_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_least16_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_least32_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_int_least64_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_intmax_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_intptr_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_llong`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_long`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_ptrdiff_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_schar`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_short`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_signed_lock_free`](atomic/atomic "cpp/atomic/atomic") (since C++20) [`atomic_size_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uchar`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint8_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint16_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint32_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint64_t`](atomic/atomic 
"cpp/atomic/atomic") (since C++11) [`atomic_uint_fast8_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint_fast16_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint_fast32_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint_fast64_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint_least8_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint_least16_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint_least32_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uint_least64_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uintmax_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_uintptr_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_ullong`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_ulong`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_unsigned_lock_free`](atomic/atomic "cpp/atomic/atomic") (since C++20) [`atomic_ushort`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_wchar_t`](atomic/atomic "cpp/atomic/atomic") (since C++11) [`atomic_compare_exchange_strong<>()`](atomic/atomic_compare_exchange "cpp/atomic/atomic compare exchange") (since C++11) [`atomic_compare_exchange_strong<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_compare_exchange_strong_explicit<>()`](atomic/atomic_compare_exchange "cpp/atomic/atomic compare exchange") (since C++11) [`atomic_compare_exchange_strong_explicit<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_compare_exchange_weak<>()`](atomic/atomic_compare_exchange "cpp/atomic/atomic compare exchange") (since C++11) [`atomic_compare_exchange_weak<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_compare_exchange_weak_explicit<>()`](atomic/atomic_compare_exchange "cpp/atomic/atomic compare exchange") (since C++11) [`atomic_compare_exchange_weak_explicit<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_exchange<>()`](atomic/atomic_exchange "cpp/atomic/atomic exchange") (since C++11) [`atomic_exchange<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_exchange_explicit<>()`](atomic/atomic_exchange "cpp/atomic/atomic exchange") (since C++11) [`atomic_exchange_explicit<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_fetch_add<>()`](atomic/atomic_fetch_add "cpp/atomic/atomic fetch add") (since C++11) [`atomic_fetch_add_explicit<>()`](atomic/atomic_fetch_add "cpp/atomic/atomic fetch add") (since C++11) [`atomic_fetch_and<>()`](atomic/atomic_fetch_and "cpp/atomic/atomic fetch and") (since C++11) [`atomic_fetch_and_explicit<>()`](atomic/atomic_fetch_and "cpp/atomic/atomic fetch and") (since C++11) [`atomic_fetch_or<>()`](atomic/atomic_fetch_or "cpp/atomic/atomic fetch or") (since C++11) [`atomic_fetch_or_explicit<>()`](atomic/atomic_fetch_or "cpp/atomic/atomic fetch or") (since C++11) [`atomic_fetch_sub<>()`](atomic/atomic_fetch_sub "cpp/atomic/atomic fetch sub") (since C++11) [`atomic_fetch_sub_explicit<>()`](atomic/atomic_fetch_sub "cpp/atomic/atomic fetch sub") (since C++11) [`atomic_fetch_xor<>()`](atomic/atomic_fetch_xor "cpp/atomic/atomic 
fetch xor") (since C++11) [`atomic_fetch_xor_explicit<>()`](atomic/atomic_fetch_xor "cpp/atomic/atomic fetch xor") (since C++11) [`atomic_flag`](atomic/atomic_flag "cpp/atomic/atomic flag") (since C++11) [`atomic_flag_clear()`](atomic/atomic_flag_clear "cpp/atomic/atomic flag clear") (since C++11) [`atomic_flag_clear_explicit()`](atomic/atomic_flag_clear "cpp/atomic/atomic flag clear") (since C++11) [`atomic_flag_notify_all()`](atomic/atomic_flag_notify_all "cpp/atomic/atomic flag notify all") (since C++20) [`atomic_flag_notify_one()`](atomic/atomic_flag_notify_one "cpp/atomic/atomic flag notify one") (since C++20) [`atomic_flag_test()`](atomic/atomic_flag_test "cpp/atomic/atomic flag test") (since C++20) [`atomic_flag_test_explicit()`](atomic/atomic_flag_test "cpp/atomic/atomic flag test") (since C++20) [`atomic_flag_test_and_set()`](atomic/atomic_flag_test_and_set "cpp/atomic/atomic flag test and set") (since C++11) [`atomic_flag_test_and_set_explicit()`](atomic/atomic_flag_test_and_set "cpp/atomic/atomic flag test and set") (since C++11) [`atomic_flag_wait()`](atomic/atomic_flag_wait "cpp/atomic/atomic flag wait") (since C++20) [`atomic_flag_wait_explicit()`](atomic/atomic_flag_wait "cpp/atomic/atomic flag wait") (since C++20) [`atomic_init<>()`](atomic/atomic_init "cpp/atomic/atomic init") (since C++11)(deprecated in C++20) [`atomic_is_lock_free<>()`](atomic/atomic_is_lock_free "cpp/atomic/atomic is lock free") (since C++11) [`atomic_is_lock_free<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_load<>()`](atomic/atomic_load "cpp/atomic/atomic load") (since C++11) [`atomic_load<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_load_explicit<>()`](atomic/atomic_load "cpp/atomic/atomic load") (since C++11) [`atomic_load_explicit<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_notify_all<>()`](atomic/atomic_notify_all "cpp/atomic/atomic notify all") (since C++20) [`atomic_notify_one<>()`](atomic/atomic_notify_one "cpp/atomic/atomic notify one") (since C++20) [`atomic_ref<>`](atomic/atomic_ref "cpp/atomic/atomic ref") (since C++20) [`atomic_signal_fence()`](atomic/atomic_signal_fence "cpp/atomic/atomic signal fence") (since C++11) [`atomic_store<>()`](atomic/atomic_store "cpp/atomic/atomic store") (since C++11) [`atomic_store<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_store_explicit<>()`](atomic/atomic_store "cpp/atomic/atomic store") (since C++11) [`atomic_store_explicit<>()`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") (std::shared\_ptr) (since C++11)(deprecated in C++20) [`atomic_thread_fence()`](atomic/atomic_thread_fence "cpp/atomic/atomic thread fence") (since C++11) [`atomic_wait<>()`](atomic/atomic_wait "cpp/atomic/atomic wait") (since C++20) [`atomic_wait_explicit<>()`](atomic/atomic_wait "cpp/atomic/atomic wait") (since C++20) [`atto`](numeric/ratio "cpp/numeric/ratio") (since C++11) ### B [`back_insert_iterator<>`](iterator/back_insert_iterator "cpp/iterator/back insert iterator") [`back_inserter<>()`](iterator/back_inserter "cpp/iterator/back inserter") [`bad_alloc`](memory/new/bad_alloc "cpp/memory/new/bad alloc") [`bad_any_cast`](utility/any/bad_any_cast "cpp/utility/any/bad any cast") (since C++17) [`bad_array_new_length`](memory/new/bad_array_new_length 
"cpp/memory/new/bad array new length") (since C++11) [`bad_cast`](types/bad_cast "cpp/types/bad cast") [`bad_exception`](error/bad_exception "cpp/error/bad exception") [`bad_expected_access<>`](utility/expected/bad_expected_access "cpp/utility/expected/bad expected access") (since C++23) [`bad_function_call`](utility/functional/bad_function_call "cpp/utility/functional/bad function call") (since C++11) [`bad_optional_access`](utility/optional/bad_optional_access "cpp/utility/optional/bad optional access") (since C++17) [`bad_typeid`](types/bad_typeid "cpp/types/bad typeid") [`bad_variant_access`](utility/variant/bad_variant_access "cpp/utility/variant/bad variant access") (since C++17) [`bad_weak_ptr`](memory/bad_weak_ptr "cpp/memory/bad weak ptr") (since C++11) [`barrier<>`](thread/barrier "cpp/thread/barrier") (since C++20) [`basic_common_reference<>`](types/common_reference "cpp/types/common reference") (since C++20) [`basic_const_iterator<>`](https://en.cppreference.com/mwiki/index.php?title=cpp/iterator/basic_const_iterator&action=edit&redlink=1 "cpp/iterator/basic const iterator (page does not exist)") (since C++23) [`basic_filebuf<>`](io/basic_filebuf "cpp/io/basic filebuf") [`basic_format_arg<>`](utility/format/basic_format_arg "cpp/utility/format/basic format arg") (since C++20) [`basic_format_args<>`](utility/format/basic_format_args "cpp/utility/format/basic format args") (since C++20) [`basic_format_context<>`](utility/format/basic_format_context "cpp/utility/format/basic format context") (since C++20) [`basic_format_parse_context<>`](utility/format/basic_format_parse_context "cpp/utility/format/basic format parse context") (since C++20) [`basic_format_string<>`](utility/format/basic_format_string "cpp/utility/format/basic format string") (since C++20) [`basic_fstream<>`](io/basic_fstream "cpp/io/basic fstream") [`basic_ifstream<>`](io/basic_ifstream "cpp/io/basic ifstream") [`basic_istream<>`](io/basic_istream "cpp/io/basic istream") [`basic_ios<>`](io/basic_ios "cpp/io/basic ios") [`basic_iostream<>`](io/basic_iostream "cpp/io/basic iostream") [`basic_ispanstream<>`](io/basic_ispanstream "cpp/io/basic ispanstream") (since C++23) [`basic_istringstream<>`](io/basic_istringstream "cpp/io/basic istringstream") [`basic_ofstream<>`](io/basic_ofstream "cpp/io/basic ofstream") [`basic_ostream<>`](io/basic_ostream "cpp/io/basic ostream") [`basic_ospanstream<>`](io/basic_ospanstream "cpp/io/basic ospanstream") (since C++23) [`basic_ostringstream<>`](io/basic_ostringstream "cpp/io/basic ostringstream") [`basic_osyncstream<>`](io/basic_osyncstream "cpp/io/basic osyncstream") (since C++20) [`basic_regex<>`](regex/basic_regex "cpp/regex/basic regex") (since C++11) [`basic_spanbuf<>`](io/basic_spanbuf "cpp/io/basic spanbuf") (since C++23) [`basic_spanstream<>`](io/basic_spanstream "cpp/io/basic spanstream") (since C++23) [`basic_stacktrace<>`](utility/basic_stacktrace "cpp/utility/basic stacktrace") (since C++23) [`basic_streambuf<>`](io/basic_streambuf "cpp/io/basic streambuf") [`basic_string<>`](string/basic_string "cpp/string/basic string") [`basic_string_view<>`](string/basic_string_view "cpp/string/basic string view") (since C++17) [`basic_stringbuf<>`](io/basic_stringbuf "cpp/io/basic stringbuf") [`basic_stringstream<>`](io/basic_stringstream "cpp/io/basic stringstream") [`basic_syncbuf<>`](io/basic_syncbuf "cpp/io/basic syncbuf") (since C++20) [`begin<>()`](iterator/begin "cpp/iterator/begin") (since C++11) [`bernoulli_distribution<>`](numeric/random/bernoulli_distribution 
"cpp/numeric/random/bernoulli distribution") (since C++11) [`beta()`](numeric/special_functions/beta "cpp/numeric/special functions/beta") (since C++17) [`betaf()`](numeric/special_functions/beta "cpp/numeric/special functions/beta") (since C++17) [`betal()`](numeric/special_functions/beta "cpp/numeric/special functions/beta") (since C++17) [`bidirectional_iterator<>`](iterator/bidirectional_iterator "cpp/iterator/bidirectional iterator") (since C++20) [`bidirectional_iterator_tag`](iterator/iterator_tags "cpp/iterator/iterator tags") [`binary_search<>()`](algorithm/binary_search "cpp/algorithm/binary search") [`binary_semaphore`](thread/counting_semaphore "cpp/thread/counting semaphore") (since C++20) [`bind<>()`](utility/functional/bind "cpp/utility/functional/bind") (since C++11) [`bind_back<>()`](utility/functional/bind_front "cpp/utility/functional/bind front") (since C++23) [`bind_front<>()`](utility/functional/bind_front "cpp/utility/functional/bind front") (since C++20) [`binomial_distribution<>`](numeric/random/binomial_distribution "cpp/numeric/random/binomial distribution") (since C++11) [`bit_and<>`](utility/functional/bit_and "cpp/utility/functional/bit and") [`bit_cast<>()`](numeric/bit_cast "cpp/numeric/bit cast") (since C++20) [`bit_ceil<>()`](numeric/bit_ceil "cpp/numeric/bit ceil") (since C++20) [`bit_floor<>()`](numeric/bit_floor "cpp/numeric/bit floor") (since C++20) [`bit_not<>`](utility/functional/bit_not "cpp/utility/functional/bit not") (since C++14) [`bit_or<>`](utility/functional/bit_or "cpp/utility/functional/bit or") [`bit_width<>()`](numeric/bit_width "cpp/numeric/bit width") (since C++20) [`bit_xor<>`](utility/functional/bit_xor "cpp/utility/functional/bit xor") [`bitset<>`](utility/bitset "cpp/utility/bitset") [`bool_constant<>`](types/integral_constant "cpp/types/integral constant") (since C++17) [`boolalpha()`](io/manip/boolalpha "cpp/io/manip/boolalpha") [`boyer_moore_horspool_searcher<>`](utility/functional/boyer_moore_horspool_searcher "cpp/utility/functional/boyer moore horspool searcher") (since C++17) [`boyer_moore_searcher<>`](utility/functional/boyer_moore_searcher "cpp/utility/functional/boyer moore searcher") (since C++17) [`bsearch()`](algorithm/bsearch "cpp/algorithm/bsearch") [`btowc()`](string/multibyte/btowc "cpp/string/multibyte/btowc") [`byte`](types/byte "cpp/types/byte") (since C++17) [`byteswap<>()`](numeric/byteswap "cpp/numeric/byteswap") (since C++23) ### C [`c16rtomb()`](string/multibyte/c16rtomb "cpp/string/multibyte/c16rtomb") (since C++11) [`c32rtomb()`](string/multibyte/c32rtomb "cpp/string/multibyte/c32rtomb") (since C++11) [`c8rtomb()`](string/multibyte/c8rtomb "cpp/string/multibyte/c8rtomb") (since C++20) [`call_once<>()`](thread/call_once "cpp/thread/call once") (since C++11) [`calloc()`](memory/c/calloc "cpp/memory/c/calloc") [`cauchy_distribution<>`](numeric/random/cauchy_distribution "cpp/numeric/random/cauchy distribution") (since C++11) [`cbegin<>()`](iterator/begin "cpp/iterator/begin") (since C++14) [`cbrt()`](numeric/math/cbrt "cpp/numeric/math/cbrt") (since C++11) [`cbrtf()`](numeric/math/cbrt "cpp/numeric/math/cbrt") (since C++11) [`cbrtl()`](numeric/math/cbrt "cpp/numeric/math/cbrt") (since C++11) [`ceil()`](numeric/math/ceil "cpp/numeric/math/ceil") [`ceilf()`](numeric/math/ceil "cpp/numeric/math/ceil") (since C++11) [`ceill()`](numeric/math/ceil "cpp/numeric/math/ceil") (since C++11) [`cend<>()`](iterator/end "cpp/iterator/end") (since C++14) [`centi`](numeric/ratio "cpp/numeric/ratio") (since C++11) 
[`cerr`](io/cerr "cpp/io/cerr") [`char_traits<>`](string/char_traits "cpp/string/char traits") [`chars_format`](utility/chars_format "cpp/utility/chars format") (since C++17) [`chi_squared_distribution<>`](numeric/random/chi_squared_distribution "cpp/numeric/random/chi squared distribution") (since C++11) ▶ [`chrono`](symbol_index/chrono "cpp/symbol index/chrono") (since C++11) ▶ [`chrono_literals`](symbol_index/chrono_literals "cpp/symbol index/chrono literals") (since C++14) [`cin`](io/cin "cpp/io/cin") [`clamp<>()`](algorithm/clamp "cpp/algorithm/clamp") (since C++17) [`clearerr()`](io/c/clearerr "cpp/io/c/clearerr") [`clock()`](chrono/c/clock "cpp/chrono/c/clock") [`clock_t`](chrono/c/clock_t "cpp/chrono/c/clock t") [`clog`](io/clog "cpp/io/clog") [`cmatch`](regex/match_results "cpp/regex/match results") (since C++11) [`cmp_equal<>()`](utility/intcmp "cpp/utility/intcmp") (since C++20) [`cmp_greater<>()`](utility/intcmp "cpp/utility/intcmp") (since C++20) [`cmp_greater_equal<>()`](utility/intcmp "cpp/utility/intcmp") (since C++20) [`cmp_less<>()`](utility/intcmp "cpp/utility/intcmp") (since C++20) [`cmp_less_equal<>()`](utility/intcmp "cpp/utility/intcmp") (since C++20) [`cmp_not_equal<>()`](utility/intcmp "cpp/utility/intcmp") (since C++20) [`codecvt<>`](locale/codecvt "cpp/locale/codecvt") [`codecvt_base`](locale/codecvt_base "cpp/locale/codecvt base") [`codecvt_byname<>`](locale/codecvt_byname "cpp/locale/codecvt byname") [`codecvt_mode`](locale/codecvt_mode "cpp/locale/codecvt mode") (since C++11)(deprecated in C++17) [`codecvt_utf16<>`](locale/codecvt_utf16 "cpp/locale/codecvt utf16") (since C++11)(deprecated in C++17) [`codecvt_utf8<>`](locale/codecvt_utf8 "cpp/locale/codecvt utf8") (since C++11)(deprecated in C++17) [`codecvt_utf8_utf16<>`](locale/codecvt_utf8_utf16 "cpp/locale/codecvt utf8 utf16") (since C++11)(deprecated in C++17) [`collate<>`](locale/collate "cpp/locale/collate") [`collate_byname<>`](locale/collate_byname "cpp/locale/collate byname") [`common_comparison_category<>`](utility/compare/common_comparison_category "cpp/utility/compare/common comparison category") (since C++20) [`common_comparison_category_t<>`](utility/compare/common_comparison_category "cpp/utility/compare/common comparison category") (since C++20) [`common_iterator<>`](iterator/common_iterator "cpp/iterator/common iterator") (since C++20) [`common_reference<>`](types/common_reference "cpp/types/common reference") (since C++20) [`common_reference_t<>`](types/common_reference "cpp/types/common reference") (since C++20) [`common_reference_with<>`](concepts/common_reference_with "cpp/concepts/common reference with") (since C++20) [`common_type<>`](types/common_type "cpp/types/common type") (since C++11) [`common_type_t<>`](types/common_type "cpp/types/common type") (since C++14) [`common_with`](concepts/common_with "cpp/concepts/common with") (since C++20) [`comp_ellint_1()`](numeric/special_functions/comp_ellint_1 "cpp/numeric/special functions/comp ellint 1") (since C++17) [`comp_ellint_1f()`](numeric/special_functions/comp_ellint_1 "cpp/numeric/special functions/comp ellint 1") (since C++17) [`comp_ellint_1l()`](numeric/special_functions/comp_ellint_1 "cpp/numeric/special functions/comp ellint 1") (since C++17) [`comp_ellint_2()`](numeric/special_functions/comp_ellint_2 "cpp/numeric/special functions/comp ellint 2") (since C++17) [`comp_ellint_2f()`](numeric/special_functions/comp_ellint_2 "cpp/numeric/special functions/comp ellint 2") (since C++17) 
[`comp_ellint_2l()`](numeric/special_functions/comp_ellint_2 "cpp/numeric/special functions/comp ellint 2") (since C++17) [`comp_ellint_3()`](numeric/special_functions/comp_ellint_3 "cpp/numeric/special functions/comp ellint 3") (since C++17) [`comp_ellint_3f()`](numeric/special_functions/comp_ellint_3 "cpp/numeric/special functions/comp ellint 3") (since C++17) [`comp_ellint_3l()`](numeric/special_functions/comp_ellint_3 "cpp/numeric/special functions/comp ellint 3") (since C++17) [`compare_partial_order_fallback`](utility/compare/compare_partial_order_fallback "cpp/utility/compare/compare partial order fallback") (since C++20) [`compare_strong_order_fallback`](utility/compare/compare_strong_order_fallback "cpp/utility/compare/compare strong order fallback") (since C++20) [`compare_three_way`](utility/compare/compare_three_way "cpp/utility/compare/compare three way") (since C++20) [`compare_three_way_result<>`](utility/compare/compare_three_way_result "cpp/utility/compare/compare three way result") (since C++20) [`compare_three_way_result_t<>`](utility/compare/compare_three_way_result "cpp/utility/compare/compare three way result") (since C++20) [`compare_weak_order_fallback`](utility/compare/compare_weak_order_fallback "cpp/utility/compare/compare weak order fallback") (since C++20) [`complex<>`](numeric/complex "cpp/numeric/complex") ▶ [`complex_literals`](symbol_index/complex_literals "cpp/symbol index/complex literals") (since C++14) [`conditional<>`](types/conditional "cpp/types/conditional") (since C++11) [`conditional_t<>`](types/conditional "cpp/types/conditional") (since C++14) [`condition_variable`](thread/condition_variable "cpp/thread/condition variable") (since C++11) [`condition_variable_any`](thread/condition_variable_any "cpp/thread/condition variable any") (since C++11) [`conjunction<>`](types/conjunction "cpp/types/conjunction") (since C++17) [`conjunction_v<>`](types/conjunction "cpp/types/conjunction") (since C++17) [`conj<>()`](numeric/complex/conj "cpp/numeric/complex/conj") [`const_iterator<>`](https://en.cppreference.com/mwiki/index.php?title=cpp/iterator/const_iterator&action=edit&redlink=1 "cpp/iterator/const iterator (page does not exist)") (since C++23) [`const_pointer_cast<>()`](memory/shared_ptr/pointer_cast "cpp/memory/shared ptr/pointer cast") (since C++11) [`const_sentinel<>`](https://en.cppreference.com/mwiki/index.php?title=cpp/iterator/const_sentinel&action=edit&redlink=1 "cpp/iterator/const sentinel (page does not exist)") (since C++23) [`construct_at<>()`](memory/construct_at "cpp/memory/construct at") (since C++20) [`constructible_from<>`](concepts/constructible_from "cpp/concepts/constructible from") (since C++20) [`consume_header`](locale/codecvt_mode "cpp/locale/codecvt mode") (since C++11)(deprecated in C++17) [`contiguous_iterator<>`](iterator/contiguous_iterator "cpp/iterator/contiguous iterator") (since C++20) [`contiguous_iterator_tag`](iterator/iterator_tags "cpp/iterator/iterator tags") (since C++20) [`convertible_to`](concepts/convertible_to "cpp/concepts/convertible to") (since C++20) [`copy<>()`](algorithm/copy "cpp/algorithm/copy") [`copy_backward<>()`](algorithm/copy_backward "cpp/algorithm/copy backward") [`copy_constructible`](concepts/copy_constructible "cpp/concepts/copy constructible") (since C++20) [`copy_if<>()`](algorithm/copy "cpp/algorithm/copy") (since C++11) [`copy_n<>()`](algorithm/copy_n "cpp/algorithm/copy n") (since C++11) [`copyable<>`](concepts/copyable "cpp/concepts/copyable") (since C++20) 
[`copysign()`](numeric/math/copysign "cpp/numeric/math/copysign") (since C++11) [`copysignf()`](numeric/math/copysign "cpp/numeric/math/copysign") (since C++11) [`copysignl()`](numeric/math/copysign "cpp/numeric/math/copysign") (since C++11) [`coroutine_handle<>`](coroutine/coroutine_handle "cpp/coroutine/coroutine handle") (since C++20) [`coroutine_traits<>`](coroutine/coroutine_traits "cpp/coroutine/coroutine traits") (since C++20) [`cos()`](numeric/math/cos "cpp/numeric/math/cos") [`cos<>()`](numeric/complex/cos "cpp/numeric/complex/cos") (std::complex) [`cos<>()`](numeric/valarray/cos "cpp/numeric/valarray/cos") (std::valarray) [`cosf()`](numeric/math/cos "cpp/numeric/math/cos") (since C++11) [`cosh()`](numeric/math/cosh "cpp/numeric/math/cosh") [`cosh<>()`](numeric/complex/cosh "cpp/numeric/complex/cosh") (std::complex) [`cosh<>()`](numeric/valarray/cosh "cpp/numeric/valarray/cosh") (std::valarray) [`coshf()`](numeric/math/cosh "cpp/numeric/math/cosh") (since C++11) [`coshl()`](numeric/math/cosh "cpp/numeric/math/cosh") (since C++11) [`cosl()`](numeric/math/cos "cpp/numeric/math/cos") (since C++11) [`count<>()`](algorithm/count "cpp/algorithm/count") [`count_if<>()`](algorithm/count_if "cpp/algorithm/count if") [`counted_iterator<>`](iterator/counted_iterator "cpp/iterator/counted iterator") (since C++20) [`counting_semaphore<>`](thread/counting_semaphore "cpp/thread/counting semaphore") (since C++20) [`countl_one<>()`](numeric/countl_one "cpp/numeric/countl one") (since C++20) [`countl_zero<>()`](numeric/countl_zero "cpp/numeric/countl zero") (since C++20) [`countr_one<>()`](numeric/countr_one "cpp/numeric/countr one") (since C++20) [`countr_zero<>()`](numeric/countr_zero "cpp/numeric/countr zero") (since C++20) [`cout`](io/cout "cpp/io/cout") [`crbegin<>()`](iterator/rbegin "cpp/iterator/rbegin") (since C++14) [`cref<>()`](utility/functional/ref "cpp/utility/functional/ref") (since C++11) [`cregex_iterator`](regex/regex_iterator "cpp/regex/regex iterator") (since C++11) [`cregex_token_iterator`](regex/regex_token_iterator "cpp/regex/regex token iterator") (since C++11) [`crend<>()`](iterator/rend "cpp/iterator/rend") (since C++14) [`csub_match`](regex/sub_match "cpp/regex/sub match") (since C++11) [`ctime()`](chrono/c/ctime "cpp/chrono/c/ctime") [`ctype<>`](locale/ctype "cpp/locale/ctype") [`ctype<>`](locale/ctype "cpp/locale/ctype") (char) [`ctype_base`](locale/ctype_base "cpp/locale/ctype base") [`ctype_byname<>`](locale/ctype_byname "cpp/locale/ctype byname") [`current_exception()`](error/current_exception "cpp/error/current exception") (since C++11) [`cv_status`](thread/cv_status "cpp/thread/cv status") (since C++11) [`cyl_bessel_i()`](numeric/special_functions/cyl_bessel_i "cpp/numeric/special functions/cyl bessel i") (since C++17) [`cyl_bessel_if()`](numeric/special_functions/cyl_bessel_i "cpp/numeric/special functions/cyl bessel i") (since C++17) [`cyl_bessel_il()`](numeric/special_functions/cyl_bessel_i "cpp/numeric/special functions/cyl bessel i") (since C++17) [`cyl_bessel_j()`](numeric/special_functions/cyl_bessel_j "cpp/numeric/special functions/cyl bessel j") (since C++17) [`cyl_bessel_jf()`](numeric/special_functions/cyl_bessel_j "cpp/numeric/special functions/cyl bessel j") (since C++17) [`cyl_bessel_jl()`](numeric/special_functions/cyl_bessel_j "cpp/numeric/special functions/cyl bessel j") (since C++17) [`cyl_bessel_k()`](numeric/special_functions/cyl_bessel_k "cpp/numeric/special functions/cyl bessel k") (since C++17) 
[`cyl_bessel_kf()`](numeric/special_functions/cyl_bessel_k "cpp/numeric/special functions/cyl bessel k") (since C++17) [`cyl_bessel_kl()`](numeric/special_functions/cyl_bessel_k "cpp/numeric/special functions/cyl bessel k") (since C++17) [`cyl_neumann()`](numeric/special_functions/cyl_neumann "cpp/numeric/special functions/cyl neumann") (since C++17) [`cyl_neumannf()`](numeric/special_functions/cyl_neumann "cpp/numeric/special functions/cyl neumann") (since C++17) [`cyl_neumannl()`](numeric/special_functions/cyl_neumann "cpp/numeric/special functions/cyl neumann") (since C++17) ### D [`data<>()`](iterator/data "cpp/iterator/data") (since C++17) [`dec()`](io/manip/hex "cpp/io/manip/hex") [`deca`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`decay<>`](types/decay "cpp/types/decay") (since C++11) [`decay_t<>`](types/decay "cpp/types/decay") (since C++14) [`deci`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`declval<>()`](utility/declval "cpp/utility/declval") (since C++11) [`default_accessor<>`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan/default_accessor&action=edit&redlink=1 "cpp/container/mdspan/default accessor (page does not exist)") (since C++23) [`default_delete`](memory/default_delete "cpp/memory/default delete") (since C++11) [`default_initializable<>`](concepts/default_initializable "cpp/concepts/default initializable") (since C++20) [`default_random_engine`](numeric/random "cpp/numeric/random") (since C++11) [`default_searcher<>`](utility/functional/default_searcher "cpp/utility/functional/default searcher") (since C++17) [`default_sentinel`](iterator/default_sentinel_t "cpp/iterator/default sentinel t") (since C++20) [`default_sentinel_t`](iterator/default_sentinel_t "cpp/iterator/default sentinel t") (since C++20) [`defaultfloat()`](io/manip/fixed "cpp/io/manip/fixed") (since C++11) [`defer_lock`](thread/lock_tag "cpp/thread/lock tag") (since C++11) [`defer_lock_t`](thread/lock_tag_t "cpp/thread/lock tag t") (since C++11) [`denorm_absent`](types/numeric_limits/float_denorm_style "cpp/types/numeric limits/float denorm style") [`denorm_indeterminate`](types/numeric_limits/float_denorm_style "cpp/types/numeric limits/float denorm style") [`denorm_present`](types/numeric_limits/float_denorm_style "cpp/types/numeric limits/float denorm style") [`deque<>`](container/deque "cpp/container/deque") [`derived_from<>`](concepts/derived_from "cpp/concepts/derived from") (since C++20) [`destroy<>()`](memory/destroy "cpp/memory/destroy") (since C++17) [`destroy_at<>()`](memory/destroy_at "cpp/memory/destroy at") (since C++17) [`destroy_n<>()`](memory/destroy_n "cpp/memory/destroy n") (since C++17) [`destroying_delete`](memory/new/destroying_delete_t "cpp/memory/new/destroying delete t") (since C++20) [`destroying_delete_t`](memory/new/destroying_delete_t "cpp/memory/new/destroying delete t") (since C++20) [`destructible`](concepts/destructible "cpp/concepts/destructible") (since C++20) [`difftime()`](chrono/c/difftime "cpp/chrono/c/difftime") [`disable_sized_sentinel_for<>`](iterator/sized_sentinel_for "cpp/iterator/sized sentinel for") (since C++20) [`discrete_distribution<>`](numeric/random/discrete_distribution "cpp/numeric/random/discrete distribution") (since C++11) [`discard_block_engine<>`](numeric/random/discard_block_engine "cpp/numeric/random/discard block engine") (since C++11) [`disjunction<>`](types/disjunction "cpp/types/disjunction") (since C++17) [`disjunction_v<>`](types/disjunction "cpp/types/disjunction") (since C++17) 
[`distance<>()`](iterator/distance "cpp/iterator/distance") [`div()`](numeric/math/div "cpp/numeric/math/div") [`div_t`](numeric/math/div "cpp/numeric/math/div") [`divides<>`](utility/functional/divides "cpp/utility/functional/divides") [`domain_error`](error/domain_error "cpp/error/domain error") [`double_t`](numeric/math "cpp/numeric/math") (since C++11) [`dynamic_extent`](container/span/dynamic_extent "cpp/container/span/dynamic extent") (since C++20) [`dynamic_pointer_cast<>()`](memory/shared_ptr/pointer_cast "cpp/memory/shared ptr/pointer cast") (since C++11) ### E [`ellint_1()`](numeric/special_functions/ellint_1 "cpp/numeric/special functions/ellint 1") (since C++17) [`ellint_1f()`](numeric/special_functions/ellint_1 "cpp/numeric/special functions/ellint 1") (since C++17) [`ellint_1l()`](numeric/special_functions/ellint_1 "cpp/numeric/special functions/ellint 1") (since C++17) [`ellint_2()`](numeric/special_functions/ellint_2 "cpp/numeric/special functions/ellint 2") (since C++17) [`ellint_2f()`](numeric/special_functions/ellint_2 "cpp/numeric/special functions/ellint 2") (since C++17) [`ellint_2l()`](numeric/special_functions/ellint_2 "cpp/numeric/special functions/ellint 2") (since C++17) [`ellint_3()`](numeric/special_functions/ellint_3 "cpp/numeric/special functions/ellint 3") (since C++17) [`ellint_3f()`](numeric/special_functions/ellint_3 "cpp/numeric/special functions/ellint 3") (since C++17) [`ellint_3l()`](numeric/special_functions/ellint_3 "cpp/numeric/special functions/ellint 3") (since C++17) [`emit_on_flush<>()`](io/manip/emit_on_flush "cpp/io/manip/emit on flush") (since C++20) [`empty<>()`](iterator/empty "cpp/iterator/empty") (since C++17) [`enable_if<>`](types/enable_if "cpp/types/enable if") (since C++11) [`enable_if_t<>`](types/enable_if "cpp/types/enable if") (since C++14) [`enable_shared_from_this<>`](memory/enable_shared_from_this "cpp/memory/enable shared from this") (since C++11) [`end<>()`](iterator/end "cpp/iterator/end") (since C++11) [`endian`](types/endian "cpp/types/endian") (since C++20) [`endl<>()`](io/manip/endl "cpp/io/manip/endl") [`ends<>()`](io/manip/ends "cpp/io/manip/ends") [`equal<>()`](algorithm/equal "cpp/algorithm/equal") [`equal_range<>()`](algorithm/equal_range "cpp/algorithm/equal range") [`equal_to<>`](utility/functional/equal_to "cpp/utility/functional/equal to") [`equality_comparable<>`](concepts/equality_comparable "cpp/concepts/equality comparable") (since C++20) [`equality_comparable_with<>`](concepts/equality_comparable "cpp/concepts/equality comparable") (since C++20) [`equivalence_relation<>`](concepts/equivalence_relation "cpp/concepts/equivalence relation") (since C++20) [`erase<>()`](string/basic_string/erase2 "cpp/string/basic string/erase2") (std::basic\_string) (since C++20) [`erase<>()`](container/deque/erase2 "cpp/container/deque/erase2") (std::deque) (since C++20) [`erase<>()`](container/forward_list/erase2 "cpp/container/forward list/erase2") (std::forward\_list) (since C++20) [`erase<>()`](container/list/erase2 "cpp/container/list/erase2") (std::list) (since C++20) [`erase<>()`](container/vector/erase2 "cpp/container/vector/erase2") (std::vector) (since C++20) [`erase_if<>()`](string/basic_string/erase2 "cpp/string/basic string/erase2") (std::basic\_string) (since C++20) [`erase_if<>()`](container/deque/erase2 "cpp/container/deque/erase2") (std::deque) (since C++20) [`erase_if<>()`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_map/erase_if&action=edit&redlink=1 "cpp/container/flat map/erase if 
(page does not exist)") (std::flat\_map) (since C++23) [`erase_if<>()`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_multimap/erase_if&action=edit&redlink=1 "cpp/container/flat multimap/erase if (page does not exist)") (std::flat\_multimap) (since C++23) [`erase_if<>()`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_multiset/erase_if&action=edit&redlink=1 "cpp/container/flat multiset/erase if (page does not exist)") (std::flat\_multiset) (since C++23) [`erase_if<>()`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/flat_set/erase_if&action=edit&redlink=1 "cpp/container/flat set/erase if (page does not exist)") (std::flat\_set) (since C++23) [`erase_if<>()`](container/forward_list/erase2 "cpp/container/forward list/erase2") (std::forward\_list) (since C++20) [`erase_if<>()`](container/list/erase2 "cpp/container/list/erase2") (std::list) (since C++20) [`erase_if<>()`](container/map/erase_if "cpp/container/map/erase if") (std::map) (since C++20) [`erase_if<>()`](container/multimap/erase_if "cpp/container/multimap/erase if") (std::multimap) (since C++20) [`erase_if<>()`](container/multiset/erase_if "cpp/container/multiset/erase if") (std::multiset) (since C++20) [`erase_if<>()`](container/set/erase_if "cpp/container/set/erase if") (std::set) (since C++20) [`erase_if<>()`](container/unordered_map/erase_if "cpp/container/unordered map/erase if") (std::unordered\_map) (since C++20) [`erase_if<>()`](container/unordered_multimap/erase_if "cpp/container/unordered multimap/erase if") (std::unordered\_multimap) (since C++20) [`erase_if<>()`](container/unordered_multiset/erase_if "cpp/container/unordered multiset/erase if") (std::unordered\_multiset) (since C++20) [`erase_if<>()`](container/unordered_set/erase_if "cpp/container/unordered set/erase if") (std::unordered\_set) (since C++20) [`erase_if<>()`](container/vector/erase2 "cpp/container/vector/erase2") (std::vector) (since C++20) [`erf()`](numeric/math/erf "cpp/numeric/math/erf") (since C++11) [`erfc()`](numeric/math/erfc "cpp/numeric/math/erfc") (since C++11) [`erfcf()`](numeric/math/erfc "cpp/numeric/math/erfc") (since C++11) [`erfcl()`](numeric/math/erfc "cpp/numeric/math/erfc") (since C++11) [`erff()`](numeric/math/erf "cpp/numeric/math/erf") (since C++11) [`erfl()`](numeric/math/erf "cpp/numeric/math/erf") (since C++11) [`errc`](error/errc "cpp/error/errc") (since C++11) [`error_category`](error/error_category "cpp/error/error category") (since C++11) [`error_code`](error/error_code "cpp/error/error code") (since C++11) [`error_condition`](error/error_condition "cpp/error/error condition") (since C++11) [`exa`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`exception`](error/exception "cpp/error/exception") [`exception_ptr`](error/exception_ptr "cpp/error/exception ptr") (since C++11) [`exchange<>()`](utility/exchange "cpp/utility/exchange") (since C++14) [`exclusive_scan<>()`](algorithm/exclusive_scan "cpp/algorithm/exclusive scan") (since C++17) ▶ [`execution`](symbol_index/execution "cpp/symbol index/execution") (since C++17) [`exit()`](utility/program/exit "cpp/utility/program/exit") [`exp()`](numeric/math/exp "cpp/numeric/math/exp") [`exp<>()`](numeric/complex/exp "cpp/numeric/complex/exp") (std::complex) [`exp<>()`](numeric/valarray/exp "cpp/numeric/valarray/exp") (std::valarray) [`exp2()`](numeric/math/exp2 "cpp/numeric/math/exp2") (since C++11) [`exp2f()`](numeric/math/exp2 "cpp/numeric/math/exp2") (since C++11) [`exp2l()`](numeric/math/exp2 "cpp/numeric/math/exp2") 
(since C++11) [`expf()`](numeric/math/exp "cpp/numeric/math/exp") (since C++11) [`expected<>`](utility/expected "cpp/utility/expected") (since C++23) [`expint()`](numeric/special_functions/expint "cpp/numeric/special functions/expint") (since C++17) [`expintf()`](numeric/special_functions/expint "cpp/numeric/special functions/expint") (since C++17) [`expintl()`](numeric/special_functions/expint "cpp/numeric/special functions/expint") (since C++17) [`expl()`](numeric/math/exp "cpp/numeric/math/exp") (since C++11) [`expm1()`](numeric/math/expm1 "cpp/numeric/math/expm1") (since C++11) [`expm1f()`](numeric/math/expm1 "cpp/numeric/math/expm1") (since C++11) [`expm1l()`](numeric/math/expm1 "cpp/numeric/math/expm1") (since C++11) [`exponential_distribution<>`](numeric/random/exponential_distribution "cpp/numeric/random/exponential distribution") (since C++11) [`extent<>`](types/extent "cpp/types/extent") (since C++11) [`extent_v<>`](types/extent "cpp/types/extent") (since C++17) [`extents<>`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan/extents&action=edit&redlink=1 "cpp/container/mdspan/extents (page does not exist)") (since C++23) [`extreme_value_distribution<>`](numeric/random/extreme_value_distribution "cpp/numeric/random/extreme value distribution") (since C++11) ### F [`fabs()`](numeric/math/fabs "cpp/numeric/math/fabs") [`fabsf()`](numeric/math/fabs "cpp/numeric/math/fabs") (since C++11) [`fabsl()`](numeric/math/fabs "cpp/numeric/math/fabs") (since C++11) [`false_type`](types/integral_constant "cpp/types/integral constant") (since C++11) [`fclose()`](io/c/fclose "cpp/io/c/fclose") [`fdim()`](numeric/math/fdim "cpp/numeric/math/fdim") (since C++11) [`fdimf()`](numeric/math/fdim "cpp/numeric/math/fdim") (since C++11) [`fdiml()`](numeric/math/fdim "cpp/numeric/math/fdim") (since C++11) [`feclearexcept()`](numeric/fenv/feclearexcept "cpp/numeric/fenv/feclearexcept") (since C++11) [`fegetenv()`](numeric/fenv/feenv "cpp/numeric/fenv/feenv") (since C++11) [`fegetexceptflag()`](numeric/fenv/feexceptflag "cpp/numeric/fenv/feexceptflag") (since C++11) [`fegetround()`](numeric/fenv/feround "cpp/numeric/fenv/feround") (since C++11) [`feholdexcept()`](numeric/fenv/feholdexcept "cpp/numeric/fenv/feholdexcept") (since C++11) [`femto`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`fenv_t`](numeric/fenv "cpp/numeric/fenv") (since C++11) [`feof()`](io/c/feof "cpp/io/c/feof") [`feraiseexcept()`](numeric/fenv/feraiseexcept "cpp/numeric/fenv/feraiseexcept") (since C++11) [`ferror()`](io/c/ferror "cpp/io/c/ferror") [`fesetenv()`](numeric/fenv/feenv "cpp/numeric/fenv/feenv") (since C++11) [`fesetexceptflag()`](numeric/fenv/feexceptflag "cpp/numeric/fenv/feexceptflag") (since C++11) [`fesetround()`](numeric/fenv/feround "cpp/numeric/fenv/feround") (since C++11) [`fetestexcept()`](numeric/fenv/fetestexcept "cpp/numeric/fenv/fetestexcept") (since C++11) [`feupdateenv()`](numeric/fenv/feupdateenv "cpp/numeric/fenv/feupdateenv") (since C++11) [`fexcept_t`](numeric/fenv "cpp/numeric/fenv") (since C++11) [`fflush()`](io/c/fflush "cpp/io/c/fflush") [`fgetc()`](io/c/fgetc "cpp/io/c/fgetc") [`fgetpos()`](io/c/fgetpos "cpp/io/c/fgetpos") [`fgets()`](io/c/fgets "cpp/io/c/fgets") [`fgetwc()`](io/c/fgetwc "cpp/io/c/fgetwc") [`fgetws()`](io/c/fgetws "cpp/io/c/fgetws") [`FILE`](io/c/file "cpp/io/c/FILE") [`filebuf`](io/basic_filebuf "cpp/io/basic filebuf") ▶ [`filesystem`](symbol_index/filesystem "cpp/symbol index/filesystem") (since C++17) [`fill<>()`](algorithm/fill "cpp/algorithm/fill") 
[`fill_n<>()`](algorithm/fill_n "cpp/algorithm/fill n") [`find<>()`](algorithm/find "cpp/algorithm/find") [`find_end<>()`](algorithm/find_end "cpp/algorithm/find end") [`find_first_of<>()`](algorithm/find_first_of "cpp/algorithm/find first of") [`find_if<>()`](algorithm/find_if "cpp/algorithm/find if") [`find_if_not<>()`](algorithm/find_if_not "cpp/algorithm/find if not") (since C++11) [`fisher_f_distribution<>`](numeric/random/fisher_f_distribution "cpp/numeric/random/fisher f distribution") (since C++11) [`fixed()`](io/manip/fixed "cpp/io/manip/fixed") `flat_map<>` (since C++23) `flat_multimap<>` (since C++23) `flat_multiset<>` (since C++23) `flat_set<>` (since C++23) [`float_denorm_style`](types/numeric_limits/float_denorm_style "cpp/types/numeric limits/float denorm style") [`float_round_style`](types/numeric_limits/float_round_style "cpp/types/numeric limits/float round style") [`float_t`](numeric/math "cpp/numeric/math") (since C++11) [`floating_point<>`](concepts/floating_point "cpp/concepts/floating point") (since C++20) [`floor()`](numeric/math/floor "cpp/numeric/math/floor") [`floorf()`](numeric/math/floor "cpp/numeric/math/floor") (since C++11) [`floorl()`](numeric/math/floor "cpp/numeric/math/floor") (since C++11) [`flush<>()`](io/manip/flush "cpp/io/manip/flush") [`flush_emit<>()`](io/manip/flush_emit "cpp/io/manip/flush emit") (since C++20) [`fma()`](numeric/math/fma "cpp/numeric/math/fma") (since C++11) [`fmaf()`](numeric/math/fma "cpp/numeric/math/fma") (since C++11) [`fmal()`](numeric/math/fma "cpp/numeric/math/fma") (since C++11) [`fmax()`](numeric/math/fmax "cpp/numeric/math/fmax") (since C++11) [`fmaxf()`](numeric/math/fmax "cpp/numeric/math/fmax") (since C++11) [`fmaxl()`](numeric/math/fmax "cpp/numeric/math/fmax") (since C++11) [`fmin()`](numeric/math/fmin "cpp/numeric/math/fmin") (since C++11) [`fminf()`](numeric/math/fmin "cpp/numeric/math/fmin") (since C++11) [`fminl()`](numeric/math/fmin "cpp/numeric/math/fmin") (since C++11) [`fmod()`](numeric/math/fmod "cpp/numeric/math/fmod") [`fmodf()`](numeric/math/fmod "cpp/numeric/math/fmod") (since C++11) [`fmodl()`](numeric/math/fmod "cpp/numeric/math/fmod") (since C++11) [`fopen()`](io/c/fopen "cpp/io/c/fopen") [`for_each<>()`](algorithm/for_each "cpp/algorithm/for each") [`for_each_n<>()`](algorithm/for_each_n "cpp/algorithm/for each n") (since C++17) [`format<>()`](utility/format/format "cpp/utility/format/format") (since C++20) [`format_args`](utility/format/basic_format_args "cpp/utility/format/basic format args") (since C++20) [`format_context`](utility/format/basic_format_context "cpp/utility/format/basic format context") (since C++20) [`format_error()`](utility/format/format_error "cpp/utility/format/format error") (since C++20) [`format_parse_context`](utility/format/basic_format_parse_context "cpp/utility/format/basic format parse context") (since C++20) [`format_string<>`](utility/format/basic_format_string
"cpp/utility/format/basic format string") (since C++20) [`format_to<>()`](utility/format/format_to "cpp/utility/format/format to") (since C++20) [`format_to_n<>()`](utility/format/format_to_n "cpp/utility/format/format to n") (since C++20) [`format_to_n_result<>`](utility/format/format_to_n "cpp/utility/format/format to n") (since C++20) [`formatted_size<>()`](utility/format/formatted_size "cpp/utility/format/formatted size") (since C++20) [`formatter<>`](utility/format/formatter "cpp/utility/format/formatter") (since C++20) [`forward<>()`](utility/forward "cpp/utility/forward") (since C++11) [`forward_as_tuple<>()`](utility/tuple/forward_as_tuple "cpp/utility/tuple/forward as tuple") (since C++11) [`forward_iterator<>`](iterator/forward_iterator "cpp/iterator/forward iterator") (since C++20) [`forward_iterator_tag`](iterator/iterator_tags "cpp/iterator/iterator tags") [`forward_like<>()`](utility/forward_like "cpp/utility/forward like") (since C++23) [`forward_list<>`](container/forward_list "cpp/container/forward list") (since C++11) [`fpclassify()`](numeric/math/fpclassify "cpp/numeric/math/fpclassify") (since C++11) [`fpos<>`](io/fpos "cpp/io/fpos") [`fpos_t`](io/c/fpos_t "cpp/io/c/fpos t") [`fprintf()`](io/c/fprintf "cpp/io/c/fprintf") [`fputc()`](io/c/fputc "cpp/io/c/fputc") [`fputs()`](io/c/fputs "cpp/io/c/fputs") [`fputwc()`](io/c/fputwc "cpp/io/c/fputwc") [`fputws()`](io/c/fputws "cpp/io/c/fputws") [`fread()`](io/c/fread "cpp/io/c/fread") [`free()`](memory/c/free "cpp/memory/c/free") [`freopen()`](io/c/freopen "cpp/io/c/freopen") [`frexp()`](numeric/math/frexp "cpp/numeric/math/frexp") [`frexpf()`](numeric/math/frexp "cpp/numeric/math/frexp") (since C++11) [`frexpl()`](numeric/math/frexp "cpp/numeric/math/frexp") (since C++11) [`from_chars()`](utility/from_chars "cpp/utility/from chars") (since C++17) [`from_chars_result`](utility/from_chars "cpp/utility/from chars") (since C++17) [`from_range`](ranges/from_range "cpp/ranges/from range") (since C++23) [`from_range_t`](ranges/from_range "cpp/ranges/from range") (since C++23) [`front_insert_iterator<>`](iterator/front_insert_iterator "cpp/iterator/front insert iterator") [`front_inserter<>()`](iterator/front_inserter "cpp/iterator/front inserter") [`fscanf()`](io/c/fscanf "cpp/io/c/fscanf") [`fseek()`](io/c/fseek "cpp/io/c/fseek") [`fsetpos()`](io/c/fsetpos "cpp/io/c/fsetpos") [`fstream`](io/basic_fstream "cpp/io/basic fstream") [`ftell()`](io/c/ftell "cpp/io/c/ftell") [`function<>`](utility/functional/function "cpp/utility/functional/function") (since C++11) [`future<>`](thread/future "cpp/thread/future") (since C++11) [`future_category()`](thread/future_category "cpp/thread/future category") (since C++11) [`future_errc`](thread/future_errc "cpp/thread/future errc") (since C++11) [`future_error`](thread/future_error "cpp/thread/future error") (since C++11) [`future_status`](thread/future_status "cpp/thread/future status") (since C++11) [`fwide()`](io/c/fwide "cpp/io/c/fwide") [`fwprintf()`](io/c/fwprintf "cpp/io/c/fwprintf") [`fwrite()`](io/c/fwrite "cpp/io/c/fwrite") [`fwscanf()`](io/c/fwscanf "cpp/io/c/fwscanf") ### G [`gamma_distribution<>`](numeric/random/gamma_distribution "cpp/numeric/random/gamma distribution") (since C++11) [`gcd<>()`](numeric/gcd "cpp/numeric/gcd") (since C++17) [`generate<>()`](algorithm/generate "cpp/algorithm/generate") [`generate_canonical<>`](numeric/random/generate_canonical "cpp/numeric/random/generate canonical") (since C++11) [`generate_header`](locale/codecvt_mode "cpp/locale/codecvt mode") 
(since C++11)(deprecated in C++17) [`generate_n<>()`](algorithm/generate_n "cpp/algorithm/generate n") [`generic_category()`](error/generic_category "cpp/error/generic category") (since C++11) [`geometric_distribution<>`](numeric/random/geometric_distribution "cpp/numeric/random/geometric distribution") (since C++11) [`get<>()`](container/array/get "cpp/container/array/get") (std::array) (since C++11) [`get<>()`](utility/pair/get "cpp/utility/pair/get") (std::pair) (since C++11) [`get<>()`](ranges/subrange/get "cpp/ranges/subrange/get") (std::ranges::subrange) (since C++20) [`get<>()`](utility/tuple/get "cpp/utility/tuple/get") (std::tuple) (since C++11) [`get<>()`](utility/variant/get "cpp/utility/variant/get") (std::variant) (since C++17) [`get_deleter<>()`](memory/shared_ptr/get_deleter "cpp/memory/shared ptr/get deleter") (since C++11) [`get_if<>()`](utility/variant/get_if "cpp/utility/variant/get if") (since C++17) [`get_money<>()`](io/manip/get_money "cpp/io/manip/get money") (since C++11) [`get_new_handler()`](memory/new/get_new_handler "cpp/memory/new/get new handler") (since C++11) [`get_terminate()`](error/get_terminate "cpp/error/get terminate") (since C++11) [`get_time<>()`](io/manip/get_time "cpp/io/manip/get time") (since C++11) [`getline<>()`](string/basic_string/getline "cpp/string/basic string/getline") [`getc()`](io/c/fgetc "cpp/io/c/fgetc") [`getchar()`](io/c/getchar "cpp/io/c/getchar") [`getenv()`](utility/program/getenv "cpp/utility/program/getenv") [`getwc()`](io/c/fgetwc "cpp/io/c/fgetwc") [`getwchar()`](io/c/getwchar "cpp/io/c/getwchar") [`giga`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`gmtime()`](chrono/c/gmtime "cpp/chrono/c/gmtime") [`greater<>`](utility/functional/greater "cpp/utility/functional/greater") [`greater_equal<>`](utility/functional/greater_equal "cpp/utility/functional/greater equal") [`gslice`](numeric/valarray/gslice "cpp/numeric/valarray/gslice") [`gslice_array<>`](numeric/valarray/gslice_array "cpp/numeric/valarray/gslice array") ### H [`hardware_constructive_interference_size`](thread/hardware_destructive_interference_size "cpp/thread/hardware destructive interference size") (since C++17) [`hardware_destructive_interference_size`](thread/hardware_destructive_interference_size "cpp/thread/hardware destructive interference size") (since C++17) [`has_facet<>()`](locale/has_facet "cpp/locale/has facet") [`has_single_bit<>()`](numeric/has_single_bit "cpp/numeric/has single bit") (since C++20) [`has_unique_object_representations<>`](types/has_unique_object_representations "cpp/types/has unique object representations") (since C++17) [`has_unique_object_representations_v<>`](types/has_unique_object_representations "cpp/types/has unique object representations") (since C++17) [`has_virtual_destructor<>`](types/has_virtual_destructor "cpp/types/has virtual destructor") (since C++11) [`has_virtual_destructor_v<>`](types/has_virtual_destructor "cpp/types/has virtual destructor") (since C++17) [`hash<>`](utility/hash "cpp/utility/hash") (since C++11) [`hecto`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`hermite()`](numeric/special_functions/hermite "cpp/numeric/special functions/hermite") (since C++17) [`hermitef()`](numeric/special_functions/hermite "cpp/numeric/special functions/hermite") (since C++17) [`hermitel()`](numeric/special_functions/hermite "cpp/numeric/special functions/hermite") (since C++17) [`hex()`](io/manip/hex "cpp/io/manip/hex") [`hexfloat()`](io/manip/fixed "cpp/io/manip/fixed") (since C++11) 
[`holds_alternative<>()`](utility/variant/holds_alternative "cpp/utility/variant/holds alternative") (since C++17) [`hypot()`](numeric/math/hypot "cpp/numeric/math/hypot") (since C++11) [`hypotf()`](numeric/math/hypot "cpp/numeric/math/hypot") (since C++11) [`hypotl()`](numeric/math/hypot "cpp/numeric/math/hypot") (since C++11) ### I [`identity`](utility/functional/identity "cpp/utility/functional/identity") (since C++20) [`ifstream`](io/basic_ifstream "cpp/io/basic ifstream") [`ignore`](utility/tuple/ignore "cpp/utility/tuple/ignore") (since C++11) [`ilogb()`](numeric/math/ilogb "cpp/numeric/math/ilogb") (since C++11) [`ilogbf()`](numeric/math/ilogb "cpp/numeric/math/ilogb") (since C++11) [`ilogbl()`](numeric/math/ilogb "cpp/numeric/math/ilogb") (since C++11) [`imag<>()`](numeric/complex/imag2 "cpp/numeric/complex/imag2") [`imaxabs()`](numeric/math/abs "cpp/numeric/math/abs") (since C++11) [`imaxdiv()`](numeric/math/div "cpp/numeric/math/div") (since C++11) [`imaxdiv_t`](numeric/math/div "cpp/numeric/math/div") (since C++11) [`in_place`](utility/in_place "cpp/utility/in place") (since C++17) [`in_place_index<>`](utility/in_place "cpp/utility/in place") (since C++17) [`in_place_index_t<>`](utility/in_place "cpp/utility/in place") (since C++17) [`in_place_t`](utility/in_place "cpp/utility/in place") (since C++17) [`in_place_type<>`](utility/in_place "cpp/utility/in place") (since C++17) [`in_place_type_t<>`](utility/in_place "cpp/utility/in place") (since C++17) [`in_range<>()`](utility/in_range "cpp/utility/in range") (since C++20) [`includes<>()`](algorithm/includes "cpp/algorithm/includes") [`inclusive_scan<>()`](algorithm/inclusive_scan "cpp/algorithm/inclusive scan") (since C++17) [`incrementable<>`](iterator/incrementable "cpp/iterator/incrementable") (since C++20) [`incrementable_traits<>`](iterator/incrementable_traits "cpp/iterator/incrementable traits") (since C++20) [`independent_bits_engine<>`](numeric/random/independent_bits_engine "cpp/numeric/random/independent bits engine") (since C++11) [`indirect_array<>`](numeric/valarray/indirect_array "cpp/numeric/valarray/indirect array") [`indirect_binary_predicate<>`](iterator/indirect_binary_predicate "cpp/iterator/indirect binary predicate") (since C++20) [`indirect_equivalence_relation<>`](iterator/indirect_equivalence_relation "cpp/iterator/indirect equivalence relation") (since C++20) [`indirect_result_t<>`](iterator/indirect_result_t "cpp/iterator/indirect result t") (since C++20) [`indirect_strict_weak_order<>`](iterator/indirect_strict_weak_order "cpp/iterator/indirect strict weak order") (since C++20) [`indirect_unary_predicate<>`](iterator/indirect_unary_predicate "cpp/iterator/indirect unary predicate") (since C++20) [`indirectly_comparable<>`](iterator/indirectly_comparable "cpp/iterator/indirectly comparable") (since C++20) [`indirectly_copyable<>`](iterator/indirectly_copyable "cpp/iterator/indirectly copyable") (since C++20) [`indirectly_copyable_storable<>`](iterator/indirectly_copyable_storable "cpp/iterator/indirectly copyable storable") (since C++20) [`indirectly_movable<>`](iterator/indirectly_movable "cpp/iterator/indirectly movable") (since C++20) [`indirectly_movable_storable<>`](iterator/indirectly_movable_storable "cpp/iterator/indirectly movable storable") (since C++20) [`indirectly_readable<>`](iterator/indirectly_readable "cpp/iterator/indirectly readable") (since C++20) [`indirectly_readable_traits<>`](iterator/indirectly_readable_traits "cpp/iterator/indirectly readable traits") (since C++20) 
[`indirectly_regular_unary_invocable<>`](iterator/indirectly_unary_invocable "cpp/iterator/indirectly unary invocable") (since C++20) [`indirectly_swappable<>`](iterator/indirectly_swappable "cpp/iterator/indirectly swappable") (since C++20) [`indirectly_unary_invocable<>`](iterator/indirectly_unary_invocable "cpp/iterator/indirectly unary invocable") (since C++20) [`indirectly_writable<>`](iterator/indirectly_writable "cpp/iterator/indirectly writable") (since C++20) [`initializer_list<>`](utility/initializer_list "cpp/utility/initializer list") (since C++11) [`inner_product<>()`](algorithm/inner_product "cpp/algorithm/inner product") [`inout_ptr<>()`](memory/inout_ptr_t/inout_ptr "cpp/memory/inout ptr t/inout ptr") (since C++23) [`inout_ptr_t<>`](memory/inout_ptr_t "cpp/memory/inout ptr t") (since C++23) [`inplace_merge<>()`](algorithm/inplace_merge "cpp/algorithm/inplace merge") [`input_iterator<>`](iterator/input_iterator "cpp/iterator/input iterator") (since C++20) [`input_iterator_tag`](iterator/iterator_tags "cpp/iterator/iterator tags") [`input_or_output_iterator<>`](iterator/input_or_output_iterator "cpp/iterator/input or output iterator") (since C++20) [`insert_iterator<>`](iterator/insert_iterator "cpp/iterator/insert iterator") [`inserter<>()`](iterator/inserter "cpp/iterator/inserter") [`int_fast16_t`](types/integer "cpp/types/integer") (since C++11) [`int_fast32_t`](types/integer "cpp/types/integer") (since C++11) [`int_fast64_t`](types/integer "cpp/types/integer") (since C++11) [`int_fast8_t`](types/integer "cpp/types/integer") (since C++11) [`int_least16_t`](types/integer "cpp/types/integer") (since C++11) [`int_least32_t`](types/integer "cpp/types/integer") (since C++11) [`int_least64_t`](types/integer "cpp/types/integer") (since C++11) [`int_least8_t`](types/integer "cpp/types/integer") (since C++11) [`int16_t`](types/integer "cpp/types/integer") (since C++11) [`int32_t`](types/integer "cpp/types/integer") (since C++11) [`int64_t`](types/integer "cpp/types/integer") (since C++11) [`int8_t`](types/integer "cpp/types/integer") (since C++11) [`integer_sequence<>`](utility/integer_sequence "cpp/utility/integer sequence") (since C++14) [`integral<>`](concepts/integral "cpp/concepts/integral") (since C++20) [`integral_constant<>`](types/integral_constant "cpp/types/integral constant") (since C++11) [`internal()`](io/manip/left "cpp/io/manip/left") [`intmax_t`](types/integer "cpp/types/integer") (since C++11) [`intptr_t`](types/integer "cpp/types/integer") (since C++11) [`invalid_argument`](error/invalid_argument "cpp/error/invalid argument") [`invocable<>`](concepts/invocable "cpp/concepts/invocable") (since C++20) [`invoke<>()`](utility/functional/invoke "cpp/utility/functional/invoke") (since C++17) [`invoke_r<>()`](utility/functional/invoke "cpp/utility/functional/invoke") (since C++23) [`invoke_result<>`](types/result_of "cpp/types/result of") (since C++17) [`invoke_result_t<>`](types/result_of "cpp/types/result of") (since C++17) [`io_errc`](io/io_errc "cpp/io/io errc") (since C++11) [`ios`](io/basic_ios "cpp/io/basic ios") [`ios_base`](io/ios_base "cpp/io/ios base") [`iostream`](io/basic_iostream "cpp/io/basic iostream") [`iostream_category`](io/iostream_category "cpp/io/iostream category") (since C++11) [`iota<>()`](algorithm/iota "cpp/algorithm/iota") (since C++11) [`is_abstract<>`](types/is_abstract "cpp/types/is abstract") (since C++11) [`is_abstract_v<>`](types/is_abstract "cpp/types/is abstract") (since C++17) [`is_aggregate<>`](types/is_aggregate "cpp/types/is 
aggregate") (since C++17) [`is_aggregate_v<>`](types/is_aggregate "cpp/types/is aggregate") (since C++17) [`is_arithmetic<>`](types/is_arithmetic "cpp/types/is arithmetic") (since C++11) [`is_arithmetic_v<>`](types/is_arithmetic "cpp/types/is arithmetic") (since C++17) [`is_array<>`](types/is_array "cpp/types/is array") (since C++11) [`is_array_v<>`](types/is_array "cpp/types/is array") (since C++17) [`is_assignable<>`](types/is_assignable "cpp/types/is assignable") (since C++11) [`is_assignable_v<>`](types/is_assignable "cpp/types/is assignable") (since C++17) [`is_base_of<>`](types/is_base_of "cpp/types/is base of") (since C++11) [`is_base_of_v<>`](types/is_base_of "cpp/types/is base of") (since C++17) [`is_bind_expression<>`](utility/functional/is_bind_expression "cpp/utility/functional/is bind expression") (since C++11) [`is_bind_expression_v<>`](utility/functional/is_bind_expression "cpp/utility/functional/is bind expression") (since C++17) [`is_bounded_array<>`](types/is_bounded_array "cpp/types/is bounded array") (since C++20) [`is_bounded_array_v<>`](types/is_bounded_array "cpp/types/is bounded array") (since C++20) [`is_class<>`](types/is_class "cpp/types/is class") (since C++11) [`is_class_v<>`](types/is_class "cpp/types/is class") (since C++17) [`is_compound<>`](types/is_compound "cpp/types/is compound") (since C++11) [`is_compound_v<>`](types/is_compound "cpp/types/is compound") (since C++17) [`is_const<>`](types/is_const "cpp/types/is const") (since C++11) [`is_const_v<>`](types/is_const "cpp/types/is const") (since C++17) [`is_constant_evaluated()`](types/is_constant_evaluated "cpp/types/is constant evaluated") (since C++20) [`is_constructible<>`](types/is_constructible "cpp/types/is constructible") (since C++11) [`is_constructible_v<>`](types/is_constructible "cpp/types/is constructible") (since C++17) [`is_convertible<>`](types/is_convertible "cpp/types/is convertible") (since C++11) [`is_convertible_v<>`](types/is_convertible "cpp/types/is convertible") (since C++17) [`is_copy_assignable<>`](types/is_copy_assignable "cpp/types/is copy assignable") (since C++11) [`is_copy_assignable_v<>`](types/is_copy_assignable "cpp/types/is copy assignable") (since C++17) [`is_copy_constructible<>`](types/is_copy_constructible "cpp/types/is copy constructible") (since C++11) [`is_copy_constructible_v<>`](types/is_copy_constructible "cpp/types/is copy constructible") (since C++17) [`is_corresponding_member<>()`](types/is_corresponding_member "cpp/types/is corresponding member") (since C++20) [`is_default_constructible<>`](types/is_default_constructible "cpp/types/is default constructible") (since C++11) [`is_default_constructible_v<>`](types/is_default_constructible "cpp/types/is default constructible") (since C++17) [`is_destructible<>`](types/is_destructible "cpp/types/is destructible") (since C++11) [`is_destructible_v<>`](types/is_destructible "cpp/types/is destructible") (since C++17) [`is_empty<>`](types/is_empty "cpp/types/is empty") (since C++11) [`is_empty_v<>`](types/is_empty "cpp/types/is empty") (since C++17) [`is_enum<>`](types/is_enum "cpp/types/is enum") (since C++11) [`is_enum_v<>`](types/is_enum "cpp/types/is enum") (since C++17) [`is_eq()`](utility/compare/named_comparison_functions "cpp/utility/compare/named comparison functions") (since C++20) [`is_error_code_enum<>()`](error/error_code/is_error_code_enum "cpp/error/error code/is error code enum") (since C++11) [`is_error_condition_enum`](error/error_condition/is_error_condition_enum "cpp/error/error condition/is error 
condition enum") (since C++11) [`is_error_condition_enum_v`](error/error_condition/is_error_condition_enum "cpp/error/error condition/is error condition enum") (since C++17) [`is_execution_policy<>`](algorithm/is_execution_policy "cpp/algorithm/is execution policy") (since C++17) [`is_execution_policy_v<>`](algorithm/is_execution_policy "cpp/algorithm/is execution policy") (since C++17) [`is_final<>`](types/is_final "cpp/types/is final") (since C++14) [`is_final_v<>`](types/is_final "cpp/types/is final") (since C++17) [`is_floating_point<>`](types/is_floating_point "cpp/types/is floating point") (since C++11) [`is_floating_point_v<>`](types/is_floating_point "cpp/types/is floating point") (since C++17) [`is_function<>`](types/is_function "cpp/types/is function") (since C++11) [`is_function_v<>`](types/is_function "cpp/types/is function") (since C++17) [`is_fundamental<>`](types/is_fundamental "cpp/types/is fundamental") (since C++11) [`is_fundamental_v<>`](types/is_fundamental "cpp/types/is fundamental") (since C++17) [`is_gt()`](utility/compare/named_comparison_functions "cpp/utility/compare/named comparison functions") (since C++20) [`is_gteq()`](utility/compare/named_comparison_functions "cpp/utility/compare/named comparison functions") (since C++20) [`is_heap<>()`](algorithm/is_heap "cpp/algorithm/is heap") (since C++11) [`is_heap_until<>()`](algorithm/is_heap_until "cpp/algorithm/is heap until") (since C++11) [`is_integral<>`](types/is_integral "cpp/types/is integral") (since C++11) [`is_integral_v<>`](types/is_integral "cpp/types/is integral") (since C++17) [`is_invocable<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_invocable_r<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_invocable_r_v<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_invocable_v<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_layout_compatible<>`](types/is_layout_compatible "cpp/types/is layout compatible") (since C++20) [`is_layout_compatible_v<>`](types/is_layout_compatible "cpp/types/is layout compatible") (since C++20) [`is_lt()`](utility/compare/named_comparison_functions "cpp/utility/compare/named comparison functions") (since C++20) [`is_lteq()`](utility/compare/named_comparison_functions "cpp/utility/compare/named comparison functions") (since C++20) [`is_lvalue_reference<>`](types/is_lvalue_reference "cpp/types/is lvalue reference") (since C++11) [`is_lvalue_reference_v<>`](types/is_lvalue_reference "cpp/types/is lvalue reference") (since C++17) [`is_member_function_pointer<>`](types/is_member_function_pointer "cpp/types/is member function pointer") (since C++11) [`is_member_function_pointer_v<>`](types/is_member_function_pointer "cpp/types/is member function pointer") (since C++17) [`is_member_object_pointer<>`](types/is_member_object_pointer "cpp/types/is member object pointer") (since C++11) [`is_member_object_pointer_v<>`](types/is_member_object_pointer "cpp/types/is member object pointer") (since C++17) [`is_member_pointer<>`](types/is_member_pointer "cpp/types/is member pointer") (since C++11) [`is_member_pointer_v<>`](types/is_member_pointer "cpp/types/is member pointer") (since C++17) [`is_move_assignable<>`](types/is_move_assignable "cpp/types/is move assignable") (since C++11) [`is_move_assignable_v<>`](types/is_move_assignable "cpp/types/is move assignable") (since C++17) [`is_move_constructible<>`](types/is_move_constructible "cpp/types/is move constructible") (since C++11) 
[`is_move_constructible_v<>`](types/is_move_constructible "cpp/types/is move constructible") (since C++17) [`is_neq()`](utility/compare/named_comparison_functions "cpp/utility/compare/named comparison functions") (since C++20) [`is_nothrow_assignable<>`](types/is_assignable "cpp/types/is assignable") (since C++11) [`is_nothrow_assignable_v<>`](types/is_assignable "cpp/types/is assignable") (since C++17) [`is_nothrow_constructible<>`](types/is_constructible "cpp/types/is constructible") (since C++11) [`is_nothrow_constructible_v<>`](types/is_constructible "cpp/types/is constructible") (since C++17) [`is_nothrow_convertible<>`](types/is_convertible "cpp/types/is convertible") (since C++20) [`is_nothrow_convertible_v<>`](types/is_convertible "cpp/types/is convertible") (since C++20) [`is_nothrow_copy_assignable<>`](types/is_copy_assignable "cpp/types/is copy assignable") (since C++11) [`is_nothrow_copy_assignable_v<>`](types/is_copy_assignable "cpp/types/is copy assignable") (since C++17) [`is_nothrow_copy_constructible<>`](types/is_copy_constructible "cpp/types/is copy constructible") (since C++11) [`is_nothrow_copy_constructible_v<>`](types/is_copy_constructible "cpp/types/is copy constructible") (since C++17) [`is_nothrow_default_constructible<>`](types/is_default_constructible "cpp/types/is default constructible") (since C++11) [`is_nothrow_default_constructible_v<>`](types/is_default_constructible "cpp/types/is default constructible") (since C++17) [`is_nothrow_destructible<>`](types/is_destructible "cpp/types/is destructible") (since C++11) [`is_nothrow_destructible_v<>`](types/is_destructible "cpp/types/is destructible") (since C++17) [`is_nothrow_invocable<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_nothrow_invocable_r<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_nothrow_invocable_r_v<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_nothrow_invocable_v<>`](types/is_invocable "cpp/types/is invocable") (since C++17) [`is_nothrow_move_assignable<>`](types/is_move_assignable "cpp/types/is move assignable") (since C++11) [`is_nothrow_move_assignable_v<>`](types/is_move_assignable "cpp/types/is move assignable") (since C++17) [`is_nothrow_move_constructible<>`](types/is_move_constructible "cpp/types/is move constructible") (since C++11) [`is_nothrow_move_constructible_v<>`](types/is_move_constructible "cpp/types/is move constructible") (since C++17) [`is_nothrow_swappable<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_nothrow_swappable_v<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_nothrow_swappable_with<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_nothrow_swappable_with_v<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_null_pointer<>`](types/is_null_pointer "cpp/types/is null pointer") (since C++11) [`is_null_pointer_v<>`](types/is_null_pointer "cpp/types/is null pointer") (since C++17) [`is_object<>`](types/is_object "cpp/types/is object") (since C++11) [`is_object_v<>`](types/is_object "cpp/types/is object") (since C++17) [`is_partitioned<>()`](algorithm/is_partitioned "cpp/algorithm/is partitioned") (since C++11) [`is_permutation<>()`](algorithm/is_permutation "cpp/algorithm/is permutation") (since C++11) [`is_placeholder<>`](utility/functional/is_placeholder "cpp/utility/functional/is placeholder") (since C++11) [`is_placeholder_v<>`](utility/functional/is_placeholder "cpp/utility/functional/is placeholder") (since C++17) 
[`is_pod<>`](types/is_pod "cpp/types/is pod") (since C++11)(deprecated in C++20) [`is_pod_v<>`](types/is_pod "cpp/types/is pod") (since C++17)(deprecated in C++20) [`is_pointer<>`](types/is_pointer "cpp/types/is pointer") (since C++11) [`is_pointer_v<>`](types/is_pointer "cpp/types/is pointer") (since C++17) [`is_pointer_interconvertible_base_of<>`](types/is_pointer_interconvertible_base_of "cpp/types/is pointer interconvertible base of") (since C++20) [`is_pointer_interconvertible_base_of_v<>`](types/is_pointer_interconvertible_base_of "cpp/types/is pointer interconvertible base of") (since C++20) [`is_pointer_interconvertible_with_class<>()`](types/is_pointer_interconvertible_with_class "cpp/types/is pointer interconvertible with class") (since C++20) [`is_polymorphic<>`](types/is_polymorphic "cpp/types/is polymorphic") (since C++11) [`is_polymorphic_v<>`](types/is_polymorphic "cpp/types/is polymorphic") (since C++17) [`is_reference<>`](types/is_reference "cpp/types/is reference") (since C++11) [`is_reference_v<>`](types/is_reference "cpp/types/is reference") (since C++17) [`is_rvalue_reference<>`](types/is_rvalue_reference "cpp/types/is rvalue reference") (since C++11) [`is_rvalue_reference_v<>`](types/is_rvalue_reference "cpp/types/is rvalue reference") (since C++17) [`is_same<>`](types/is_same "cpp/types/is same") (since C++11) [`is_same_v<>`](types/is_same "cpp/types/is same") (since C++17) [`is_scalar<>`](types/is_scalar "cpp/types/is scalar") (since C++11) [`is_scalar_v<>`](types/is_scalar "cpp/types/is scalar") (since C++17) [`is_scoped_enum<>`](types/is_scoped_enum "cpp/types/is scoped enum") (since C++23) [`is_scoped_enum_v<>`](types/is_scoped_enum "cpp/types/is scoped enum") (since C++23) [`is_signed<>`](types/is_signed "cpp/types/is signed") (since C++11) [`is_signed_v<>`](types/is_signed "cpp/types/is signed") (since C++17) [`is_sorted<>()`](algorithm/is_sorted "cpp/algorithm/is sorted") (since C++11) [`is_sorted_until<>()`](algorithm/is_sorted_until "cpp/algorithm/is sorted until") (since C++11) [`is_standard_layout<>`](types/is_standard_layout "cpp/types/is standard layout") (since C++11) [`is_standard_layout_v<>`](types/is_standard_layout "cpp/types/is standard layout") (since C++17) [`is_swappable<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_swappable_v<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_swappable_with<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_swappable_with_v<>`](types/is_swappable "cpp/types/is swappable") (since C++17) [`is_trivial<>`](types/is_trivial "cpp/types/is trivial") (since C++11) [`is_trivial_v<>`](types/is_trivial "cpp/types/is trivial") (since C++17) [`is_trivially_assignable<>`](types/is_assignable "cpp/types/is assignable") (since C++11) [`is_trivially_assignable_v<>`](types/is_assignable "cpp/types/is assignable") (since C++17) [`is_trivially_constructible<>`](types/is_constructible "cpp/types/is constructible") (since C++11) [`is_trivially_constructible_v<>`](types/is_constructible "cpp/types/is constructible") (since C++17) [`is_trivially_copyable<>`](types/is_trivially_copyable "cpp/types/is trivially copyable") (since C++11) [`is_trivially_copyable_v<>`](types/is_trivially_copyable "cpp/types/is trivially copyable") (since C++17) [`is_trivially_copy_assignable<>`](types/is_copy_assignable "cpp/types/is copy assignable") (since C++11) [`is_trivially_copy_assignable_v<>`](types/is_copy_assignable "cpp/types/is copy assignable") (since C++17) 
[`is_trivially_copy_constructible<>`](types/is_copy_constructible "cpp/types/is copy constructible") (since C++11) [`is_trivially_copy_constructible_v<>`](types/is_copy_constructible "cpp/types/is copy constructible") (since C++17) [`is_trivially_default_constructible<>`](types/is_default_constructible "cpp/types/is default constructible") (since C++11) [`is_trivially_default_constructible_v<>`](types/is_default_constructible "cpp/types/is default constructible") (since C++17) [`is_trivially_destructible<>`](types/is_destructible "cpp/types/is destructible") (since C++11) [`is_trivially_destructible_v<>`](types/is_destructible "cpp/types/is destructible") (since C++17) [`is_trivially_move_assignable<>`](types/is_move_assignable "cpp/types/is move assignable") (since C++11) [`is_trivially_move_assignable_v<>`](types/is_move_assignable "cpp/types/is move assignable") (since C++17) [`is_trivially_move_constructible<>`](types/is_move_constructible "cpp/types/is move constructible") (since C++11) [`is_trivially_move_constructible_v<>`](types/is_move_constructible "cpp/types/is move constructible") (since C++17) [`is_unbounded_array<>`](types/is_unbounded_array "cpp/types/is unbounded array") (since C++20) [`is_unbounded_array_v<>`](types/is_unbounded_array "cpp/types/is unbounded array") (since C++20) [`is_union<>`](types/is_union "cpp/types/is union") (since C++11) [`is_union_v<>`](types/is_union "cpp/types/is union") (since C++17) [`is_unsigned<>`](types/is_unsigned "cpp/types/is unsigned") (since C++11) [`is_unsigned_v<>`](types/is_unsigned "cpp/types/is unsigned") (since C++17) [`is_void<>`](types/is_void "cpp/types/is void") (since C++11) [`is_void_v<>`](types/is_void "cpp/types/is void") (since C++17) [`is_volatile<>`](types/is_volatile "cpp/types/is volatile") (since C++11) [`is_volatile_v<>`](types/is_volatile "cpp/types/is volatile") (since C++17) [`isalnum()`](string/byte/isalnum "cpp/string/byte/isalnum") [`isalnum<>()`](locale/isalnum "cpp/locale/isalnum") (locale) [`isalpha()`](string/byte/isalpha "cpp/string/byte/isalpha") [`isalpha<>()`](locale/isalpha "cpp/locale/isalpha") (locale) [`isblank()`](string/byte/isblank "cpp/string/byte/isblank") (since C++11) [`isblank<>()`](locale/isblank "cpp/locale/isblank") (locale) (since C++11) [`iscntrl()`](string/byte/iscntrl "cpp/string/byte/iscntrl") [`iscntrl<>()`](locale/iscntrl "cpp/locale/iscntrl") (locale) [`isdigit()`](string/byte/isdigit "cpp/string/byte/isdigit") [`isdigit<>()`](locale/isdigit "cpp/locale/isdigit") (locale) [`isfinite()`](numeric/math/isfinite "cpp/numeric/math/isfinite") (since C++11) [`isgraph()`](string/byte/isgraph "cpp/string/byte/isgraph") [`isgraph<>()`](locale/isgraph "cpp/locale/isgraph") (locale) [`isgreater()`](numeric/math/isgreater "cpp/numeric/math/isgreater") (since C++11) [`isgreaterequal()`](numeric/math/isgreaterequal "cpp/numeric/math/isgreaterequal") (since C++11) [`isinf()`](numeric/math/isinf "cpp/numeric/math/isinf") (since C++11) [`isless()`](numeric/math/isless "cpp/numeric/math/isless") (since C++11) [`islessequal()`](numeric/math/islessequal "cpp/numeric/math/islessequal") (since C++11) [`islessgreater()`](numeric/math/islessgreater "cpp/numeric/math/islessgreater") (since C++11) [`islower()`](string/byte/islower "cpp/string/byte/islower") [`islower<>()`](locale/islower "cpp/locale/islower") (locale) [`isnan()`](numeric/math/isnan "cpp/numeric/math/isnan") (since C++11) [`isnormal()`](numeric/math/isnormal "cpp/numeric/math/isnormal") (since C++11) [`ispanstream`](io/basic_ispanstream 
"cpp/io/basic ispanstream") (since C++23) [`isprint()`](string/byte/isprint "cpp/string/byte/isprint") [`isprint<>()`](locale/isprint "cpp/locale/isprint") (locale) [`ispunct()`](string/byte/ispunct "cpp/string/byte/ispunct") [`ispunct<>()`](locale/ispunct "cpp/locale/ispunct") (locale) [`isspace()`](string/byte/isspace "cpp/string/byte/isspace") [`isspace<>()`](locale/isspace "cpp/locale/isspace") (locale) [`istream`](io/basic_istream "cpp/io/basic istream") [`istreambuf_iterator<>`](iterator/istreambuf_iterator "cpp/iterator/istreambuf iterator") [`istream_iterator<>`](iterator/istream_iterator "cpp/iterator/istream iterator") [`istringstream`](io/basic_istringstream "cpp/io/basic istringstream") [`istrstream`](io/istrstream "cpp/io/istrstream") (deprecated in C++98) [`isunordered()`](numeric/math/isunordered "cpp/numeric/math/isunordered") (since C++11) [`isupper()`](string/byte/isupper "cpp/string/byte/isupper") [`isupper<>()`](locale/isupper "cpp/locale/isupper") (locale) [`iswalnum()`](string/wide/iswalnum "cpp/string/wide/iswalnum") [`iswalpha()`](string/wide/iswalpha "cpp/string/wide/iswalpha") [`iswblank()`](string/wide/iswblank "cpp/string/wide/iswblank") (since C++11) [`iswcntrl()`](string/wide/iswcntrl "cpp/string/wide/iswcntrl") [`iswctype()`](string/wide/iswctype "cpp/string/wide/iswctype") [`iswdigit()`](string/wide/iswdigit "cpp/string/wide/iswdigit") [`iswgraph()`](string/wide/iswgraph "cpp/string/wide/iswgraph") [`iswlower()`](string/wide/iswlower "cpp/string/wide/iswlower") [`iswprint()`](string/wide/iswprint "cpp/string/wide/iswprint") [`iswpunct()`](string/wide/iswpunct "cpp/string/wide/iswpunct") [`iswspace()`](string/wide/iswspace "cpp/string/wide/iswspace") [`iswupper()`](string/wide/iswupper "cpp/string/wide/iswupper") [`iswxdigit()`](string/wide/iswxdigit "cpp/string/wide/iswxdigit") [`isxdigit()`](string/byte/isxdigit "cpp/string/byte/isxdigit") [`isxdigit<>()`](locale/isxdigit "cpp/locale/isxdigit") (locale) [`iter_common_reference_t<>`](iterator/iter_t "cpp/iterator/iter t") (since C++20) [`iter_const_reference_t<>`](iterator/iter_t "cpp/iterator/iter t") (since C++23) [`iter_difference_t<>`](iterator/iter_t "cpp/iterator/iter t") (since C++20) [`iter_reference_t<>`](iterator/iter_t "cpp/iterator/iter t") (since C++20) [`iter_rvalue_reference_t<>`](iterator/iter_t "cpp/iterator/iter t") (since C++20) [`iter_swap<>()`](algorithm/iter_swap "cpp/algorithm/iter swap") [`iter_value_t<>`](iterator/iter_t "cpp/iterator/iter t") (since C++20) [`iterator<>`](iterator/iterator "cpp/iterator/iterator") (deprecated in C++17) [`iterator_traits<>`](iterator/iterator_traits "cpp/iterator/iterator traits") ### J [`jmp_buf`](utility/program/jmp_buf "cpp/utility/program/jmp buf") [`jthread`](thread/jthread "cpp/thread/jthread") (since C++20) ### K [`kill_dependency<>()`](atomic/kill_dependency "cpp/atomic/kill dependency") (since C++11) [`kilo`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`knuth_b`](numeric/random "cpp/numeric/random") (since C++11) ### L [`labs()`](numeric/math/abs "cpp/numeric/math/abs") [`laguerre()`](numeric/special_functions/laguerre "cpp/numeric/special functions/laguerre") (since C++17) [`laguerref()`](numeric/special_functions/laguerre "cpp/numeric/special functions/laguerre") (since C++17) [`laguerrel()`](numeric/special_functions/laguerre "cpp/numeric/special functions/laguerre") (since C++17) [`latch`](thread/latch "cpp/thread/latch") (since C++20) [`launch`](thread/launch "cpp/thread/launch") (since C++11) [`launder<>()`](utility/launder 
"cpp/utility/launder") (since C++17) [`layout_left`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan/layout_left&action=edit&redlink=1 "cpp/container/mdspan/layout left (page does not exist)") (since C++23) [`layout_right`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan/layout_right&action=edit&redlink=1 "cpp/container/mdspan/layout right (page does not exist)") (since C++23) [`layout_stride`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan/layout_stride&action=edit&redlink=1 "cpp/container/mdspan/layout stride (page does not exist)") (since C++23) [`lcm<>()`](numeric/lcm "cpp/numeric/lcm") (since C++17) [`lconv`](locale/lconv "cpp/locale/lconv") [`ldexp()`](numeric/math/ldexp "cpp/numeric/math/ldexp") [`ldexpf()`](numeric/math/ldexp "cpp/numeric/math/ldexp") (since C++11) [`ldexpl()`](numeric/math/ldexp "cpp/numeric/math/ldexp") (since C++11) [`ldiv()`](numeric/math/div "cpp/numeric/math/div") [`ldiv_t`](numeric/math/div "cpp/numeric/math/div") [`left()`](io/manip/left "cpp/io/manip/left") [`legendre()`](numeric/special_functions/legendre "cpp/numeric/special functions/legendre") (since C++17) [`legendref()`](numeric/special_functions/legendre "cpp/numeric/special functions/legendre") (since C++17) [`legendrel()`](numeric/special_functions/legendre "cpp/numeric/special functions/legendre") (since C++17) [`length_error`](error/length_error "cpp/error/length error") [`lerp()`](numeric/lerp "cpp/numeric/lerp") (since C++20) [`less<>`](utility/functional/less "cpp/utility/functional/less") [`less_equal<>`](utility/functional/less_equal "cpp/utility/functional/less equal") [`lexicographical_compare<>()`](algorithm/lexicographical_compare "cpp/algorithm/lexicographical compare") [`lexicographical_compare_three_way<>()`](algorithm/lexicographical_compare_three_way "cpp/algorithm/lexicographical compare three way") (since C++20) [`lgamma()`](numeric/math/lgamma "cpp/numeric/math/lgamma") (since C++11) [`lgammaf()`](numeric/math/lgamma "cpp/numeric/math/lgamma") (since C++11) [`lgammal()`](numeric/math/lgamma "cpp/numeric/math/lgamma") (since C++11) [`linear_congruential_engine<>`](numeric/random/linear_congruential_engine "cpp/numeric/random/linear congruential engine") (since C++11) [`list<>`](container/list "cpp/container/list") ▶ [`literals`](symbol_index/literals "cpp/symbol index/literals") (since C++14) [`little_endian`](locale/codecvt_mode "cpp/locale/codecvt mode") (since C++11)(deprecated in C++17) [`llabs()`](numeric/math/abs "cpp/numeric/math/abs") (since C++11) [`lldiv()`](numeric/math/div "cpp/numeric/math/div") (since C++11) [`lldiv_t`](numeric/math/div "cpp/numeric/math/div") (since C++11) [`llrint()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`llrintf()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`llrintl()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`llround()`](numeric/math/round "cpp/numeric/math/round") (since C++11) [`llroundf()`](numeric/math/round "cpp/numeric/math/round") (since C++11) [`llroundl()`](numeric/math/round "cpp/numeric/math/round") (since C++11) [`locale`](locale/locale "cpp/locale/locale") [`localeconv()`](locale/localeconv "cpp/locale/localeconv") [`localtime()`](chrono/c/localtime "cpp/chrono/c/localtime") [`lock<>()`](thread/lock "cpp/thread/lock") (since C++11) [`lock_guard<>`](thread/lock_guard "cpp/thread/lock guard") (since C++11) [`log()`](numeric/math/log "cpp/numeric/math/log") [`log<>()`](numeric/complex/log 
"cpp/numeric/complex/log") (std::complex) [`log<>()`](numeric/valarray/log "cpp/numeric/valarray/log") (std::valarray) [`log10()`](numeric/math/log10 "cpp/numeric/math/log10") [`log10<>()`](numeric/complex/log10 "cpp/numeric/complex/log10") (std::complex) [`log10<>()`](numeric/valarray/log10 "cpp/numeric/valarray/log10") (std::valarray) [`log10f()`](numeric/math/log10 "cpp/numeric/math/log10") (since C++11) [`log10l()`](numeric/math/log10 "cpp/numeric/math/log10") (since C++11) [`log1p()`](numeric/math/log1p "cpp/numeric/math/log1p") (since C++11) [`log1pf()`](numeric/math/log1p "cpp/numeric/math/log1p") (since C++11) [`log1pl()`](numeric/math/log1p "cpp/numeric/math/log1p") (since C++11) [`log2()`](numeric/math/log2 "cpp/numeric/math/log2") (since C++11) [`log2f()`](numeric/math/log2 "cpp/numeric/math/log2") (since C++11) [`log2l()`](numeric/math/log2 "cpp/numeric/math/log2") (since C++11) [`logb()`](numeric/math/logb "cpp/numeric/math/logb") (since C++11) [`logbf()`](numeric/math/logb "cpp/numeric/math/logb") (since C++11) [`logbl()`](numeric/math/logb "cpp/numeric/math/logb") (since C++11) [`logf()`](numeric/math/log "cpp/numeric/math/log") (since C++11) [`logic_error`](error/logic_error "cpp/error/logic error") [`logical_and<>`](utility/functional/logical_and "cpp/utility/functional/logical and") [`logical_not<>`](utility/functional/logical_not "cpp/utility/functional/logical not") [`logical_or<>`](utility/functional/logical_or "cpp/utility/functional/logical or") [`logl()`](numeric/math/log "cpp/numeric/math/log") (since C++11) [`lognormal_distribution<>`](numeric/random/lognormal_distribution "cpp/numeric/random/lognormal distribution") (since C++11) [`longjmp()`](utility/program/longjmp "cpp/utility/program/longjmp") [`lower_bound<>()`](algorithm/lower_bound "cpp/algorithm/lower bound") [`lrint()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`lrintf()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`lrintl()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`lround()`](numeric/math/round "cpp/numeric/math/round") (since C++11) [`lroundf()`](numeric/math/round "cpp/numeric/math/round") (since C++11) [`lroundl()`](numeric/math/round "cpp/numeric/math/round") (since C++11) ### M [`make_any<>()`](utility/any/make_any "cpp/utility/any/make any") (since C++17) [`make_const_iterator<>()`](https://en.cppreference.com/mwiki/index.php?title=cpp/iterator/make_const_iterator&action=edit&redlink=1 "cpp/iterator/make const iterator (page does not exist)") (since C++23) [`make_const_sentinel<>()`](https://en.cppreference.com/mwiki/index.php?title=cpp/iterator/make_const_sentinel&action=edit&redlink=1 "cpp/iterator/make const sentinel (page does not exist)") (since C++23) [`make_error_code`](error/errc/make_error_code "cpp/error/errc/make error code") (std::errc) (since C++11) [`make_error_code`](io/io_errc/make_error_code "cpp/io/io errc/make error code") (std::io\_errc) (since C++11) [`make_error_condition`](error/errc/make_error_condition "cpp/error/errc/make error condition") (std::errc) (since C++11) [`make_error_condition`](io/io_errc/make_error_condition "cpp/io/io errc/make error condition") (std::io\_errc) (since C++11) [`make_exception_ptr<>()`](error/make_exception_ptr "cpp/error/make exception ptr") (since C++11) [`make_format_args<>()`](utility/format/make_format_args "cpp/utility/format/make format args") (since C++20) [`make_from_tuple<>()`](utility/make_from_tuple "cpp/utility/make from tuple") (since C++17) [`make_heap<>()`](algorithm/make_heap 
"cpp/algorithm/make heap") [`make_move_iterator<>()`](iterator/make_move_iterator "cpp/iterator/make move iterator") (since C++11) [`make_obj_using_allocator<>()`](memory/make_obj_using_allocator "cpp/memory/make obj using allocator") (since C++20) [`make_optional<>()`](utility/optional/make_optional "cpp/utility/optional/make optional") (since C++17) [`make_pair<>`](utility/pair/make_pair "cpp/utility/pair/make pair") [`make_reverse_iterator<>()`](iterator/make_reverse_iterator "cpp/iterator/make reverse iterator") (since C++14) [`make_shared<>()`](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared") (since C++11) [`make_shared_for_overwrite<>()`](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared") (since C++20) [`make_signed<>`](types/make_signed "cpp/types/make signed") (since C++11) [`make_signed_t<>`](types/make_signed "cpp/types/make signed") (since C++14) [`make_tuple<>()`](utility/tuple/make_tuple "cpp/utility/tuple/make tuple") (since C++11) [`make_unique<>()`](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique") (since C++14) [`make_unique_for_overwrite<>()`](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique") (since C++20) [`make_unsigned<>`](types/make_unsigned "cpp/types/make unsigned") (since C++11) [`make_unsigned_t<>`](types/make_unsigned "cpp/types/make unsigned") (since C++14) [`make_wformat_args<>()`](utility/format/make_format_args "cpp/utility/format/make format args") (since C++20) [`malloc()`](memory/c/malloc "cpp/memory/c/malloc") [`map<>`](container/map "cpp/container/map") [`mask_array<>`](numeric/valarray/mask_array "cpp/numeric/valarray/mask array") [`match_results<>`](regex/match_results "cpp/regex/match results") (since C++11) [`max<>()`](algorithm/max "cpp/algorithm/max") [`max_align_t`](types/max_align_t "cpp/types/max align t") (since C++11) [`max_element<>()`](algorithm/max_element "cpp/algorithm/max element") [`mblen()`](string/multibyte/mblen "cpp/string/multibyte/mblen") [`mbrlen()`](string/multibyte/mbrlen "cpp/string/multibyte/mbrlen") [`mbrtoc16()`](string/multibyte/mbrtoc16 "cpp/string/multibyte/mbrtoc16") (since C++11) [`mbrtoc32()`](string/multibyte/mbrtoc32 "cpp/string/multibyte/mbrtoc32") (since C++11) [`mbrtoc8()`](string/multibyte/mbrtoc8 "cpp/string/multibyte/mbrtoc8") (since C++20) [`mbrtowc()`](string/multibyte/mbrtowc "cpp/string/multibyte/mbrtowc") [`mbsinit()`](string/multibyte/mbsinit "cpp/string/multibyte/mbsinit") [`mbsrtowcs()`](string/multibyte/mbsrtowcs "cpp/string/multibyte/mbsrtowcs") [`mbstate_t`](string/multibyte/mbstate_t "cpp/string/multibyte/mbstate t") [`mbstowcs()`](string/multibyte/mbstowcs "cpp/string/multibyte/mbstowcs") [`mbtowc()`](string/multibyte/mbtowc "cpp/string/multibyte/mbtowc") [`mdspan<>`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/mdspan&action=edit&redlink=1 "cpp/container/mdspan (page does not exist)") (since C++23) [`mega`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`mem_fn<>()`](utility/functional/mem_fn "cpp/utility/functional/mem fn") (since C++11) [`memchr()`](string/byte/memchr "cpp/string/byte/memchr") [`memcmp()`](string/byte/memcmp "cpp/string/byte/memcmp") [`memcpy()`](string/byte/memcpy "cpp/string/byte/memcpy") [`memmove()`](string/byte/memmove "cpp/string/byte/memmove") [`memory_order`](atomic/memory_order "cpp/atomic/memory order") (since C++11) [`memory_order_acq_rel`](atomic/memory_order "cpp/atomic/memory order") (since C++11) [`memory_order_acquire`](atomic/memory_order "cpp/atomic/memory order") (since 
C++11) [`memory_order_consume`](atomic/memory_order "cpp/atomic/memory order") (since C++11) [`memory_order_relaxed`](atomic/memory_order "cpp/atomic/memory order") (since C++11) [`memory_order_release`](atomic/memory_order "cpp/atomic/memory order") (since C++11) [`memory_order_seq_cst`](atomic/memory_order "cpp/atomic/memory order") (since C++11) [`memset()`](string/byte/memset "cpp/string/byte/memset") [`merge<>()`](algorithm/merge "cpp/algorithm/merge") [`mergeable<>`](iterator/mergeable "cpp/iterator/mergeable") (since C++20) [`mersenne_twister_engine<>`](numeric/random/mersenne_twister_engine "cpp/numeric/random/mersenne twister engine") (since C++11) [`messages<>`](locale/messages "cpp/locale/messages") [`messages_base`](locale/messages_base "cpp/locale/messages base") [`messages_byname<>`](locale/messages_byname "cpp/locale/messages byname") [`micro`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`midpoint<>()`](numeric/midpoint "cpp/numeric/midpoint") (since C++20) [`milli`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`min<>()`](algorithm/min "cpp/algorithm/min") [`min_element<>()`](algorithm/min_element "cpp/algorithm/min element") [`minmax<>()`](algorithm/minmax "cpp/algorithm/minmax") (since C++11) [`minmax_element<>()`](algorithm/minmax_element "cpp/algorithm/minmax element") (since C++11) [`minstd_rand`](numeric/random "cpp/numeric/random") (since C++11) [`minstd_rand0`](numeric/random "cpp/numeric/random") (since C++11) [`minus<>`](utility/functional/minus "cpp/utility/functional/minus") [`mismatch<>()`](algorithm/mismatch "cpp/algorithm/mismatch") [`mktime()`](chrono/c/mktime "cpp/chrono/c/mktime") [`modf()`](numeric/math/modf "cpp/numeric/math/modf") [`modff()`](numeric/math/modf "cpp/numeric/math/modf") (since C++11) [`modfl()`](numeric/math/modf "cpp/numeric/math/modf") (since C++11) [`modulus<>`](utility/functional/modulus "cpp/utility/functional/modulus") [`money_base`](locale/money_base "cpp/locale/money base") [`money_get<>`](locale/money_get "cpp/locale/money get") [`moneypunct<>`](locale/moneypunct "cpp/locale/moneypunct") [`moneypunct_byname<>`](locale/moneypunct_byname "cpp/locale/moneypunct byname") [`money_put<>`](locale/money_put "cpp/locale/money put") [`monostate`](utility/variant/monostate "cpp/utility/variant/monostate") (since C++17) [`movable<>`](concepts/movable "cpp/concepts/movable") (since C++20) [`move<>()`](algorithm/move "cpp/algorithm/move") (algorithm) (since C++11) [`move<>()`](utility/move "cpp/utility/move") (utility) (since C++11) [`move_backward<>()`](algorithm/move_backward "cpp/algorithm/move backward") (since C++11) [`move_constructible<>`](concepts/move_constructible "cpp/concepts/move constructible") (since C++20) [`move_if_noexcept<>()`](utility/move_if_noexcept "cpp/utility/move if noexcept") (since C++11) [`move_iterator<>`](iterator/move_iterator "cpp/iterator/move iterator") (since C++11) [`move_only_function<>`](utility/functional/move_only_function "cpp/utility/functional/move only function") (since C++23) [`move_sentinel<>`](iterator/move_sentinel "cpp/iterator/move sentinel") (since C++20) [`mt19937`](numeric/random "cpp/numeric/random") (since C++11) [`mt19937_64`](numeric/random "cpp/numeric/random") (since C++11) [`multimap<>`](container/multimap "cpp/container/multimap") [`multiplies<>`](utility/functional/multiplies "cpp/utility/functional/multiplies") [`multiset<>`](container/multiset "cpp/container/multiset") [`mutex`](thread/mutex "cpp/thread/mutex") (since C++11) ### N [`nan()`](numeric/math/nan 
"cpp/numeric/math/nan") (since C++11) [`nanf()`](numeric/math/nan "cpp/numeric/math/nan") (since C++11) [`nanl()`](numeric/math/nan "cpp/numeric/math/nan") (since C++11) [`nano`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`nearbyint()`](numeric/math/nearbyint "cpp/numeric/math/nearbyint") (since C++11) [`nearbyintf()`](numeric/math/nearbyint "cpp/numeric/math/nearbyint") (since C++11) [`nearbyintl()`](numeric/math/nearbyint "cpp/numeric/math/nearbyint") (since C++11) [`negate<>`](utility/functional/negate "cpp/utility/functional/negate") [`negation<>`](types/negation "cpp/types/negation") (since C++17) [`negation_v<>`](types/negation "cpp/types/negation") (since C++17) [`negative_binomial_distribution<>`](numeric/random/negative_binomial_distribution "cpp/numeric/random/negative binomial distribution") (since C++11) [`nested_exception`](error/nested_exception "cpp/error/nested exception") (since C++11) [`new_handler`](memory/new/new_handler "cpp/memory/new/new handler") [`next<>()`](iterator/next "cpp/iterator/next") (since C++11) [`next_permutation<>()`](algorithm/next_permutation "cpp/algorithm/next permutation") [`nextafter()`](numeric/math/nextafter "cpp/numeric/math/nextafter") (since C++11) [`nextafterf()`](numeric/math/nextafter "cpp/numeric/math/nextafter") (since C++11) [`nextafterl()`](numeric/math/nextafter "cpp/numeric/math/nextafter") (since C++11) [`nexttoward()`](numeric/math/nextafter "cpp/numeric/math/nextafter") (since C++11) [`nexttowardf()`](numeric/math/nextafter "cpp/numeric/math/nextafter") (since C++11) [`nexttowardl()`](numeric/math/nextafter "cpp/numeric/math/nextafter") (since C++11) [`noemit_on_flush<>()`](io/manip/emit_on_flush "cpp/io/manip/emit on flush") (since C++20) [`noboolalpha()`](io/manip/boolalpha "cpp/io/manip/boolalpha") [`none_of<>()`](algorithm/all_any_none_of "cpp/algorithm/all any none of") (since C++11) [`noop_coroutine()`](coroutine/noop_coroutine "cpp/coroutine/noop coroutine") (since C++20) [`noop_coroutine_handle`](coroutine/coroutine_handle "cpp/coroutine/coroutine handle") (since C++20) [`noop_coroutine_promise`](coroutine/noop_coroutine_promise "cpp/coroutine/noop coroutine promise") (since C++20) [`norm<>()`](numeric/complex/norm "cpp/numeric/complex/norm") [`normal_distribution<>`](numeric/random/normal_distribution "cpp/numeric/random/normal distribution") (since C++11) [`noshowbase()`](io/manip/showbase "cpp/io/manip/showbase") [`noshowpoint()`](io/manip/showpoint "cpp/io/manip/showpoint") [`noshowpos()`](io/manip/showpos "cpp/io/manip/showpos") [`noskipws()`](io/manip/skipws "cpp/io/manip/skipws") [`nostopstate`](thread/stop_source/nostopstate "cpp/thread/stop source/nostopstate") (since C++20) [`nostopstate_t`](thread/stop_source/nostopstate_t "cpp/thread/stop source/nostopstate t") (since C++20) [`not_equal_to<>`](utility/functional/not_equal_to "cpp/utility/functional/not equal to") [`not_fn<>()`](utility/functional/not_fn "cpp/utility/functional/not fn") (since C++17) [`nothrow`](memory/new/nothrow "cpp/memory/new/nothrow") [`nothrow_t`](memory/new/nothrow_t "cpp/memory/new/nothrow t") [`notify_all_at_thread_exit()`](thread/notify_all_at_thread_exit "cpp/thread/notify all at thread exit") (since C++11) [`nounitbuf()`](io/manip/unitbuf "cpp/io/manip/unitbuf") [`nouppercase()`](io/manip/uppercase "cpp/io/manip/uppercase") [`nth_element<>()`](algorithm/nth_element "cpp/algorithm/nth element") [`nullopt`](utility/optional/nullopt "cpp/utility/optional/nullopt") (since C++17) [`nullopt_t`](utility/optional/nullopt_t 
"cpp/utility/optional/nullopt t") (since C++17) [`nullptr_t`](types/nullptr_t "cpp/types/nullptr t") (since C++11) [`numeric_limits<>`](types/numeric_limits "cpp/types/numeric limits") [`num_get<>`](locale/num_get "cpp/locale/num get") ▶ [`numbers`](symbol_index/numbers "cpp/symbol index/numbers") (since C++20) [`numpunct<>`](locale/numpunct "cpp/locale/numpunct") [`numpunct_byname<>`](locale/numpunct_byname "cpp/locale/numpunct byname") [`num_put<>`](locale/num_put "cpp/locale/num put") ### O [`oct()`](io/manip/hex "cpp/io/manip/hex") [`once_flag`](thread/once_flag "cpp/thread/once flag") (since C++11) [`ofstream`](io/basic_ofstream "cpp/io/basic ofstream") [`optional<>`](utility/optional "cpp/utility/optional") (since C++17) [`ospanstream`](io/basic_ospanstream "cpp/io/basic ospanstream") (since C++23) [`ostream`](io/basic_ostream "cpp/io/basic ostream") [`ostreambuf_iterator<>`](iterator/ostreambuf_iterator "cpp/iterator/ostreambuf iterator") [`ostream_iterator<>`](iterator/ostream_iterator "cpp/iterator/ostream iterator") [`ostringstream`](io/basic_ostringstream "cpp/io/basic ostringstream") [`ostrstream`](io/ostrstream "cpp/io/ostrstream") (deprecated in C++98) [`osyncstream`](io/basic_osyncstream "cpp/io/basic osyncstream") (since C++20) [`out_of_range`](error/out_of_range "cpp/error/out of range") [`out_ptr<>()`](memory/out_ptr_t/out_ptr "cpp/memory/out ptr t/out ptr") (since C++23) [`out_ptr_t<>`](memory/out_ptr_t "cpp/memory/out ptr t") (since C++23) [`output_iterator<>`](iterator/output_iterator "cpp/iterator/output iterator") (since C++20) [`output_iterator_tag`](iterator/iterator_tags "cpp/iterator/iterator tags") [`overflow_error`](error/overflow_error "cpp/error/overflow error") [`owner_less<>`](memory/owner_less "cpp/memory/owner less") (since C++11) ### P [`packaged_task<>`](thread/packaged_task "cpp/thread/packaged task") (since C++11) [`pair<>`](utility/pair "cpp/utility/pair") [`partial_order`](utility/compare/partial_order "cpp/utility/compare/partial order") (since C++20) [`partial_ordering`](utility/compare/partial_ordering "cpp/utility/compare/partial ordering") (since C++20) [`partial_sort<>()`](algorithm/partial_sort "cpp/algorithm/partial sort") [`partial_sort_copy<>()`](algorithm/partial_sort_copy "cpp/algorithm/partial sort copy") [`partial_sum<>()`](algorithm/partial_sum "cpp/algorithm/partial sum") [`partition<>()`](algorithm/partition "cpp/algorithm/partition") [`partition_copy<>()`](algorithm/partition_copy "cpp/algorithm/partition copy") (since C++11) [`partition_point<>()`](algorithm/partition_point "cpp/algorithm/partition point") (since C++11) [`permutable<>`](iterator/permutable "cpp/iterator/permutable") (since C++20) [`perror()`](io/c/perror "cpp/io/c/perror") [`peta`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`pico`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`piecewise_constant_distribution<>`](numeric/random/piecewise_constant_distribution "cpp/numeric/random/piecewise constant distribution") (since C++11) [`piecewise_construct`](utility/piecewise_construct "cpp/utility/piecewise construct") (since C++11) [`piecewise_construct_t`](utility/piecewise_construct_t "cpp/utility/piecewise construct t") (since C++11) [`piecewise_linear_distribution<>`](numeric/random/piecewise_linear_distribution "cpp/numeric/random/piecewise linear distribution") (since C++11) ▶ [`placeholders`](symbol_index/placeholders "cpp/symbol index/placeholders") (since C++11) [`plus<>`](utility/functional/plus "cpp/utility/functional/plus") ▶ 
[`pmr`](symbol_index/pmr "cpp/symbol index/pmr") (since C++17) [`pointer_traits<>`](memory/pointer_traits "cpp/memory/pointer traits") (since C++11) [`poisson_distribution<>`](numeric/random/poisson_distribution "cpp/numeric/random/poisson distribution") (since C++11) [`polar<>()`](numeric/complex/polar "cpp/numeric/complex/polar") [`pop_heap<>()`](algorithm/pop_heap "cpp/algorithm/pop heap") [`popcount<>()`](numeric/popcount "cpp/numeric/popcount") (since C++20) [`pow()`](numeric/math/pow "cpp/numeric/math/pow") [`pow<>()`](numeric/complex/pow "cpp/numeric/complex/pow") (std::complex) [`pow<>()`](numeric/valarray/pow "cpp/numeric/valarray/pow") (std::valarray) [`powf()`](numeric/math/pow "cpp/numeric/math/pow") (since C++11) [`powl()`](numeric/math/pow "cpp/numeric/math/pow") (since C++11) [`predicate<>`](concepts/predicate "cpp/concepts/predicate") (since C++20) [`prev<>()`](iterator/prev "cpp/iterator/prev") (since C++11) [`prev_permutation<>()`](algorithm/prev_permutation "cpp/algorithm/prev permutation") [`print<>()`](io/print "cpp/io/print") (since C++23) [`printf()`](io/c/fprintf "cpp/io/c/fprintf") [`println<>()`](io/println "cpp/io/println") (since C++23) [`priority_queue<>`](container/priority_queue "cpp/container/priority queue") [`proj<>()`](numeric/complex/proj "cpp/numeric/complex/proj") (since C++11) [`projected<>`](iterator/projected "cpp/iterator/projected") (since C++20) [`promise<>`](thread/promise "cpp/thread/promise") (since C++11) [`ptrdiff_t`](types/ptrdiff_t "cpp/types/ptrdiff t") [`push_heap<>()`](algorithm/push_heap "cpp/algorithm/push heap") [`put_money<>()`](io/manip/put_money "cpp/io/manip/put money") (since C++11) [`put_time<>()`](io/manip/put_time "cpp/io/manip/put time") (since C++11) [`putc()`](io/c/fputc "cpp/io/c/fputc") [`putchar()`](io/c/putchar "cpp/io/c/putchar") [`puts()`](io/c/puts "cpp/io/c/puts") [`putwc()`](io/c/fputwc "cpp/io/c/fputwc") [`putwchar()`](io/c/putwchar "cpp/io/c/putwchar") ### Q [`qsort()`](algorithm/qsort "cpp/algorithm/qsort") [`queue<>`](container/queue "cpp/container/queue") [`quick_exit()`](utility/program/quick_exit "cpp/utility/program/quick exit") (since C++11) [`quoted<>()`](io/manip/quoted "cpp/io/manip/quoted") (since C++14) ### R [`raise()`](utility/program/raise "cpp/utility/program/raise") [`rand()`](numeric/random/rand "cpp/numeric/random/rand") [`random_access_iterator<>`](iterator/random_access_iterator "cpp/iterator/random access iterator") (since C++20) [`random_access_iterator_tag`](iterator/iterator_tags "cpp/iterator/iterator tags") [`random_device`](numeric/random/random_device "cpp/numeric/random/random device") (since C++11) [`range_error`](error/range_error "cpp/error/range error") ▶ [`ranges`](symbol_index/ranges "cpp/symbol index/ranges") (since C++20) [`ranlux24`](numeric/random "cpp/numeric/random") (since C++11) [`ranlux24_base`](numeric/random "cpp/numeric/random") (since C++11) [`ranlux48`](numeric/random "cpp/numeric/random") (since C++11) [`ranlux48_base`](numeric/random "cpp/numeric/random") (since C++11) [`rank<>`](types/rank "cpp/types/rank") (since C++11) [`rank_v<>`](types/rank "cpp/types/rank") (since C++17) [`ratio`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`ratio_add<>`](numeric/ratio/ratio_add "cpp/numeric/ratio/ratio add") (since C++11) [`ratio_divide<>`](numeric/ratio/ratio_divide "cpp/numeric/ratio/ratio divide") (since C++11) [`ratio_equal<>`](numeric/ratio/ratio_equal "cpp/numeric/ratio/ratio equal") (since C++11) [`ratio_equal_v<>`](numeric/ratio/ratio_equal 
"cpp/numeric/ratio/ratio equal") (since C++17) [`ratio_greater<>`](numeric/ratio/ratio_greater "cpp/numeric/ratio/ratio greater") (since C++11) [`ratio_greater_equal<>`](numeric/ratio/ratio_greater_equal "cpp/numeric/ratio/ratio greater equal") (since C++11) [`ratio_greater_equal_v<>`](numeric/ratio/ratio_greater_equal "cpp/numeric/ratio/ratio greater equal") (since C++17) [`ratio_greater_v<>`](numeric/ratio/ratio_greater "cpp/numeric/ratio/ratio greater") (since C++17) [`ratio_less<>`](numeric/ratio/ratio_less "cpp/numeric/ratio/ratio less") (since C++11) [`ratio_less_equal<>`](numeric/ratio/ratio_less_equal "cpp/numeric/ratio/ratio less equal") (since C++11) [`ratio_less_equal_v<>`](numeric/ratio/ratio_less_equal "cpp/numeric/ratio/ratio less equal") (since C++17) [`ratio_less_v<>`](numeric/ratio/ratio_less "cpp/numeric/ratio/ratio less") (since C++17) [`ratio_multiply<>`](numeric/ratio/ratio_multiply "cpp/numeric/ratio/ratio multiply") (since C++11) [`ratio_not_equal<>`](numeric/ratio/ratio_not_equal "cpp/numeric/ratio/ratio not equal") (since C++11) [`ratio_not_equal_v<>`](numeric/ratio/ratio_not_equal "cpp/numeric/ratio/ratio not equal") (since C++17) [`ratio_subtract<>`](numeric/ratio/ratio_subtract "cpp/numeric/ratio/ratio subtract") (since C++11) [`rbegin<>()`](iterator/rbegin "cpp/iterator/rbegin") (since C++14) [`real<>()`](numeric/complex/real2 "cpp/numeric/complex/real2") [`realloc()`](memory/c/realloc "cpp/memory/c/realloc") [`recursive_mutex`](thread/recursive_mutex "cpp/thread/recursive mutex") (since C++11) [`recursive_timed_mutex`](thread/recursive_timed_mutex "cpp/thread/recursive timed mutex") (since C++11) [`reduce<>()`](algorithm/reduce "cpp/algorithm/reduce") (since C++17) [`ref<>()`](utility/functional/ref "cpp/utility/functional/ref") (since C++11) [`reference_wrapper<>`](utility/functional/reference_wrapper "cpp/utility/functional/reference wrapper") (since C++11) [`regex`](regex/basic_regex "cpp/regex/basic regex") (since C++11) ▶ [`regex_constants`](symbol_index/regex_constants "cpp/symbol index/regex constants") (since C++11) [`regex_error`](regex/regex_error "cpp/regex/regex error") (since C++11) [`regex_iterator<>`](regex/regex_iterator "cpp/regex/regex iterator") (since C++11) [`regex_match<>`](regex/regex_match "cpp/regex/regex match") (since C++11) [`regex_replace<>`](regex/regex_replace "cpp/regex/regex replace") (since C++11) [`regex_search<>`](regex/regex_search "cpp/regex/regex search") (since C++11) [`regex_token_iterator<>`](regex/regex_token_iterator "cpp/regex/regex token iterator") (since C++11) [`regex_traits<>`](regex/regex_traits "cpp/regex/regex traits") (since C++11) [`regular<>`](concepts/regular "cpp/concepts/regular") (since C++20) [`regular_invocable<>`](concepts/invocable "cpp/concepts/invocable") (since C++20) [`reinterpret_pointer_cast<>()`](memory/shared_ptr/pointer_cast "cpp/memory/shared ptr/pointer cast") (since C++11) ▶ [`rel_ops`](symbol_index/rel_ops "cpp/symbol index/rel ops") (deprecated in C++20) [`relation<>`](concepts/relation "cpp/concepts/relation") (since C++20) [`remainder()`](numeric/math/remainder "cpp/numeric/math/remainder") (since C++11) [`remainderf()`](numeric/math/remainder "cpp/numeric/math/remainder") (since C++11) [`remainderl()`](numeric/math/remainder "cpp/numeric/math/remainder") (since C++11) [`remove()`](io/c/remove "cpp/io/c/remove") [`remove<>()`](algorithm/remove "cpp/algorithm/remove") (algorithm) [`remove_if<>()`](algorithm/remove "cpp/algorithm/remove") 
[`remove_all_extents<>`](types/remove_all_extents "cpp/types/remove all extents") (since C++11) [`remove_all_extents_t<>`](types/remove_all_extents "cpp/types/remove all extents") (since C++14) [`remove_const<>`](types/remove_cv "cpp/types/remove cv") (since C++11) [`remove_const_t<>`](types/remove_cv "cpp/types/remove cv") (since C++14) [`remove_copy<>()`](algorithm/remove_copy "cpp/algorithm/remove copy") [`remove_copy_if<>()`](algorithm/remove_copy "cpp/algorithm/remove copy") [`remove_cv<>`](types/remove_cv "cpp/types/remove cv") (since C++11) [`remove_cv_t<>`](types/remove_cv "cpp/types/remove cv") (since C++14) [`remove_cvref<>`](types/remove_cvref "cpp/types/remove cvref") (since C++20) [`remove_cvref_t<>`](types/remove_cvref "cpp/types/remove cvref") (since C++20) [`remove_extent<>`](types/remove_extent "cpp/types/remove extent") (since C++11) [`remove_extent_t<>`](types/remove_extent "cpp/types/remove extent") (since C++14) [`remove_pointer<>`](types/remove_pointer "cpp/types/remove pointer") (since C++11) [`remove_pointer_t<>`](types/remove_pointer "cpp/types/remove pointer") (since C++14) [`remove_reference<>`](types/remove_reference "cpp/types/remove reference") (since C++11) [`remove_reference_t<>`](types/remove_reference "cpp/types/remove reference") (since C++14) [`remove_volatile<>`](types/remove_cv "cpp/types/remove cv") (since C++11) [`remove_volatile_t<>`](types/remove_cv "cpp/types/remove cv") (since C++14) [`remquo()`](numeric/math/remquo "cpp/numeric/math/remquo") (since C++11) [`remquof()`](numeric/math/remquo "cpp/numeric/math/remquo") (since C++11) [`remquol()`](numeric/math/remquo "cpp/numeric/math/remquo") (since C++11) [`rend<>()`](iterator/rend "cpp/iterator/rend") (since C++14) [`rename()`](io/c/rename "cpp/io/c/rename") [`replace<>()`](algorithm/replace "cpp/algorithm/replace") [`replace_copy<>()`](algorithm/replace_copy "cpp/algorithm/replace copy") [`replace_copy_if<>()`](algorithm/replace_copy_if "cpp/algorithm/replace copy if") [`replace_if<>()`](algorithm/replace "cpp/algorithm/replace") [`resetiosflags()`](io/manip/resetiosflags "cpp/io/manip/resetiosflags") [`rethrow_exception()`](error/rethrow_exception "cpp/error/rethrow exception") (since C++11) [`rethrow_if_nested<>()`](error/rethrow_if_nested "cpp/error/rethrow if nested") (since C++11) [`reverse<>()`](algorithm/reverse "cpp/algorithm/reverse") [`reverse_copy<>()`](algorithm/reverse_copy "cpp/algorithm/reverse copy") [`reverse_iterator<>`](iterator/reverse_iterator "cpp/iterator/reverse iterator") [`rewind()`](io/c/rewind "cpp/io/c/rewind") [`riemann_zeta()`](numeric/special_functions/riemann_zeta "cpp/numeric/special functions/riemann zeta") (since C++17) [`riemann_zetaf()`](numeric/special_functions/riemann_zeta "cpp/numeric/special functions/riemann zeta") (since C++17) [`riemann_zetal()`](numeric/special_functions/riemann_zeta "cpp/numeric/special functions/riemann zeta") (since C++17) [`right()`](io/manip/left "cpp/io/manip/left") [`rint()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`rintf()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`rintl()`](numeric/math/rint "cpp/numeric/math/rint") (since C++11) [`rotate<>()`](algorithm/rotate "cpp/algorithm/rotate") [`rotate_copy<>()`](algorithm/rotate_copy "cpp/algorithm/rotate copy") [`rotl<>()`](numeric/rotl "cpp/numeric/rotl") (since C++20) [`rotr<>()`](numeric/rotr "cpp/numeric/rotr") (since C++20) [`round()`](numeric/math/round "cpp/numeric/math/round") (since C++11) [`roundf()`](numeric/math/round 
"cpp/numeric/math/round") (since C++11) [`roundl()`](numeric/math/round "cpp/numeric/math/round") (since C++11) [`round_indeterminate`](types/numeric_limits/float_round_style "cpp/types/numeric limits/float round style") [`round_to_nearest`](types/numeric_limits/float_round_style "cpp/types/numeric limits/float round style") [`round_toward_infinity`](types/numeric_limits/float_round_style "cpp/types/numeric limits/float round style") [`round_toward_neg_infinity`](types/numeric_limits/float_round_style "cpp/types/numeric limits/float round style") [`round_toward_zero`](types/numeric_limits/float_round_style "cpp/types/numeric limits/float round style") [`runtime_error`](error/runtime_error "cpp/error/runtime error") ### S [`same_as<>`](concepts/same_as "cpp/concepts/same as") (since C++20) [`sample<>()`](algorithm/sample "cpp/algorithm/sample") (since C++17) [`scalbln()`](numeric/math/scalbn "cpp/numeric/math/scalbn") (since C++11) [`scalblnf()`](numeric/math/scalbn "cpp/numeric/math/scalbn") (since C++11) [`scalblnl()`](numeric/math/scalbn "cpp/numeric/math/scalbn") (since C++11) [`scalbn()`](numeric/math/scalbn "cpp/numeric/math/scalbn") (since C++11) [`scalbnf()`](numeric/math/scalbn "cpp/numeric/math/scalbn") (since C++11) [`scalbnl()`](numeric/math/scalbn "cpp/numeric/math/scalbn") (since C++11) [`scanf()`](io/c/fscanf "cpp/io/c/fscanf") [`scientific()`](io/manip/fixed "cpp/io/manip/fixed") [`scoped_allocator_adaptor<>`](memory/scoped_allocator_adaptor "cpp/memory/scoped allocator adaptor") (since C++11) [`scoped_lock`](thread/scoped_lock "cpp/thread/scoped lock") (since C++17) [`search<>()`](algorithm/search "cpp/algorithm/search") [`search_n<>()`](algorithm/search_n "cpp/algorithm/search n") [`seed_seq`](numeric/random/seed_seq "cpp/numeric/random/seed seq") (since C++11) [`semiregular<>`](concepts/semiregular "cpp/concepts/semiregular") (since C++20) [`sentinel_for<>`](iterator/sentinel_for "cpp/iterator/sentinel for") (since C++20) [`set<>`](container/set "cpp/container/set") [`set_difference<>()`](algorithm/set_difference "cpp/algorithm/set difference") [`set_intersection<>()`](algorithm/set_intersection "cpp/algorithm/set intersection") [`set_new_handler()`](memory/new/set_new_handler "cpp/memory/new/set new handler") [`set_symmetric_difference<>()`](algorithm/set_symmetric_difference "cpp/algorithm/set symmetric difference") [`set_terminate()`](error/set_terminate "cpp/error/set terminate") [`set_union<>()`](algorithm/set_union "cpp/algorithm/set union") [`setbase()`](io/manip/setbase "cpp/io/manip/setbase") [`setbuf()`](io/c/setbuf "cpp/io/c/setbuf") [`setfill<>()`](io/manip/setfill "cpp/io/manip/setfill") [`setiosflags()`](io/manip/setiosflags "cpp/io/manip/setiosflags") [`setlocale()`](locale/setlocale "cpp/locale/setlocale") [`setprecision()`](io/manip/setprecision "cpp/io/manip/setprecision") [`setvbuf()`](io/c/setvbuf "cpp/io/c/setvbuf") [`setw()`](io/manip/setw "cpp/io/manip/setw") [`shared_future<>`](thread/shared_future "cpp/thread/shared future") (since C++11) [`shared_lock<>`](thread/shared_lock "cpp/thread/shared lock") (since C++14) [`shared_mutex`](thread/shared_mutex "cpp/thread/shared mutex") (since C++17) [`shared_ptr<>`](memory/shared_ptr "cpp/memory/shared ptr") (since C++11) [`shared_timed_mutex`](thread/shared_timed_mutex "cpp/thread/shared timed mutex") (since C++14) [`shift_left<>()`](algorithm/shift "cpp/algorithm/shift") (since C++20) [`shift_right<>()`](algorithm/shift "cpp/algorithm/shift") (since C++20) [`showbase()`](io/manip/showbase 
"cpp/io/manip/showbase") [`showpoint()`](io/manip/showpoint "cpp/io/manip/showpoint") [`showpos()`](io/manip/showpos "cpp/io/manip/showpos") [`shuffle<>()`](algorithm/random_shuffle "cpp/algorithm/random shuffle") (since C++11) [`shuffle_order_engine<>`](numeric/random/shuffle_order_engine "cpp/numeric/random/shuffle order engine") (since C++11) [`sig_atomic_t`](utility/program/sig_atomic_t "cpp/utility/program/sig atomic t") [`signal()`](utility/program/signal "cpp/utility/program/signal") [`signbit()`](numeric/math/signbit "cpp/numeric/math/signbit") (since C++11) [`signed_integral<>`](concepts/signed_integral "cpp/concepts/signed integral") (since C++20) [`sin()`](numeric/math/sin "cpp/numeric/math/sin") [`sin<>()`](numeric/complex/sin "cpp/numeric/complex/sin") (std::complex) [`sin<>()`](numeric/valarray/sin "cpp/numeric/valarray/sin") (std::valarray) [`sinf()`](numeric/math/sin "cpp/numeric/math/sin") (since C++11) [`sinh()`](numeric/math/sinh "cpp/numeric/math/sinh") [`sinh<>()`](numeric/complex/sinh "cpp/numeric/complex/sinh") (std::complex) [`sinh<>()`](numeric/valarray/sinh "cpp/numeric/valarray/sinh") (std::valarray) [`sinhf()`](numeric/math/sinh "cpp/numeric/math/sinh") (since C++11) [`sinhl()`](numeric/math/sinh "cpp/numeric/math/sinh") (since C++11) [`sinl()`](numeric/math/sin "cpp/numeric/math/sin") (since C++11) [`size<>()`](iterator/size "cpp/iterator/size") (since C++17) [`size_t`](types/size_t "cpp/types/size t") [`sized_sentinel_for<>`](iterator/sized_sentinel_for "cpp/iterator/sized sentinel for") (since C++20) [`skipws()`](io/manip/skipws "cpp/io/manip/skipws") [`slice`](numeric/valarray/slice "cpp/numeric/valarray/slice") [`slice_array<>`](numeric/valarray/slice_array "cpp/numeric/valarray/slice array") [`smatch`](regex/match_results "cpp/regex/match results") (since C++11) [`snprintf()`](io/c/fprintf "cpp/io/c/fprintf") (since C++11) [`sort<>()`](algorithm/sort "cpp/algorithm/sort") [`sort_heap<>()`](algorithm/sort_heap "cpp/algorithm/sort heap") [`sortable<>`](iterator/sortable "cpp/iterator/sortable") (since C++20) [`sorted_equivalent`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/sorted_equivalent&action=edit&redlink=1 "cpp/container/sorted equivalent (page does not exist)") (since C++23) [`sorted_equivalent_t`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/sorted_equivalent&action=edit&redlink=1 "cpp/container/sorted equivalent (page does not exist)") (since C++23) [`sorted_unique`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/sorted_unique&action=edit&redlink=1 "cpp/container/sorted unique (page does not exist)") (since C++23) [`sorted_unique_t`](https://en.cppreference.com/mwiki/index.php?title=cpp/container/sorted_unique&action=edit&redlink=1 "cpp/container/sorted unique (page does not exist)") (since C++23) [`source_location`](utility/source_location "cpp/utility/source location") (since C++20) [`span<>`](container/span "cpp/container/span") (since C++20) [`spanbuf`](io/basic_spanbuf "cpp/io/basic spanbuf") (since C++23) [`spanstream`](io/basic_spanstream "cpp/io/basic spanstream") (since C++23) [`sph_bessel()`](numeric/special_functions/sph_bessel "cpp/numeric/special functions/sph bessel") (since C++17) [`sph_besself()`](numeric/special_functions/sph_bessel "cpp/numeric/special functions/sph bessel") (since C++17) [`sph_bessell()`](numeric/special_functions/sph_bessel "cpp/numeric/special functions/sph bessel") (since C++17) [`sph_legendre()`](numeric/special_functions/sph_legendre "cpp/numeric/special 
functions/sph legendre") (since C++17) [`sph_legendref()`](numeric/special_functions/sph_legendre "cpp/numeric/special functions/sph legendre") (since C++17) [`sph_legendrel()`](numeric/special_functions/sph_legendre "cpp/numeric/special functions/sph legendre") (since C++17) [`sph_neumann()`](numeric/special_functions/sph_neumann "cpp/numeric/special functions/sph neumann") (since C++17) [`sph_neumannf()`](numeric/special_functions/sph_neumann "cpp/numeric/special functions/sph neumann") (since C++17) [`sph_neumannl()`](numeric/special_functions/sph_neumann "cpp/numeric/special functions/sph neumann") (since C++17) [`sprintf()`](io/c/fprintf "cpp/io/c/fprintf") [`sqrt()`](numeric/math/sqrt "cpp/numeric/math/sqrt") [`sqrt<>()`](numeric/complex/sqrt "cpp/numeric/complex/sqrt") (std::complex) [`sqrt<>()`](numeric/valarray/sqrt "cpp/numeric/valarray/sqrt") (std::valarray) [`sqrtf()`](numeric/math/sqrt "cpp/numeric/math/sqrt") (since C++11) [`sqrtl()`](numeric/math/sqrt "cpp/numeric/math/sqrt") (since C++11) [`srand()`](numeric/random/srand "cpp/numeric/random/srand") [`sregex_iterator`](regex/regex_iterator "cpp/regex/regex iterator") (since C++11) [`sregex_token_iterator`](regex/regex_token_iterator "cpp/regex/regex token iterator") (since C++11) [`sscanf()`](io/c/fscanf "cpp/io/c/fscanf") [`ssize<>()`](iterator/size "cpp/iterator/size") (since C++20) [`ssub_match`](regex/sub_match "cpp/regex/sub match") (since C++11) [`stable_partition<>()`](algorithm/stable_partition "cpp/algorithm/stable partition") [`stable_sort<>()`](algorithm/stable_sort "cpp/algorithm/stable sort") [`stack<>`](container/stack "cpp/container/stack") [`stacktrace`](utility/basic_stacktrace "cpp/utility/basic stacktrace") (since C++23) [`stacktrace_entry`](utility/stacktrace_entry "cpp/utility/stacktrace entry") (since C++23) [`static_pointer_cast<>()`](memory/shared_ptr/pointer_cast "cpp/memory/shared ptr/pointer cast") (since C++11) [`stod()`](string/basic_string/stof "cpp/string/basic string/stof") (since C++11) [`stof()`](string/basic_string/stof "cpp/string/basic string/stof") (since C++11) [`stoi()`](string/basic_string/stol "cpp/string/basic string/stol") (since C++11) [`stol()`](string/basic_string/stol "cpp/string/basic string/stol") (since C++11) [`stold()`](string/basic_string/stof "cpp/string/basic string/stof") (since C++11) [`stoll()`](string/basic_string/stol "cpp/string/basic string/stol") (since C++11) [`stoul()`](string/basic_string/stoul "cpp/string/basic string/stoul") (since C++11) [`stoull()`](string/basic_string/stoul "cpp/string/basic string/stoul") (since C++11) [`stop_callback<>`](thread/stop_callback "cpp/thread/stop callback") (since C++20) [`stop_source`](thread/stop_source "cpp/thread/stop source") (since C++20) [`stop_token`](thread/stop_token "cpp/thread/stop token") (since C++20) [`strcat()`](string/byte/strcat "cpp/string/byte/strcat") [`strchr()`](string/byte/strchr "cpp/string/byte/strchr") [`strcmp()`](string/byte/strcmp "cpp/string/byte/strcmp") [`strcoll()`](string/byte/strcoll "cpp/string/byte/strcoll") [`strcpy()`](string/byte/strcpy "cpp/string/byte/strcpy") [`strcspn()`](string/byte/strcspn "cpp/string/byte/strcspn") [`streambuf`](io/basic_streambuf "cpp/io/basic streambuf") [`streamoff`](io/streamoff "cpp/io/streamoff") [`streampos`](io/fpos "cpp/io/fpos") [`streamsize`](io/streamsize "cpp/io/streamsize") [`strerror()`](string/byte/strerror "cpp/string/byte/strerror") [`strftime()`](chrono/c/strftime "cpp/chrono/c/strftime") [`strict_weak_order<>`](concepts/strict_weak_order 
"cpp/concepts/strict weak order") (since C++20) [`string`](string/basic_string "cpp/string/basic string") ▶ [`string_literals`](symbol_index/string_literals "cpp/symbol index/string literals") (since C++14) [`string_view`](string/basic_string_view "cpp/string/basic string view") (since C++17) ▶ [`string_view_literals`](symbol_index/string_view_literals "cpp/symbol index/string view literals") (since C++17) [`stringbuf`](io/basic_stringbuf "cpp/io/basic stringbuf") [`stringstream`](io/basic_stringstream "cpp/io/basic stringstream") [`strlen()`](string/byte/strlen "cpp/string/byte/strlen") [`strncat()`](string/byte/strncat "cpp/string/byte/strncat") [`strncmp()`](string/byte/strncmp "cpp/string/byte/strncmp") [`strncpy()`](string/byte/strncpy "cpp/string/byte/strncpy") [`strong_order`](utility/compare/strong_order "cpp/utility/compare/strong order") (since C++20) [`strong_ordering`](utility/compare/strong_ordering "cpp/utility/compare/strong ordering") (since C++20) [`strpbrk()`](string/byte/strpbrk "cpp/string/byte/strpbrk") [`strrchr()`](string/byte/strrchr "cpp/string/byte/strrchr") [`strspn()`](string/byte/strspn "cpp/string/byte/strspn") [`strstr()`](string/byte/strstr "cpp/string/byte/strstr") [`strstream`](io/strstream "cpp/io/strstream") (deprecated in C++98) [`strstreambuf`](io/strstreambuf "cpp/io/strstreambuf") (deprecated in C++98) [`strtod()`](string/byte/strtof "cpp/string/byte/strtof") [`strtof()`](string/byte/strtof "cpp/string/byte/strtof") (since C++11) [`strtoimax()`](string/byte/strtoimax "cpp/string/byte/strtoimax") (since C++11) [`strtok()`](string/byte/strtok "cpp/string/byte/strtok") [`strtol()`](string/byte/strtol "cpp/string/byte/strtol") [`strtold()`](string/byte/strtof "cpp/string/byte/strtof") [`strtoll()`](string/byte/strtol "cpp/string/byte/strtol") (since C++11) [`strtoul()`](string/byte/strtoul "cpp/string/byte/strtoul") [`strtoull()`](string/byte/strtoul "cpp/string/byte/strtoul") (since C++11) [`strtoumax()`](string/byte/strtoimax "cpp/string/byte/strtoimax") (since C++11) [`strxfrm()`](string/byte/strxfrm "cpp/string/byte/strxfrm") [`syncbuf`](io/basic_syncbuf "cpp/io/basic syncbuf") (since C++20) [`student_t_distribution<>`](numeric/random/student_t_distribution "cpp/numeric/random/student t distribution") (since C++11) [`sub_match<>`](regex/sub_match "cpp/regex/sub match") (since C++11) [`subtract_with_carry_engine<>`](numeric/random/subtract_with_carry_engine "cpp/numeric/random/subtract with carry engine") (since C++11) [`suspend_always`](coroutine/suspend_always "cpp/coroutine/suspend always") (since C++20) [`suspend_never`](coroutine/suspend_never "cpp/coroutine/suspend never") (since C++20) [`swap<>()`](algorithm/swap "cpp/algorithm/swap") [`swap_ranges<>()`](algorithm/swap_ranges "cpp/algorithm/swap ranges") [`swappable<>`](concepts/swappable "cpp/concepts/swappable") (since C++20) [`swappable_with<>`](concepts/swappable "cpp/concepts/swappable") (since C++20) [`swprintf()`](io/c/fwprintf "cpp/io/c/fwprintf") [`swscanf()`](io/c/fwscanf "cpp/io/c/fwscanf") [`system()`](utility/program/system "cpp/utility/program/system") [`system_category`](error/system_category "cpp/error/system category") (since C++11) [`system_error`](error/system_error "cpp/error/system error") (since C++11) ### T [`tan()`](numeric/math/tan "cpp/numeric/math/tan") [`tan<>()`](numeric/complex/tan "cpp/numeric/complex/tan") (std::complex) [`tan<>()`](numeric/valarray/tan "cpp/numeric/valarray/tan") (std::valarray) [`tan()`](numeric/math/tan "cpp/numeric/math/tan") 
[`tanf()`](numeric/math/tan "cpp/numeric/math/tan") (since C++11) [`tanh()`](numeric/math/tanh "cpp/numeric/math/tanh") [`tanh<>()`](numeric/complex/tanh "cpp/numeric/complex/tanh") (std::complex) [`tanh<>()`](numeric/valarray/tanh "cpp/numeric/valarray/tanh") (std::valarray) [`tanhf()`](numeric/math/tanh "cpp/numeric/math/tanh") (since C++11) [`tanhl()`](numeric/math/tanh "cpp/numeric/math/tanh") (since C++11) [`tanl()`](numeric/math/tan "cpp/numeric/math/tan") (since C++11) [`tera`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`terminate()`](error/terminate "cpp/error/terminate") [`terminate_handler`](error/terminate_handler "cpp/error/terminate handler") [`tgamma()`](numeric/math/tgamma "cpp/numeric/math/tgamma") (since C++11) [`tgammaf()`](numeric/math/tgamma "cpp/numeric/math/tgamma") (since C++11) [`tgammal()`](numeric/math/tgamma "cpp/numeric/math/tgamma") (since C++11) ▶ [`this_thread`](symbol_index/this_thread "cpp/symbol index/this thread") [`thread`](thread/thread "cpp/thread/thread") (since C++11) [`three_way_comparable<>`](utility/compare/three_way_comparable "cpp/utility/compare/three way comparable") (since C++20) [`three_way_comparable_with<>`](utility/compare/three_way_comparable "cpp/utility/compare/three way comparable") (since C++20) [`throw_with_nested<>()`](error/throw_with_nested "cpp/error/throw with nested") (since C++11) [`tie<>()`](utility/tuple/tie "cpp/utility/tuple/tie") (since C++11) [`time()`](chrono/c/time "cpp/chrono/c/time") [`time_base`](locale/time_base "cpp/locale/time base") [`time_get<>`](locale/time_get "cpp/locale/time get") [`time_get_byname<>`](locale/time_get_byname "cpp/locale/time get byname") [`time_put<>`](locale/time_put "cpp/locale/time put") [`time_put_byname<>`](locale/time_put_byname "cpp/locale/time put byname") [`time_t`](chrono/c "cpp/chrono/c") [`timed_mutex`](thread/timed_mutex "cpp/thread/timed mutex") (since C++11) [`timespec`](chrono/c/timespec "cpp/chrono/c/timespec") (since C++17) [`timespec_get()`](chrono/c/timespec_get "cpp/chrono/c/timespec get") (since C++17) [`tm`](chrono/c/tm "cpp/chrono/c/tm") [`tmpfile()`](io/c/tmpfile "cpp/io/c/tmpfile") [`tmpnam()`](io/c/tmpnam "cpp/io/c/tmpnam") [`to_address<>()`](memory/to_address "cpp/memory/to address") (since C++20) [`to_array<>()`](container/array/to_array "cpp/container/array/to array") (since C++20) [`to_chars()`](utility/to_chars "cpp/utility/to chars") (since C++17) [`to_chars_result`](utility/to_chars "cpp/utility/to chars") (since C++17) [`to_integer<>()`](types/byte "cpp/types/byte") (since C++17) [`to_string()`](string/basic_string/to_string "cpp/string/basic string/to string") (since C++11) [`to_wstring()`](string/basic_string/to_wstring "cpp/string/basic string/to wstring") (since C++11) [`tolower()`](string/byte/tolower "cpp/string/byte/tolower") [`tolower<>()`](locale/tolower "cpp/locale/tolower") (locale) [`totally_ordered<>`](concepts/totally_ordered "cpp/concepts/totally ordered") (since C++20) [`totally_ordered_with<>`](concepts/totally_ordered "cpp/concepts/totally ordered") (since C++20) [`toupper()`](string/byte/toupper "cpp/string/byte/toupper") [`toupper<>()`](locale/toupper "cpp/locale/toupper") (locale) [`to_underlying<>()`](utility/to_underlying "cpp/utility/to underlying") (since C++23) [`towctrans()`](string/wide/towctrans "cpp/string/wide/towctrans") [`towlower()`](string/wide/towlower "cpp/string/wide/towlower") [`towupper()`](string/wide/towupper "cpp/string/wide/towupper") [`transform<>()`](algorithm/transform "cpp/algorithm/transform") 
[`transform_exclusive_scan<>()`](algorithm/transform_exclusive_scan "cpp/algorithm/transform exclusive scan") (since C++17) [`transform_inclusive_scan<>()`](algorithm/transform_inclusive_scan "cpp/algorithm/transform inclusive scan") (since C++17) [`transform_reduce<>()`](algorithm/transform_reduce "cpp/algorithm/transform reduce") (since C++17) [`true_type`](types/integral_constant "cpp/types/integral constant") (since C++11) [`trunc()`](numeric/math/trunc "cpp/numeric/math/trunc") (since C++11) [`truncf()`](numeric/math/trunc "cpp/numeric/math/trunc") (since C++11) [`truncl()`](numeric/math/trunc "cpp/numeric/math/trunc") (since C++11) [`try_lock<>()`](thread/try_lock "cpp/thread/try lock") (since C++11) [`try_to_lock`](thread/lock_tag "cpp/thread/lock tag") (since C++11) [`try_to_lock_t`](thread/lock_tag_t "cpp/thread/lock tag t") (since C++11) [`tuple<>`](utility/tuple "cpp/utility/tuple") (since C++11) [`tuple_cat<>()`](utility/tuple/tuple_cat "cpp/utility/tuple/tuple cat") (since C++11) [`tuple_element<>`](utility/tuple_element "cpp/utility/tuple element") (since C++11) [`tuple_element_t<>`](utility/tuple_element "cpp/utility/tuple element") (since C++14) [`tuple_size<>`](utility/tuple_size "cpp/utility/tuple size") (since C++11) [`tuple_size_v<>`](utility/tuple_size "cpp/utility/tuple size") (since C++17) [`type_identity<>`](types/type_identity "cpp/types/type identity") (since C++20) [`type_identity_t<>`](types/type_identity "cpp/types/type identity") (since C++20) [`type_index`](types/type_index "cpp/types/type index") (since C++11) [`type_info`](types/type_info "cpp/types/type info") ### U [`u16streampos`](io/fpos "cpp/io/fpos") (since C++11) [`u16string`](string/basic_string "cpp/string/basic string") (since C++11) [`u16string_view`](string/basic_string_view "cpp/string/basic string view") (since C++17) [`u32streampos`](io/fpos "cpp/io/fpos") (since C++11) [`u32string`](string/basic_string "cpp/string/basic string") (since C++11) [`u32string_view`](string/basic_string_view "cpp/string/basic string view") (since C++17) [`u8streampos`](io/fpos "cpp/io/fpos") (since C++20) [`u8string`](string/basic_string "cpp/string/basic string") (since C++20) [`u8string_view`](string/basic_string_view "cpp/string/basic string view") (since C++20) [`uint_fast16_t`](types/integer "cpp/types/integer") (since C++11) [`uint_fast32_t`](types/integer "cpp/types/integer") (since C++11) [`uint_fast64_t`](types/integer "cpp/types/integer") (since C++11) [`uint_fast8_t`](types/integer "cpp/types/integer") (since C++11) [`uint_least16_t`](types/integer "cpp/types/integer") (since C++11) [`uint_least32_t`](types/integer "cpp/types/integer") (since C++11) [`uint_least64_t`](types/integer "cpp/types/integer") (since C++11) [`uint_least8_t`](types/integer "cpp/types/integer") (since C++11) [`uint16_t`](types/integer "cpp/types/integer") (since C++11) [`uint32_t`](types/integer "cpp/types/integer") (since C++11) [`uint64_t`](types/integer "cpp/types/integer") (since C++11) [`uint8_t`](types/integer "cpp/types/integer") (since C++11) [`uintmax_t`](types/integer "cpp/types/integer") (since C++11) [`uintptr_t`](types/integer "cpp/types/integer") (since C++11) [`uncaught_exceptions`](error/uncaught_exception "cpp/error/uncaught exception") (since C++17) [`underflow_error`](error/underflow_error "cpp/error/underflow error") [`underlying_type<>`](types/underlying_type "cpp/types/underlying type") (since C++11) [`underlying_type_t<>`](types/underlying_type "cpp/types/underlying type") (since C++14) 
[`unexpect`](https://en.cppreference.com/mwiki/index.php?title=cpp/utility/expected/unexpect&action=edit&redlink=1 "cpp/utility/expected/unexpect (page does not exist)") (since C++23) [`unexpect_t`](https://en.cppreference.com/mwiki/index.php?title=cpp/utility/expected/unexpect_t&action=edit&redlink=1 "cpp/utility/expected/unexpect t (page does not exist)") (since C++23) [`unexpected<>`](utility/expected/unexpected "cpp/utility/expected/unexpected") (since C++23) [`ungetc()`](io/c/ungetc "cpp/io/c/ungetc") [`ungetwc()`](io/c/ungetwc "cpp/io/c/ungetwc") [`uniform_int_distribution<>`](numeric/random/uniform_int_distribution "cpp/numeric/random/uniform int distribution") (since C++11) [`uniform_random_bit_generator<>`](numeric/random/uniform_random_bit_generator "cpp/numeric/random/uniform random bit generator") (since C++20) [`uniform_real_distribution<>`](numeric/random/uniform_real_distribution "cpp/numeric/random/uniform real distribution") (since C++11) [`uninitialized_construct_using_allocator<>()`](memory/uninitialized_construct_using_allocator "cpp/memory/uninitialized construct using allocator") (since C++20) [`uninitialized_copy<>()`](memory/uninitialized_copy "cpp/memory/uninitialized copy") [`uninitialized_copy_n<>()`](memory/uninitialized_copy_n "cpp/memory/uninitialized copy n") (since C++11) [`uninitialized_default_construct<>()`](memory/uninitialized_default_construct "cpp/memory/uninitialized default construct") (since C++17) [`uninitialized_default_construct_n<>()`](memory/uninitialized_default_construct_n "cpp/memory/uninitialized default construct n") (since C++17) [`uninitialized_fill<>()`](memory/uninitialized_fill "cpp/memory/uninitialized fill") [`uninitialized_fill_n<>()`](memory/uninitialized_fill_n "cpp/memory/uninitialized fill n") [`uninitialized_move<>()`](memory/uninitialized_move "cpp/memory/uninitialized move") (since C++17) [`uninitialized_move_n<>()`](memory/uninitialized_move_n "cpp/memory/uninitialized move n") (since C++17) [`uninitialized_value_construct<>()`](memory/uninitialized_value_construct "cpp/memory/uninitialized value construct") (since C++17) [`uninitialized_value_construct_n<>()`](memory/uninitialized_value_construct_n "cpp/memory/uninitialized value construct n") (since C++17) [`unique<>()`](algorithm/unique "cpp/algorithm/unique") [`unique_copy<>()`](algorithm/unique_copy "cpp/algorithm/unique copy") [`unique_lock<>`](thread/unique_lock "cpp/thread/unique lock") (since C++11) [`unique_ptr<>`](memory/unique_ptr "cpp/memory/unique ptr") (since C++11) [`unitbuf()`](io/manip/unitbuf "cpp/io/manip/unitbuf") [`unordered_map<>`](container/unordered_map "cpp/container/unordered map") (since C++11) [`unordered_multimap<>`](container/unordered_multimap "cpp/container/unordered multimap") (since C++11) [`unordered_multiset<>`](container/unordered_multiset "cpp/container/unordered multiset") (since C++11) [`unordered_set<>`](container/unordered_set "cpp/container/unordered set") (since C++11) [`unreachable()`](utility/unreachable "cpp/utility/unreachable") (since C++23) [`unreachable_sentinel`](iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t") (since C++20) [`unreachable_sentinel_t`](iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t") (since C++20) [`unsigned_integral<>`](concepts/unsigned_integral "cpp/concepts/unsigned integral") (since C++20) [`unwrap_ref_decay<>`](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference") (since C++20) 
[`unwrap_ref_decay_t<>`](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference") (since C++20) [`unwrap_reference<>`](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference") (since C++20) [`unwrap_reference_t<>`](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference") (since C++20) [`upper_bound<>()`](algorithm/upper_bound "cpp/algorithm/upper bound") [`uppercase()`](io/manip/uppercase "cpp/io/manip/uppercase") [`use_facet<>()`](locale/use_facet "cpp/locale/use facet") [`uses_allocator<>`](memory/uses_allocator "cpp/memory/uses allocator") (since C++11) [`uses_allocator_v<>`](memory/uses_allocator "cpp/memory/uses allocator") (since C++17) [`uses_allocator_construction_args<>()`](memory/uses_allocator_construction_args "cpp/memory/uses allocator construction args") (since C++20) ### V [`va_list`](utility/variadic/va_list "cpp/utility/variadic/va list") [`valarray<>`](numeric/valarray "cpp/numeric/valarray") [`variant<>`](utility/variant "cpp/utility/variant") (since C++17) [`variant_alternative<>`](utility/variant/variant_alternative "cpp/utility/variant/variant alternative") (since C++17) [`variant_alternative_t<>`](utility/variant/variant_alternative "cpp/utility/variant/variant alternative") (since C++17) [`variant_npos`](utility/variant/variant_npos "cpp/utility/variant/variant npos") (since C++17) [`variant_size<>`](utility/variant/variant_size "cpp/utility/variant/variant size") (since C++17) [`variant_size_v<>`](utility/variant/variant_size "cpp/utility/variant/variant size") (since C++17) [`vector<>`](container/vector "cpp/container/vector") [`vformat()`](utility/format/vformat "cpp/utility/format/vformat") (since C++20) [`vformat_to<>()`](utility/format/vformat_to "cpp/utility/format/vformat to") (since C++20) [`vfprintf()`](io/c/vfprintf "cpp/io/c/vfprintf") [`vfscanf()`](io/c/vfscanf "cpp/io/c/vfscanf") (since C++11) [`vfwprintf()`](io/c/vfwprintf "cpp/io/c/vfwprintf") [`vfwscanf()`](io/c/vfwscanf "cpp/io/c/vfwscanf") (since C++11) ▶ [`views`](symbol_index/views "cpp/symbol index/views") (since C++20) [`visit<>()`](utility/variant/visit "cpp/utility/variant/visit") (since C++17) [`visit_format_arg<>()`](utility/format/visit_format_arg "cpp/utility/format/visit format arg") (since C++20) [`void_t`](types/void_t "cpp/types/void t") (since C++17) [`vprint_nonunicode()`](io/vprint_nonunicode "cpp/io/vprint nonunicode") (since C++23) [`vprint_unicode()`](io/vprint_unicode "cpp/io/vprint unicode") (since C++23) [`vprintf()`](io/c/vfprintf "cpp/io/c/vfprintf") [`vscanf()`](io/c/vfscanf "cpp/io/c/vfscanf") (since C++11) [`vsnprintf()`](io/c/vfprintf "cpp/io/c/vfprintf") (since C++11) [`vsprintf()`](io/c/vfprintf "cpp/io/c/vfprintf") [`vsscanf()`](io/c/vfscanf "cpp/io/c/vfscanf") (since C++11) [`vswprintf()`](io/c/vfwprintf "cpp/io/c/vfwprintf") [`vswscanf()`](io/c/vfwscanf "cpp/io/c/vfwscanf") (since C++11) [`vwprintf()`](io/c/vfwprintf "cpp/io/c/vfwprintf") [`vwscanf()`](io/c/vfwscanf "cpp/io/c/vfwscanf") (since C++11) ### W [`wbuffer_convert<>`](locale/wbuffer_convert "cpp/locale/wbuffer convert") (since C++11)(deprecated in C++17) [`wcerr`](io/cerr "cpp/io/cerr") [`wcin`](io/cin "cpp/io/cin") [`wclog`](io/clog "cpp/io/clog") [`wcmatch`](regex/match_results "cpp/regex/match results") (since C++11) [`wcout`](io/cout "cpp/io/cout") [`wcregex_iterator`](regex/regex_iterator "cpp/regex/regex iterator") (since C++11) [`wcregex_token_iterator`](regex/regex_token_iterator "cpp/regex/regex token iterator") (since C++11) 
[`wcrtomb()`](string/multibyte/wcrtomb "cpp/string/multibyte/wcrtomb") [`wcscat()`](string/wide/wcscat "cpp/string/wide/wcscat") [`wcschr()`](string/wide/wcschr "cpp/string/wide/wcschr") [`wcscmp()`](string/wide/wcscmp "cpp/string/wide/wcscmp") [`wcscoll()`](string/wide/wcscoll "cpp/string/wide/wcscoll") [`wcscpy()`](string/wide/wcscpy "cpp/string/wide/wcscpy") [`wcscspn()`](string/wide/wcscspn "cpp/string/wide/wcscspn") [`wcsftime()`](chrono/c/wcsftime "cpp/chrono/c/wcsftime") [`wcslen()`](string/wide/wcslen "cpp/string/wide/wcslen") [`wcsncat()`](string/wide/wcsncat "cpp/string/wide/wcsncat") [`wcsncmp()`](string/wide/wcsncmp "cpp/string/wide/wcsncmp") [`wcsncpy()`](string/wide/wcsncpy "cpp/string/wide/wcsncpy") [`wcspbrk()`](string/wide/wcspbrk "cpp/string/wide/wcspbrk") [`wcsrchr()`](string/wide/wcsrchr "cpp/string/wide/wcsrchr") [`wcsrtombs()`](string/multibyte/wcsrtombs "cpp/string/multibyte/wcsrtombs") [`wcsspn()`](string/wide/wcsspn "cpp/string/wide/wcsspn") [`wcsstr()`](string/wide/wcsstr "cpp/string/wide/wcsstr") [`wcstod()`](string/wide/wcstof "cpp/string/wide/wcstof") [`wcstof()`](string/wide/wcstof "cpp/string/wide/wcstof") (since C++11) [`wcstoimax()`](string/wide/wcstoimax "cpp/string/wide/wcstoimax") (since C++11) [`wcstok()`](string/wide/wcstok "cpp/string/wide/wcstok") [`wcstol()`](string/wide/wcstol "cpp/string/wide/wcstol") [`wcstold()`](string/wide/wcstof "cpp/string/wide/wcstof") (since C++11) [`wcstoll()`](string/wide/wcstol "cpp/string/wide/wcstol") (since C++11) [`wcstombs()`](string/multibyte/wcstombs "cpp/string/multibyte/wcstombs") [`wcstoul()`](string/wide/wcstoul "cpp/string/wide/wcstoul") [`wcstoull()`](string/wide/wcstoul "cpp/string/wide/wcstoul") (since C++11) [`wcstoumax()`](string/wide/wcstoimax "cpp/string/wide/wcstoimax") (since C++11) [`wcsub_match`](regex/sub_match "cpp/regex/sub match") (since C++11) [`wcsxfrm()`](string/wide/wcsxfrm "cpp/string/wide/wcsxfrm") [`wctob()`](string/multibyte/wctob "cpp/string/multibyte/wctob") [`wctomb()`](string/multibyte/wctomb "cpp/string/multibyte/wctomb") [`wctrans()`](string/wide/wctrans "cpp/string/wide/wctrans") [`wctrans_t`](string/wide "cpp/string/wide") [`wctype()`](string/wide/wctype "cpp/string/wide/wctype") [`wctype_t`](string/wide "cpp/string/wide") [`weak_order`](utility/compare/weak_order "cpp/utility/compare/weak order") (since C++20) [`weak_ordering`](utility/compare/weak_ordering "cpp/utility/compare/weak ordering") (since C++20) [`weak_ptr<>`](memory/weak_ptr "cpp/memory/weak ptr") (since C++11) [`weakly_incrementable<>`](iterator/weakly_incrementable "cpp/iterator/weakly incrementable") (since C++20) [`weibull_distribution<>`](numeric/random/weibull_distribution "cpp/numeric/random/weibull distribution") (since C++11) [`wfilebuf`](io/basic_filebuf "cpp/io/basic filebuf") [`wformat_args`](utility/format/basic_format_args "cpp/utility/format/basic format args") (since C++20) [`wformat_context`](utility/format/basic_format_context "cpp/utility/format/basic format context") (since C++20) [`wformat_parse_context`](utility/format/basic_format_parse_context "cpp/utility/format/basic format parse context") (since C++20) [`wformat_string<>`](utility/format/basic_format_string "cpp/utility/format/basic format string") (since C++20) [`wfstream`](io/basic_fstream "cpp/io/basic fstream") [`wifstream`](io/basic_ifstream "cpp/io/basic ifstream") [`wint_t`](string/wide "cpp/string/wide") [`wios`](io/basic_ios "cpp/io/basic ios") [`wiostream`](io/basic_iostream "cpp/io/basic iostream") 
[`wispanstream`](io/basic_ispanstream "cpp/io/basic ispanstream") (since C++23) [`wistream`](io/basic_istream "cpp/io/basic istream") [`wistringstream`](io/basic_istringstream "cpp/io/basic istringstream") [`wmemchr()`](string/wide/wmemchr "cpp/string/wide/wmemchr") [`wmemcmp()`](string/wide/wmemcmp "cpp/string/wide/wmemcmp") [`wmemcpy()`](string/wide/wmemcpy "cpp/string/wide/wmemcpy") [`wmemmove()`](string/wide/wmemmove "cpp/string/wide/wmemmove") [`wmemset()`](string/wide/wmemset "cpp/string/wide/wmemset") [`wofstream`](io/basic_ofstream "cpp/io/basic ofstream") [`wospanstream`](io/basic_ospanstream "cpp/io/basic ospanstream") (since C++23) [`wostream`](io/basic_ostream "cpp/io/basic ostream") [`wosyncstream`](io/basic_osyncstream "cpp/io/basic osyncstream") (since C++20) [`ws<>()`](io/manip/ws "cpp/io/manip/ws") [`wspanbuf`](io/basic_spanbuf "cpp/io/basic spanbuf") (since C++23) [`wspanstream`](io/basic_spanstream "cpp/io/basic spanstream") (since C++23) [`wstreambuf`](io/basic_streambuf "cpp/io/basic streambuf") [`wstreampos`](io/fpos "cpp/io/fpos") [`wostringstream`](io/basic_ostringstream "cpp/io/basic ostringstream") [`wprintf()`](io/c/fwprintf "cpp/io/c/fwprintf") [`wregex`](regex/basic_regex "cpp/regex/basic regex") (since C++11) [`wscanf()`](io/c/fwscanf "cpp/io/c/fwscanf") [`wsmatch`](regex/match_results "cpp/regex/match results") (since C++11) [`wsregex_iterator`](regex/regex_iterator "cpp/regex/regex iterator") (since C++11) [`wsregex_token_iterator`](regex/regex_token_iterator "cpp/regex/regex token iterator") (since C++11) [`wssub_match`](regex/sub_match "cpp/regex/sub match") (since C++11) [`wstring`](string/basic_string "cpp/string/basic string") [`wstring_convert<>`](locale/wstring_convert "cpp/locale/wstring convert") (since C++11)(deprecated in C++17) [`wstring_view<>`](string/basic_string_view "cpp/string/basic string view") (since C++17) [`wstringbuf`](io/basic_stringbuf "cpp/io/basic stringbuf") [`wstringstream`](io/basic_stringstream "cpp/io/basic stringstream") [`wsyncbuf`](io/basic_syncbuf "cpp/io/basic syncbuf") (since C++20) ### X ### Y [`yocto`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`yotta`](numeric/ratio "cpp/numeric/ratio") (since C++11) ### Z [`zepto`](numeric/ratio "cpp/numeric/ratio") (since C++11) [`zetta`](numeric/ratio "cpp/numeric/ratio") (since C++11) ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/symbol_index "c/symbol index") for Symbol Index |
cpp Error handling Error handling ============== ### Exception handling The header `<exception>` provides several classes and functions related to exception handling in C++ programs. | Defined in header `[<exception>](header/exception "cpp/header/exception")` | | --- | | [exception](error/exception "cpp/error/exception") | base class for exceptions thrown by the standard library components (class) | | Capture and storage of exception objects | | [uncaught\_exception / uncaught\_exceptions](error/uncaught_exception "cpp/error/uncaught exception") (removed in C++20)(C++17) | checks if exception handling is currently in progress (function) | | [exception\_ptr](error/exception_ptr "cpp/error/exception ptr") (C++11) | shared pointer type for handling exception objects (typedef) | | [make\_exception\_ptr](error/make_exception_ptr "cpp/error/make exception ptr") (C++11) | creates an `[std::exception\_ptr](error/exception_ptr "cpp/error/exception ptr")` from an exception object (function template) | | [current\_exception](error/current_exception "cpp/error/current exception") (C++11) | captures the current exception in a `[std::exception\_ptr](error/exception_ptr "cpp/error/exception ptr")` (function) | | [rethrow\_exception](error/rethrow_exception "cpp/error/rethrow exception") (C++11) | throws the exception from an `[std::exception\_ptr](error/exception_ptr "cpp/error/exception ptr")` (function) | | [nested\_exception](error/nested_exception "cpp/error/nested exception") (C++11) | a mixin type to capture and store current exceptions (class) | | [throw\_with\_nested](error/throw_with_nested "cpp/error/throw with nested") (C++11) | throws its argument with `[std::nested\_exception](error/nested_exception "cpp/error/nested exception")` mixed in (function template) | | [rethrow\_if\_nested](error/rethrow_if_nested "cpp/error/rethrow if nested") (C++11) | throws the exception from a `[std::nested\_exception](error/nested_exception "cpp/error/nested exception")` (function template) | | Handling of failures in exception handling | | Defined in header `[<exception>](header/exception "cpp/header/exception")` | | [terminate](error/terminate "cpp/error/terminate") | function called when exception handling fails (function) | | [terminate\_handler](error/terminate_handler "cpp/error/terminate handler") | the type of the function called by `[std::terminate](error/terminate "cpp/error/terminate")` (typedef) | | [get\_terminate](error/get_terminate "cpp/error/get terminate") (C++11) | obtains the current terminate\_handler (function) | | [set\_terminate](error/set_terminate "cpp/error/set terminate") | changes the function to be called by `[std::terminate](error/terminate "cpp/error/terminate")` (function) | | [bad\_exception](error/bad_exception "cpp/error/bad exception") | exception thrown when `[std::current\_exception](error/current_exception "cpp/error/current exception")` fails to copy the exception object (class) | | Handling of exception specification violations (removed in C++17) | | [unexpected](error/unexpected "cpp/error/unexpected") (removed in C++17) | function called when dynamic exception specification is violated (function) | | [unexpected\_handler](error/unexpected_handler "cpp/error/unexpected handler") (removed in C++17) | the type of the function called by `[std::unexpected](error/unexpected "cpp/error/unexpected")` (typedef) | | [get\_unexpected](error/get_unexpected "cpp/error/get unexpected") (C++11)(removed in C++17) | obtains the current unexpected\_handler (function) | |
[set\_unexpected](error/set_unexpected "cpp/error/set unexpected") (removed in C++17) | changes the function to be called by `[std::unexpected](error/unexpected "cpp/error/unexpected")` (function) | ### Exception categories Several convenience classes are predefined in the header `<stdexcept>` to report particular error conditions. These classes can be divided into two categories: *logic* errors and *runtime* errors. Logic errors are a consequence of faulty logic within the program and may be preventable. Runtime errors are due to events beyond the scope of the program and cannot be easily predicted. | Defined in header `[<stdexcept>](header/stdexcept "cpp/header/stdexcept")` | | --- | | [logic\_error](error/logic_error "cpp/error/logic error") | exception class to indicate violations of logical preconditions or class invariants (class) | | [invalid\_argument](error/invalid_argument "cpp/error/invalid argument") | exception class to report invalid arguments (class) | | [domain\_error](error/domain_error "cpp/error/domain error") | exception class to report domain errors (class) | | [length\_error](error/length_error "cpp/error/length error") | exception class to report attempts to exceed maximum allowed size (class) | | [out\_of\_range](error/out_of_range "cpp/error/out of range") | exception class to report arguments outside of expected range (class) | | [runtime\_error](error/runtime_error "cpp/error/runtime error") | exception class to indicate conditions only detectable at run time (class) | | [range\_error](error/range_error "cpp/error/range error") | exception class to report range errors in internal computations (class) | | [overflow\_error](error/overflow_error "cpp/error/overflow error") | exception class to report arithmetic overflows (class) | | [underflow\_error](error/underflow_error "cpp/error/underflow error") | exception class to report arithmetic underflows (class) | | [tx\_exception](error/tx_exception "cpp/error/tx exception") (TM TS) | exception class to cancel atomic transactions (class template) | ### Error numbers | Defined in header `[<cerrno>](header/cerrno "cpp/header/cerrno")` | | --- | | [errno](error/errno "cpp/error/errno") | macro which expands to POSIX-compatible thread-local error number variable (macro variable) | | [E2BIG, EACCES, ..., EXDEV](error/errno_macros "cpp/error/errno macros") | macros for standard POSIX-compatible error conditions (macro constant) | ### System error The header `<system_error>` defines types and functions used to report error conditions originating from the operating system, streams I/O, `[std::future](thread/future "cpp/thread/future")`, or other low-level APIs.
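For orientation, here is a minimal sketch (not taken from the reference pages; `open_device` is a hypothetical name used only for illustration) of how these facilities are commonly combined: a low-level failure is wrapped in a `std::error_code`, reported by throwing `std::system_error`, and then tested portably against `std::errc`:

```
#include <cerrno>
#include <iostream>
#include <system_error>

// Hypothetical helper for illustration only: wraps a failed low-level call
// into the <system_error> machinery instead of leaking a raw errno value.
void open_device()
{
    errno = ENOENT; // pretend an OS call just failed
    throw std::system_error(std::error_code(errno, std::generic_category()),
                            "open_device");
}

int main()
{
    try {
        open_device();
    } catch (const std::system_error& e) {
        std::cout << "caught: " << e.what() << '\n';
        // std::error_code compares portably against the std::errc enumeration
        if (e.code() == std::errc::no_such_file_or_directory)
            std::cout << "no such file or directory\n";
    }
}
```

The table below lists the individual components provided by the header.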
| Defined in header `[<system\_error>](header/system_error "cpp/header/system error")` | | --- | | [error\_category](error/error_category "cpp/error/error category") (C++11) | base class for error categories (class) | | [generic\_category](error/generic_category "cpp/error/generic category") (C++11) | identifies the generic error category (function) | | [system\_category](error/system_category "cpp/error/system category") (C++11) | identifies the operating system error category (function) | | [error\_condition](error/error_condition "cpp/error/error condition") (C++11) | holds a portable error code (class) | | [errc](error/errc "cpp/error/errc") (C++11) | the `[std::error\_condition](error/error_condition "cpp/error/error condition")` enumeration listing all standard [`<cerrno>`](header/cerrno "cpp/header/cerrno") macro constants (class) | | [error\_code](error/error_code "cpp/error/error code") (C++11) | holds a platform-dependent error code (class) | | [system\_error](error/system_error "cpp/error/system error") (C++11) | exception class used to report conditions that have an error\_code (class) | ### Assertions Assertions help to implement checking of preconditions in programs. | Defined in header `[<cassert>](header/cassert "cpp/header/cassert")` | | --- | | [assert](error/assert "cpp/error/assert") | aborts the program if the user-specified condition is not `true`. May be disabled for release builds (function macro) | ### [Stacktrace](utility/basic_stacktrace "cpp/utility/basic stacktrace") | Defined in header `[<stacktrace>](header/stacktrace "cpp/header/stacktrace")` | | --- | | [stacktrace\_entry](utility/stacktrace_entry "cpp/utility/stacktrace entry") (C++23) | representation of an evaluation in a stacktrace (class) | | [basic\_stacktrace](utility/basic_stacktrace "cpp/utility/basic stacktrace") (C++23) | approximate representation of an invocation sequence consisting of stacktrace entries (class template) | ### See also | | | | --- | --- | | [`static_assert` declaration](language/static_assert "cpp/language/static assert") (C++11) | performs compile-time assertion checking | | [C documentation](https://en.cppreference.com/w/c/error "c/error") for Error handling | cpp Preprocessor Preprocessor ============ The preprocessor is executed at [translation phase 4](language/translation_phases#Phase_4 "cpp/language/translation phases"), before compilation. The result of preprocessing is a single file which is then passed to the actual compiler. ### Directives The preprocessing directives control the behavior of the preprocessor. Each directive occupies one line and has the following format: * the `#` character * a sequence of: + a standard-defined directive name (listed [below](#Capabilities)) followed by the corresponding arguments, or + one or more [preprocessing tokens](language/translation_phases#Phase_3 "cpp/language/translation phases") where the beginning token is not a standard-defined directive name, in which case the directive is conditionally-supported with implementation-defined semantics (e.g. a common non-standard extension is the directive `#warning` which emits a user-defined message during compilation), or + nothing, in which case the directive has no effect. * a line break | | | | --- | --- | | The [module and import directives](language/modules "cpp/language/modules") are also preprocessing directives. | (since C++20) | Preprocessing directives must not come from macro expansion.
```
#define EMPTY
EMPTY # include <file.h> // not a preprocessing directive
```
### Capabilities The preprocessor provides the following source file translation capabilities: * **[conditionally](preprocessor/conditional "cpp/preprocessor/conditional")** compile parts of the source file (controlled by the directives `#if`, `#ifdef`, `#ifndef`, `#else`, `#elif`, `#elifdef`, `#elifndef` (since C++23), and `#endif`). * **[replace](preprocessor/replace "cpp/preprocessor/replace")** text macros while possibly concatenating or quoting identifiers (controlled by the directives `#define` and `#undef`, and the operators `#` and `##`) * **[include](preprocessor/include "cpp/preprocessor/include")** other files (controlled by the directive `#include` and checked with `__has_include` (since C++17)) * cause an **[error](preprocessor/error "cpp/preprocessor/error")** or **[warning](preprocessor/error "cpp/preprocessor/error")** (since C++23) (controlled by the directives `#error` and `#warning` (since C++23) respectively) The following aspects of the preprocessor can be controlled: * **[implementation-defined](preprocessor/impl "cpp/preprocessor/impl")** behavior (controlled by the directive `#pragma` and the operator `_Pragma` (since C++11)). In addition, some compilers support (to varying degrees) the operator `__pragma` as a *non-standard* extension. * **[file name and line information](preprocessor/line "cpp/preprocessor/line")** available to the preprocessor (controlled by the directive `#line`) ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [CWG 2001](https://cplusplus.github.io/CWG/issues/2001.html) | C++98 | the behavior of using non-standard-defined directives was not clear | made conditionally-supported | ### See also | | | --- | | [C++ documentation](preprocessor/replace#Predefined_macros "cpp/preprocessor/replace") for Predefined Macro Symbols | | [C++ documentation](symbol_index/macro "cpp/symbol index/macro") for Macro Symbol Index | | [C documentation](https://en.cppreference.com/w/c/preprocessor "c/preprocessor") for preprocessor | cpp C++ keywords C++ keywords ============ This is a list of reserved keywords in C++. Since they are used by the language, these keywords are not available for re-definition or overloading.
| A – C | D – P | R – Z | | --- | --- | --- | | [`alignas`](keyword/alignas "cpp/keyword/alignas") (C++11) [`alignof`](keyword/alignof "cpp/keyword/alignof") (C++11) [`and`](keyword/and "cpp/keyword/and") [`and_eq`](keyword/and_eq "cpp/keyword/and eq") [`asm`](keyword/asm "cpp/keyword/asm") [`atomic_cancel`](language/transactional_memory "cpp/language/transactional memory") (TM TS) [`atomic_commit`](language/transactional_memory "cpp/language/transactional memory") (TM TS) [`atomic_noexcept`](language/transactional_memory "cpp/language/transactional memory") (TM TS) [`auto`](keyword/auto "cpp/keyword/auto") (1) [`bitand`](keyword/bitand "cpp/keyword/bitand") [`bitor`](keyword/bitor "cpp/keyword/bitor") [`bool`](keyword/bool "cpp/keyword/bool") [`break`](keyword/break "cpp/keyword/break") [`case`](keyword/case "cpp/keyword/case") [`catch`](keyword/catch "cpp/keyword/catch") [`char`](keyword/char "cpp/keyword/char") [`char8_t`](keyword/char8_t "cpp/keyword/char8 t") (C++20) [`char16_t`](keyword/char16_t "cpp/keyword/char16 t") (C++11) [`char32_t`](keyword/char32_t "cpp/keyword/char32 t") (C++11) [`class`](keyword/class "cpp/keyword/class") (1) [`compl`](keyword/compl "cpp/keyword/compl") [`concept`](keyword/concept "cpp/keyword/concept") (C++20) [`const`](keyword/const "cpp/keyword/const") [`consteval`](keyword/consteval "cpp/keyword/consteval") (C++20) [`constexpr`](keyword/constexpr "cpp/keyword/constexpr") (C++11) [`constinit`](keyword/constinit "cpp/keyword/constinit") (C++20) [`const_cast`](keyword/const_cast "cpp/keyword/const cast") [`continue`](keyword/continue "cpp/keyword/continue") [`co_await`](keyword/co_await "cpp/keyword/co await") (C++20) [`co_return`](keyword/co_return "cpp/keyword/co return") (C++20) [`co_yield`](keyword/co_yield "cpp/keyword/co yield") (C++20). | [`decltype`](keyword/decltype "cpp/keyword/decltype") (C++11) [`default`](keyword/default "cpp/keyword/default") (1) [`delete`](keyword/delete "cpp/keyword/delete") (1) [`do`](keyword/do "cpp/keyword/do") [`double`](keyword/double "cpp/keyword/double") [`dynamic_cast`](keyword/dynamic_cast "cpp/keyword/dynamic cast") [`else`](keyword/else "cpp/keyword/else") [`enum`](keyword/enum "cpp/keyword/enum") [`explicit`](keyword/explicit "cpp/keyword/explicit") [`export`](keyword/export "cpp/keyword/export") (1) (3) [`extern`](keyword/extern "cpp/keyword/extern") (1) [`false`](keyword/false "cpp/keyword/false") [`float`](keyword/float "cpp/keyword/float") [`for`](keyword/for "cpp/keyword/for") [`friend`](keyword/friend "cpp/keyword/friend") [`goto`](keyword/goto "cpp/keyword/goto") [`if`](keyword/if "cpp/keyword/if") [`inline`](keyword/inline "cpp/keyword/inline") (1) [`int`](keyword/int "cpp/keyword/int") [`long`](keyword/long "cpp/keyword/long") [`mutable`](keyword/mutable "cpp/keyword/mutable") (1) [`namespace`](keyword/namespace "cpp/keyword/namespace") [`new`](keyword/new "cpp/keyword/new") [`noexcept`](keyword/noexcept "cpp/keyword/noexcept") (C++11) [`not`](keyword/not "cpp/keyword/not") [`not_eq`](keyword/not_eq "cpp/keyword/not eq") [`nullptr`](keyword/nullptr "cpp/keyword/nullptr") (C++11) [`operator`](keyword/operator "cpp/keyword/operator") [`or`](keyword/or "cpp/keyword/or") [`or_eq`](keyword/or_eq "cpp/keyword/or eq") [`private`](keyword/private "cpp/keyword/private") [`protected`](keyword/protected "cpp/keyword/protected") [`public`](keyword/public "cpp/keyword/public"). 
| [`reflexpr`](keyword/reflexpr "cpp/keyword/reflexpr") (reflection TS) [`register`](keyword/register "cpp/keyword/register") (2) [`reinterpret_cast`](keyword/reinterpret_cast "cpp/keyword/reinterpret cast") [`requires`](keyword/requires "cpp/keyword/requires") (C++20) [`return`](keyword/return "cpp/keyword/return") [`short`](keyword/short "cpp/keyword/short") [`signed`](keyword/signed "cpp/keyword/signed") [`sizeof`](keyword/sizeof "cpp/keyword/sizeof") (1) [`static`](keyword/static "cpp/keyword/static") [`static_assert`](keyword/static_assert "cpp/keyword/static assert") (C++11) [`static_cast`](keyword/static_cast "cpp/keyword/static cast") [`struct`](keyword/struct "cpp/keyword/struct") (1) [`switch`](keyword/switch "cpp/keyword/switch") [`synchronized`](language/transactional_memory "cpp/language/transactional memory") (TM TS) [`template`](keyword/template "cpp/keyword/template") [`this`](keyword/this "cpp/keyword/this") (4) [`thread_local`](keyword/thread_local "cpp/keyword/thread local") (C++11) [`throw`](keyword/throw "cpp/keyword/throw") [`true`](keyword/true "cpp/keyword/true") [`try`](keyword/try "cpp/keyword/try") [`typedef`](keyword/typedef "cpp/keyword/typedef") [`typeid`](keyword/typeid "cpp/keyword/typeid") [`typename`](keyword/typename "cpp/keyword/typename") [`union`](keyword/union "cpp/keyword/union") [`unsigned`](keyword/unsigned "cpp/keyword/unsigned") [`using`](keyword/using "cpp/keyword/using") (1) [`virtual`](keyword/virtual "cpp/keyword/virtual") [`void`](keyword/void "cpp/keyword/void") [`volatile`](keyword/volatile "cpp/keyword/volatile") [`wchar_t`](keyword/wchar_t "cpp/keyword/wchar t") [`while`](keyword/while "cpp/keyword/while") [`xor`](keyword/xor "cpp/keyword/xor") [`xor_eq`](keyword/xor_eq "cpp/keyword/xor eq"). | * (1) — meaning changed or new meaning added in C++11. * (2) — meaning changed in C++17. * (3) — meaning changed in C++20. * (4) — new meaning added in C++23. Note that [`and`](keyword/and "cpp/keyword/and"), [`bitor`](keyword/bitor "cpp/keyword/bitor"), [`or`](keyword/or "cpp/keyword/or"), [`xor`](keyword/xor "cpp/keyword/xor"), [`compl`](keyword/compl "cpp/keyword/compl"), [`bitand`](keyword/bitand "cpp/keyword/bitand"), [`and_eq`](keyword/and_eq "cpp/keyword/and eq"), [`or_eq`](keyword/or_eq "cpp/keyword/or eq"), [`xor_eq`](keyword/xor_eq "cpp/keyword/xor eq"), [`not`](keyword/not "cpp/keyword/not"), and [`not_eq`](keyword/not_eq "cpp/keyword/not eq") (along with the digraphs `<%`, `%>`, `<:`, `:>`, `%:`, and `%:%:`) provide an [alternative way to represent standard tokens](language/operator_alternative "cpp/language/operator alternative"). In addition to keywords, there are *identifiers with special meaning*, which may be used as names of objects or functions, but have special meaning in certain contexts. | | | --- | | [`final`](language/final "cpp/language/final") (C++11) [`override`](language/override "cpp/language/override") (C++11) [`transaction_safe`](language/transactional_memory "cpp/language/transactional memory") (TM TS) [`transaction_safe_dynamic`](language/transactional_memory "cpp/language/transactional memory") (TM TS) [`import`](keyword/import "cpp/keyword/import") (C++20) [`module`](keyword/module "cpp/keyword/module") (C++20). 
| Also, every [identifier](language/identifiers "cpp/language/identifiers") that contains a double underscore `__` in any position or begins with an underscore followed by an uppercase letter is always reserved, and every identifier that begins with an underscore is reserved for use as a name in the global namespace. See [identifiers](language/identifiers "cpp/language/identifiers") for more details. The namespace `std` is used to place names of the standard C++ library. See [Extending namespace std](language/extending_std "cpp/language/extending std") for the rules about adding names to it. | | | | --- | --- | | The name `posix` is reserved for a future top-level namespace. The behavior is undefined if a program declares or defines anything in that namespace. | (since C++11) | The following tokens are recognized by the [preprocessor](preprocessor "cpp/preprocessor") when in the context of a preprocessor directive: | | | | | | | --- | --- | --- | --- | --- | --- | | [`if`](preprocessor/conditional "cpp/preprocessor/conditional") [`elif`](preprocessor/conditional "cpp/preprocessor/conditional") [`else`](preprocessor/conditional "cpp/preprocessor/conditional") [`endif`](preprocessor/conditional "cpp/preprocessor/conditional"). | [`ifdef`](preprocessor/conditional "cpp/preprocessor/conditional") [`ifndef`](preprocessor/conditional "cpp/preprocessor/conditional") [`elifdef`](preprocessor/conditional "cpp/preprocessor/conditional") (C++23) [`elifndef`](preprocessor/conditional "cpp/preprocessor/conditional") (C++23). | [`define`](preprocessor/replace "cpp/preprocessor/replace") [`undef`](preprocessor/replace "cpp/preprocessor/replace") [`include`](preprocessor/include "cpp/preprocessor/include") [`line`](preprocessor/line "cpp/preprocessor/line"). | [`error`](preprocessor/error "cpp/preprocessor/error") [`warning`](preprocessor/error "cpp/preprocessor/error") (C++23) [`pragma`](preprocessor/impl "cpp/preprocessor/impl"). | [`defined`](preprocessor/conditional "cpp/preprocessor/conditional") [`__has_include`](feature_test "cpp/feature test") (C++17) [`__has_cpp_attribute`](feature_test "cpp/feature test") (C++20). | [`export`](keyword/export "cpp/keyword/export") (C++20) [`import`](keyword/import "cpp/keyword/import") (C++20) [`module`](keyword/module "cpp/keyword/module") (C++20). | The following tokens are recognized by the [preprocessor](preprocessor "cpp/preprocessor") *outside* the context of a preprocessor directive: | | | --- | | [`_Pragma`](preprocessor/impl "cpp/preprocessor/impl") (C++11). | ### See also | | | --- | | [C documentation](https://en.cppreference.com/w/c/keyword "c/keyword") for C keywords |
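As a small illustration of the notes above (a sketch, not normative wording), the alternative tokens and the identifiers with special meaning `final` and `override` can coexist with their use as ordinary names:

```
#include <iostream>

struct Base {
    virtual void run() const { std::cout << "Base\n"; }
    virtual ~Base() = default;
};

// 'final' and 'override' act as identifiers with special meaning here...
struct Derived final : Base {
    void run() const override { std::cout << "Derived\n"; }
};

int main()
{
    // ...yet, not being keywords, they remain usable as ordinary names.
    int final = 1;
    int override = 2;

    // 'and', 'not' and 'or' are alternative representations of &&, ! and ||.
    if (final == 1 and not (override == 3) or false)
        std::cout << "alternative tokens\n";

    Derived d;
    d.run();
}
```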
cpp C++20 C++20 ===== The current revision of the C++ standard. New language features ---------------------- * [Feature test macros](feature_test "cpp/feature test") * [3-way comparison](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") operator `<=>` and [`operator==() = default`](language/default_comparisons "cpp/language/default comparisons") * [designated initializers](language/aggregate_initialization#Designated_initializers "cpp/language/aggregate initialization") * init-statements and initializers in [range-`for`](language/range-for "cpp/language/range-for") * [`char8_t`](keyword/char8_t "cpp/keyword/char8 t") * New [attributes](language/attributes "cpp/language/attributes"): `[[[no\_unique\_address](language/attributes/no_unique_address "cpp/language/attributes/no unique address")]]`, `[[[likely](language/attributes/likely "cpp/language/attributes/likely")]]`, `[[[unlikely](language/attributes/likely "cpp/language/attributes/likely")]]` * [`pack-expansions`](language/parameter_pack "cpp/language/parameter pack") in [lambda init-captures](language/lambda#Lambda_capture "cpp/language/lambda") * removed the requirement to use `typename` to disambiguate types in many contexts * [`consteval`](language/consteval "cpp/language/consteval"), [`constinit`](language/constinit "cpp/language/constinit") * further relaxed constexpr * signed integers are 2's complement * [aggregate initialization](language/aggregate_initialization "cpp/language/aggregate initialization") using parentheses * [Coroutines](language/coroutines "cpp/language/coroutines") * [Modules](language/modules "cpp/language/modules") * [Constraints and concepts](language/constraints "cpp/language/constraints") * [Abbreviated function templates](language/function_template#Abbreviated_function_template "cpp/language/function template") * DR: [array new](language/new "cpp/language/new") can deduce array size New library features --------------------- ### New headers * [`<bit>`](header/bit "cpp/header/bit") * [`<compare>`](header/compare "cpp/header/compare") * [`<concepts>`](header/concepts "cpp/header/concepts") * [`<coroutine>`](header/coroutine "cpp/header/coroutine") * [`<format>`](header/format "cpp/header/format") * [`<numbers>`](header/numbers "cpp/header/numbers") * [`<ranges>`](header/ranges "cpp/header/ranges") * [`<source_location>`](header/source_location "cpp/header/source location") * [`<span>`](header/span "cpp/header/span") * [`<syncstream>`](header/syncstream "cpp/header/syncstream") * [`<version>`](header/version "cpp/header/version") in [Thread support library](thread "cpp/thread"): * [`<barrier>`](header/barrier "cpp/header/barrier") * [`<latch>`](header/latch "cpp/header/latch") * [`<semaphore>`](header/semaphore "cpp/header/semaphore") * [`<stop_token>`](header/stop_token "cpp/header/stop token") ### Library features * [Library feature-test macros](utility/feature_test "cpp/utility/feature test") * [Formatting library](utility/format "cpp/utility/format") * [Calendar](chrono#Calendar "cpp/chrono") and [Time Zone](chrono#Time_zone "cpp/chrono") library * [`source_location`](utility/source_location "cpp/utility/source location") * [`span`](container/span "cpp/container/span") * [`endian`](types/endian "cpp/types/endian") * array support for [`make_shared`](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared") * [`remove_cvref`](types/remove_cvref "cpp/types/remove cvref") * [`to_address`](memory/to_address "cpp/memory/to address") * [`floating-point 
atomics`](atomic/atomic#Specializations_for_floating-point_types "cpp/atomic/atomic"), [`shared_ptr atomics`](memory/shared_ptr/atomic "cpp/memory/shared ptr/atomic") * thread-coordination classes: [`barrier`](thread/barrier "cpp/thread/barrier"), [`latch`](thread/latch "cpp/thread/latch"), and [`counting_semaphore`](thread/counting_semaphore "cpp/thread/counting semaphore") * [`jthread`](thread/jthread "cpp/thread/jthread") and [`thread cancellation`](thread#Thread_cancellation "cpp/thread") classes: [`stop_token`](thread/stop_token "cpp/thread/stop token"), [`stop_source`](thread/stop_source "cpp/thread/stop source"), and [`stop_callback`](thread/stop_callback "cpp/thread/stop callback") * [`basic_osyncstream`](io/basic_osyncstream "cpp/io/basic osyncstream") * [`u8string`](string/basic_string "cpp/string/basic string") and other [`char8_t`](language/types#char8_t "cpp/language/types") uses * constexpr for [`<algorithm>`](header/algorithm "cpp/header/algorithm"), [`<utility>`](header/utility "cpp/header/utility"), [`<complex>`](header/complex "cpp/header/complex") * [`string::starts_with`](string/basic_string/starts_with "cpp/string/basic string/starts with") / [`ends_with`](string/basic_string/ends_with "cpp/string/basic string/ends with") and [`string_view::starts_with`](string/basic_string_view/starts_with "cpp/string/basic string view/starts with") / [`ends_with`](string/basic_string_view/ends_with "cpp/string/basic string view/ends with") * [`assume_aligned`](memory/assume_aligned "cpp/memory/assume aligned") * [`bind_front`](utility/functional/bind_front "cpp/utility/functional/bind front") * [`c8rtomb`](string/multibyte/c8rtomb "cpp/string/multibyte/c8rtomb")/[`mbrtoc8`](string/multibyte/mbrtoc8 "cpp/string/multibyte/mbrtoc8") * [`make_obj_using_allocator`](memory/make_obj_using_allocator "cpp/memory/make obj using allocator") etc * [`make_shared_for_overwrite`](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared")/[`make_unique_for_overwrite`](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique") * heterogeneous lookup in unordered associative containers * [`polymorphic_allocator`](memory/polymorphic_allocator "cpp/memory/polymorphic allocator") with additional member functions and [`byte`](types/byte "cpp/types/byte") as its default template argument * [`execution::unseq`](algorithm/execution_policy_tag "cpp/algorithm/execution policy tag") * [`midpoint`](numeric/midpoint "cpp/numeric/midpoint") and [`lerp`](numeric/lerp "cpp/numeric/lerp") * [`ssize`](iterator/size "cpp/iterator/size") * [`is_bounded_array`](types/is_bounded_array "cpp/types/is bounded array"), [`is_unbounded_array`](types/is_unbounded_array "cpp/types/is unbounded array") * [Ranges](ranges "cpp/ranges") * uniform container erasure: `[std::erase](container/vector/erase2 "cpp/container/vector/erase2")`/`[std::erase\_if](container/vector/erase2 "cpp/container/vector/erase2")`, e.g. 
[`std::erase(std::list)`](container/list/erase2 "cpp/container/list/erase2") or [`erase_if(std::map)`](container/map/erase_if "cpp/container/map/erase if") etc * [Mathematical constants](numeric/constants "cpp/numeric/constants") in [`<numbers>`](header/numbers "cpp/header/numbers") Defect reports --------------- | Defect Reports fixed in C++20 (158 core, 345 library) | | --- | | * [CWG#555](https://wg21.cmeerw.net/cwg/issue555) * [CWG#581](https://wg21.cmeerw.net/cwg/issue581) * [CWG#682](https://wg21.cmeerw.net/cwg/issue682) * [CWG#943](https://wg21.cmeerw.net/cwg/issue943) * [CWG#1076](https://wg21.cmeerw.net/cwg/issue1076) * [CWG#1271](https://wg21.cmeerw.net/cwg/issue1271) * [CWG#1299](https://wg21.cmeerw.net/cwg/issue1299) * [CWG#1331](https://wg21.cmeerw.net/cwg/issue1331) * [CWG#1332](https://wg21.cmeerw.net/cwg/issue1332) * [CWG#1426](https://wg21.cmeerw.net/cwg/issue1426) * [CWG#1469](https://wg21.cmeerw.net/cwg/issue1469) * [CWG#1523](https://wg21.cmeerw.net/cwg/issue1523) * [CWG#1581](https://wg21.cmeerw.net/cwg/issue1581) * [CWG#1621](https://wg21.cmeerw.net/cwg/issue1621) * [CWG#1632](https://wg21.cmeerw.net/cwg/issue1632) * [CWG#1636](https://wg21.cmeerw.net/cwg/issue1636) * [CWG#1640](https://wg21.cmeerw.net/cwg/issue1640) * [CWG#1646](https://wg21.cmeerw.net/cwg/issue1646) * [CWG#1704](https://wg21.cmeerw.net/cwg/issue1704) * [CWG#1728](https://wg21.cmeerw.net/cwg/issue1728) * [CWG#1781](https://wg21.cmeerw.net/cwg/issue1781) * [CWG#1836](https://wg21.cmeerw.net/cwg/issue1836) * [CWG#1857](https://wg21.cmeerw.net/cwg/issue1857) * [CWG#1859](https://wg21.cmeerw.net/cwg/issue1859) * [CWG#1862](https://wg21.cmeerw.net/cwg/issue1862) * [CWG#1893](https://wg21.cmeerw.net/cwg/issue1893) * [CWG#1910](https://wg21.cmeerw.net/cwg/issue1910) * [CWG#1912](https://wg21.cmeerw.net/cwg/issue1912) * [CWG#1913](https://wg21.cmeerw.net/cwg/issue1913) * [CWG#1931](https://wg21.cmeerw.net/cwg/issue1931) * [CWG#1937](https://wg21.cmeerw.net/cwg/issue1937) * [CWG#1938](https://wg21.cmeerw.net/cwg/issue1938) * [CWG#1943](https://wg21.cmeerw.net/cwg/issue1943) * [CWG#1983](https://wg21.cmeerw.net/cwg/issue1983) * [CWG#2020](https://wg21.cmeerw.net/cwg/issue2020) * [CWG#2045](https://wg21.cmeerw.net/cwg/issue2045) * [CWG#2051](https://wg21.cmeerw.net/cwg/issue2051) * [CWG#2053](https://wg21.cmeerw.net/cwg/issue2053) * [CWG#2059](https://wg21.cmeerw.net/cwg/issue2059) * [CWG#2080](https://wg21.cmeerw.net/cwg/issue2080) * [CWG#2081](https://wg21.cmeerw.net/cwg/issue2081) * [CWG#2083](https://wg21.cmeerw.net/cwg/issue2083) * [CWG#2088](https://wg21.cmeerw.net/cwg/issue2088) * [CWG#2092](https://wg21.cmeerw.net/cwg/issue2092) * [CWG#2103](https://wg21.cmeerw.net/cwg/issue2103) * [CWG#2112](https://wg21.cmeerw.net/cwg/issue2112) * [CWG#2126](https://wg21.cmeerw.net/cwg/issue2126) * [CWG#2133](https://wg21.cmeerw.net/cwg/issue2133) * [CWG#2164](https://wg21.cmeerw.net/cwg/issue2164) * [CWG#2170](https://wg21.cmeerw.net/cwg/issue2170) * [CWG#2177](https://wg21.cmeerw.net/cwg/issue2177) * [CWG#2207](https://wg21.cmeerw.net/cwg/issue2207) * [CWG#2226](https://wg21.cmeerw.net/cwg/issue2226) * [CWG#2227](https://wg21.cmeerw.net/cwg/issue2227) * [CWG#2229](https://wg21.cmeerw.net/cwg/issue2229) * [CWG#2233](https://wg21.cmeerw.net/cwg/issue2233) * [CWG#2234](https://wg21.cmeerw.net/cwg/issue2234) * [CWG#2235](https://wg21.cmeerw.net/cwg/issue2235) * [CWG#2237](https://wg21.cmeerw.net/cwg/issue2237) * [CWG#2241](https://wg21.cmeerw.net/cwg/issue2241) * [CWG#2249](https://wg21.cmeerw.net/cwg/issue2249) * 
[CWG#2253](https://wg21.cmeerw.net/cwg/issue2253) * [CWG#2254](https://wg21.cmeerw.net/cwg/issue2254) * [CWG#2255](https://wg21.cmeerw.net/cwg/issue2255) * [CWG#2256](https://wg21.cmeerw.net/cwg/issue2256) * [CWG#2257](https://wg21.cmeerw.net/cwg/issue2257) * [CWG#2260](https://wg21.cmeerw.net/cwg/issue2260) * [CWG#2266](https://wg21.cmeerw.net/cwg/issue2266) * [CWG#2267](https://wg21.cmeerw.net/cwg/issue2267) * [CWG#2273](https://wg21.cmeerw.net/cwg/issue2273) * [CWG#2277](https://wg21.cmeerw.net/cwg/issue2277) * [CWG#2278](https://wg21.cmeerw.net/cwg/issue2278) * [CWG#2280](https://wg21.cmeerw.net/cwg/issue2280) * [CWG#2282](https://wg21.cmeerw.net/cwg/issue2282) * [CWG#2285](https://wg21.cmeerw.net/cwg/issue2285) * [CWG#2287](https://wg21.cmeerw.net/cwg/issue2287) * [CWG#2289](https://wg21.cmeerw.net/cwg/issue2289) * [CWG#2290](https://wg21.cmeerw.net/cwg/issue2290) * [CWG#2292](https://wg21.cmeerw.net/cwg/issue2292) * [CWG#2293](https://wg21.cmeerw.net/cwg/issue2293) * [CWG#2294](https://wg21.cmeerw.net/cwg/issue2294) * [CWG#2295](https://wg21.cmeerw.net/cwg/issue2295) * [CWG#2299](https://wg21.cmeerw.net/cwg/issue2299) * [CWG#2300](https://wg21.cmeerw.net/cwg/issue2300) * [CWG#2303](https://wg21.cmeerw.net/cwg/issue2303) * [CWG#2305](https://wg21.cmeerw.net/cwg/issue2305) * [CWG#2307](https://wg21.cmeerw.net/cwg/issue2307) * [CWG#2309](https://wg21.cmeerw.net/cwg/issue2309) * [CWG#2310](https://wg21.cmeerw.net/cwg/issue2310) * [CWG#2313](https://wg21.cmeerw.net/cwg/issue2313) * [CWG#2315](https://wg21.cmeerw.net/cwg/issue2315) * [CWG#2317](https://wg21.cmeerw.net/cwg/issue2317) * [CWG#2318](https://wg21.cmeerw.net/cwg/issue2318) * [CWG#2321](https://wg21.cmeerw.net/cwg/issue2321) * [CWG#2322](https://wg21.cmeerw.net/cwg/issue2322) * [CWG#2323](https://wg21.cmeerw.net/cwg/issue2323) * [CWG#2330](https://wg21.cmeerw.net/cwg/issue2330) * [CWG#2332](https://wg21.cmeerw.net/cwg/issue2332) * [CWG#2336](https://wg21.cmeerw.net/cwg/issue2336) * [CWG#2338](https://wg21.cmeerw.net/cwg/issue2338) * [CWG#2339](https://wg21.cmeerw.net/cwg/issue2339) * [CWG#2342](https://wg21.cmeerw.net/cwg/issue2342) * [CWG#2343](https://wg21.cmeerw.net/cwg/issue2343) * [CWG#2346](https://wg21.cmeerw.net/cwg/issue2346) * [CWG#2347](https://wg21.cmeerw.net/cwg/issue2347) * [CWG#2351](https://wg21.cmeerw.net/cwg/issue2351) * [CWG#2352](https://wg21.cmeerw.net/cwg/issue2352) * [CWG#2353](https://wg21.cmeerw.net/cwg/issue2353) * [CWG#2354](https://wg21.cmeerw.net/cwg/issue2354) * [CWG#2356](https://wg21.cmeerw.net/cwg/issue2356) * [CWG#2358](https://wg21.cmeerw.net/cwg/issue2358) * [CWG#2359](https://wg21.cmeerw.net/cwg/issue2359) * [CWG#2360](https://wg21.cmeerw.net/cwg/issue2360) * [CWG#2365](https://wg21.cmeerw.net/cwg/issue2365) * [CWG#2366](https://wg21.cmeerw.net/cwg/issue2366) * [CWG#2368](https://wg21.cmeerw.net/cwg/issue2368) * [CWG#2371](https://wg21.cmeerw.net/cwg/issue2371) * [CWG#2372](https://wg21.cmeerw.net/cwg/issue2372) * [CWG#2373](https://wg21.cmeerw.net/cwg/issue2373) * [CWG#2374](https://wg21.cmeerw.net/cwg/issue2374) * [CWG#2376](https://wg21.cmeerw.net/cwg/issue2376) * [CWG#2378](https://wg21.cmeerw.net/cwg/issue2378) * [CWG#2379](https://wg21.cmeerw.net/cwg/issue2379) * [CWG#2380](https://wg21.cmeerw.net/cwg/issue2380) * [CWG#2381](https://wg21.cmeerw.net/cwg/issue2381) * [CWG#2382](https://wg21.cmeerw.net/cwg/issue2382) * [CWG#2384](https://wg21.cmeerw.net/cwg/issue2384) * [CWG#2385](https://wg21.cmeerw.net/cwg/issue2385) * [CWG#2386](https://wg21.cmeerw.net/cwg/issue2386) * 
[CWG#2387](https://wg21.cmeerw.net/cwg/issue2387) * [CWG#2390](https://wg21.cmeerw.net/cwg/issue2390) * [CWG#2394](https://wg21.cmeerw.net/cwg/issue2394) * [CWG#2399](https://wg21.cmeerw.net/cwg/issue2399) * [CWG#2400](https://wg21.cmeerw.net/cwg/issue2400) * [CWG#2404](https://wg21.cmeerw.net/cwg/issue2404) * [CWG#2406](https://wg21.cmeerw.net/cwg/issue2406) * [CWG#2411](https://wg21.cmeerw.net/cwg/issue2411) * [CWG#2414](https://wg21.cmeerw.net/cwg/issue2414) * [CWG#2416](https://wg21.cmeerw.net/cwg/issue2416) * [CWG#2418](https://wg21.cmeerw.net/cwg/issue2418) * [CWG#2419](https://wg21.cmeerw.net/cwg/issue2419) * [CWG#2422](https://wg21.cmeerw.net/cwg/issue2422) * [CWG#2424](https://wg21.cmeerw.net/cwg/issue2424) * [CWG#2426](https://wg21.cmeerw.net/cwg/issue2426) * [CWG#2427](https://wg21.cmeerw.net/cwg/issue2427) * [CWG#2429](https://wg21.cmeerw.net/cwg/issue2429) * [CWG#2430](https://wg21.cmeerw.net/cwg/issue2430) * [CWG#2431](https://wg21.cmeerw.net/cwg/issue2431) * [CWG#2432](https://wg21.cmeerw.net/cwg/issue2432) * [CWG#2433](https://wg21.cmeerw.net/cwg/issue2433) * [CWG#2436](https://wg21.cmeerw.net/cwg/issue2436) * [CWG#2437](https://wg21.cmeerw.net/cwg/issue2437) * [CWG#2439](https://wg21.cmeerw.net/cwg/issue2439) * [CWG#2441](https://wg21.cmeerw.net/cwg/issue2441) * [CWG#2442](https://wg21.cmeerw.net/cwg/issue2442) * [CWG#2445](https://wg21.cmeerw.net/cwg/issue2445) * [CWG#2446](https://wg21.cmeerw.net/cwg/issue2446) * [CWG#2447](https://wg21.cmeerw.net/cwg/issue2447) * [LWG#1052](https://wg21.link/lwg1052) * [LWG#1203](https://wg21.link/lwg1203) * [LWG#2040](https://wg21.link/lwg2040) * [LWG#2055](https://wg21.link/lwg2055) * [LWG#2070](https://wg21.link/lwg2070) * [LWG#2089](https://wg21.link/lwg2089) * [LWG#2139](https://wg21.link/lwg2139) * [LWG#2151](https://wg21.link/lwg2151) * [LWG#2154](https://wg21.link/lwg2154) * [LWG#2155](https://wg21.link/lwg2155) * [LWG#2164](https://wg21.link/lwg2164) * [LWG#2183](https://wg21.link/lwg2183) * [LWG#2184](https://wg21.link/lwg2184) * [LWG#2243](https://wg21.link/lwg2243) * [LWG#2292](https://wg21.link/lwg2292) * [LWG#2318](https://wg21.link/lwg2318) * [LWG#2334](https://wg21.link/lwg2334) * [LWG#2412](https://wg21.link/lwg2412) * [LWG#2444](https://wg21.link/lwg2444) * [LWG#2498](https://wg21.link/lwg2498) * [LWG#2499](https://wg21.link/lwg2499) * [LWG#2511](https://wg21.link/lwg2511) * [LWG#2593](https://wg21.link/lwg2593) * [LWG#2597](https://wg21.link/lwg2597) * [LWG#2682](https://wg21.link/lwg2682) * [LWG#2693](https://wg21.link/lwg2693) * [LWG#2741](https://wg21.link/lwg2741) * [LWG#2780](https://wg21.link/lwg2780) * [LWG#2783](https://wg21.link/lwg2783) * [LWG#2797](https://wg21.link/lwg2797) * [LWG#2800](https://wg21.link/lwg2800) * [LWG#2808](https://wg21.link/lwg2808) * [LWG#2814](https://wg21.link/lwg2814) * [LWG#2816](https://wg21.link/lwg2816) * [LWG#2821](https://wg21.link/lwg2821) * [LWG#2831](https://wg21.link/lwg2831) * [LWG#2832](https://wg21.link/lwg2832) * [LWG#2836](https://wg21.link/lwg2836) * [LWG#2840](https://wg21.link/lwg2840) * [LWG#2841](https://wg21.link/lwg2841) * [LWG#2843](https://wg21.link/lwg2843) * [LWG#2849](https://wg21.link/lwg2849) * [LWG#2851](https://wg21.link/lwg2851) * [LWG#2856](https://wg21.link/lwg2856) * [LWG#2859](https://wg21.link/lwg2859) * [LWG#2870](https://wg21.link/lwg2870) * [LWG#2894](https://wg21.link/lwg2894) * [LWG#2897](https://wg21.link/lwg2897) * [LWG#2899](https://wg21.link/lwg2899) * [LWG#2929](https://wg21.link/lwg2929) * [LWG#2932](https://wg21.link/lwg2932) * 
[LWG#2935](https://wg21.link/lwg2935) * [LWG#2936](https://wg21.link/lwg2936) * [LWG#2937](https://wg21.link/lwg2937) * [LWG#2938](https://wg21.link/lwg2938) * [LWG#2940](https://wg21.link/lwg2940) * [LWG#2941](https://wg21.link/lwg2941) * [LWG#2942](https://wg21.link/lwg2942) * [LWG#2943](https://wg21.link/lwg2943) * [LWG#2944](https://wg21.link/lwg2944) * [LWG#2945](https://wg21.link/lwg2945) * [LWG#2946](https://wg21.link/lwg2946) * [LWG#2948](https://wg21.link/lwg2948) * [LWG#2950](https://wg21.link/lwg2950) * [LWG#2951](https://wg21.link/lwg2951) * [LWG#2952](https://wg21.link/lwg2952) * [LWG#2953](https://wg21.link/lwg2953) * [LWG#2954](https://wg21.link/lwg2954) * [LWG#2955](https://wg21.link/lwg2955) * [LWG#2957](https://wg21.link/lwg2957) * [LWG#2958](https://wg21.link/lwg2958) * [LWG#2961](https://wg21.link/lwg2961) * [LWG#2964](https://wg21.link/lwg2964) * [LWG#2965](https://wg21.link/lwg2965) * [LWG#2966](https://wg21.link/lwg2966) * [LWG#2968](https://wg21.link/lwg2968) * [LWG#2969](https://wg21.link/lwg2969) * [LWG#2970](https://wg21.link/lwg2970) * [LWG#2972](https://wg21.link/lwg2972) * [LWG#2974](https://wg21.link/lwg2974) * [LWG#2975](https://wg21.link/lwg2975) * [LWG#2976](https://wg21.link/lwg2976) * [LWG#2977](https://wg21.link/lwg2977) * [LWG#2978](https://wg21.link/lwg2978) * [LWG#2979](https://wg21.link/lwg2979) * [LWG#2980](https://wg21.link/lwg2980) * [LWG#2981](https://wg21.link/lwg2981) * [LWG#2982](https://wg21.link/lwg2982) * [LWG#2988](https://wg21.link/lwg2988) * [LWG#2989](https://wg21.link/lwg2989) * [LWG#2993](https://wg21.link/lwg2993) * [LWG#2995](https://wg21.link/lwg2995) * [LWG#2996](https://wg21.link/lwg2996) * [LWG#2998](https://wg21.link/lwg2998) * [LWG#2999](https://wg21.link/lwg2999) * [LWG#3000](https://wg21.link/lwg3000) * [LWG#3001](https://wg21.link/lwg3001) * [LWG#3004](https://wg21.link/lwg3004) * [LWG#3005](https://wg21.link/lwg3005) * [LWG#3006](https://wg21.link/lwg3006) * [LWG#3007](https://wg21.link/lwg3007) * [LWG#3008](https://wg21.link/lwg3008) * [LWG#3009](https://wg21.link/lwg3009) * [LWG#3012](https://wg21.link/lwg3012) * [LWG#3013](https://wg21.link/lwg3013) * [LWG#3014](https://wg21.link/lwg3014) * [LWG#3015](https://wg21.link/lwg3015) * [LWG#3017](https://wg21.link/lwg3017) * [LWG#3018](https://wg21.link/lwg3018) * [LWG#3022](https://wg21.link/lwg3022) * [LWG#3023](https://wg21.link/lwg3023) * [LWG#3024](https://wg21.link/lwg3024) * [LWG#3025](https://wg21.link/lwg3025) * [LWG#3026](https://wg21.link/lwg3026) * [LWG#3030](https://wg21.link/lwg3030) * [LWG#3031](https://wg21.link/lwg3031) * [LWG#3034](https://wg21.link/lwg3034) * [LWG#3035](https://wg21.link/lwg3035) * [LWG#3037](https://wg21.link/lwg3037) * [LWG#3038](https://wg21.link/lwg3038) * [LWG#3039](https://wg21.link/lwg3039) * [LWG#3040](https://wg21.link/lwg3040) * [LWG#3041](https://wg21.link/lwg3041) * [LWG#3042](https://wg21.link/lwg3042) * [LWG#3043](https://wg21.link/lwg3043) * [LWG#3045](https://wg21.link/lwg3045) * [LWG#3048](https://wg21.link/lwg3048) * [LWG#3050](https://wg21.link/lwg3050) * [LWG#3051](https://wg21.link/lwg3051) * [LWG#3054](https://wg21.link/lwg3054) * [LWG#3055](https://wg21.link/lwg3055) * [LWG#3058](https://wg21.link/lwg3058) * [LWG#3061](https://wg21.link/lwg3061) * [LWG#3062](https://wg21.link/lwg3062) * [LWG#3065](https://wg21.link/lwg3065) * [LWG#3067](https://wg21.link/lwg3067) * [LWG#3070](https://wg21.link/lwg3070) * [LWG#3074](https://wg21.link/lwg3074) * [LWG#3075](https://wg21.link/lwg3075) * 
[LWG#3076](https://wg21.link/lwg3076) * [LWG#3077](https://wg21.link/lwg3077) * [LWG#3079](https://wg21.link/lwg3079) * [LWG#3080](https://wg21.link/lwg3080) * [LWG#3083](https://wg21.link/lwg3083) * [LWG#3087](https://wg21.link/lwg3087) * [LWG#3091](https://wg21.link/lwg3091) * [LWG#3094](https://wg21.link/lwg3094) * [LWG#3096](https://wg21.link/lwg3096) * [LWG#3100](https://wg21.link/lwg3100) * [LWG#3101](https://wg21.link/lwg3101) * [LWG#3102](https://wg21.link/lwg3102) * [LWG#3103](https://wg21.link/lwg3103) * [LWG#3104](https://wg21.link/lwg3104) * [LWG#3110](https://wg21.link/lwg3110) * [LWG#3111](https://wg21.link/lwg3111) * [LWG#3112](https://wg21.link/lwg3112) * [LWG#3113](https://wg21.link/lwg3113) * [LWG#3115](https://wg21.link/lwg3115) * [LWG#3116](https://wg21.link/lwg3116) * [LWG#3119](https://wg21.link/lwg3119) * [LWG#3122](https://wg21.link/lwg3122) * [LWG#3125](https://wg21.link/lwg3125) * [LWG#3127](https://wg21.link/lwg3127) * [LWG#3128](https://wg21.link/lwg3128) * [LWG#3129](https://wg21.link/lwg3129) * [LWG#3130](https://wg21.link/lwg3130) * [LWG#3131](https://wg21.link/lwg3131) * [LWG#3132](https://wg21.link/lwg3132) * [LWG#3133](https://wg21.link/lwg3133) * [LWG#3137](https://wg21.link/lwg3137) * [LWG#3140](https://wg21.link/lwg3140) * [LWG#3141](https://wg21.link/lwg3141) * [LWG#3144](https://wg21.link/lwg3144) * [LWG#3145](https://wg21.link/lwg3145) * [LWG#3147](https://wg21.link/lwg3147) * [LWG#3148](https://wg21.link/lwg3148) * [LWG#3149](https://wg21.link/lwg3149) * [LWG#3150](https://wg21.link/lwg3150) * [LWG#3151](https://wg21.link/lwg3151) * [LWG#3153](https://wg21.link/lwg3153) * [LWG#3154](https://wg21.link/lwg3154) * [LWG#3156](https://wg21.link/lwg3156) * [LWG#3158](https://wg21.link/lwg3158) * [LWG#3160](https://wg21.link/lwg3160) * [LWG#3168](https://wg21.link/lwg3168) * [LWG#3169](https://wg21.link/lwg3169) * [LWG#3173](https://wg21.link/lwg3173) * [LWG#3175](https://wg21.link/lwg3175) * [LWG#3176](https://wg21.link/lwg3176) * [LWG#3178](https://wg21.link/lwg3178) * [LWG#3179](https://wg21.link/lwg3179) * [LWG#3180](https://wg21.link/lwg3180) * [LWG#3182](https://wg21.link/lwg3182) * [LWG#3183](https://wg21.link/lwg3183) * [LWG#3184](https://wg21.link/lwg3184) * [LWG#3185](https://wg21.link/lwg3185) * [LWG#3186](https://wg21.link/lwg3186) * [LWG#3187](https://wg21.link/lwg3187) * [LWG#3190](https://wg21.link/lwg3190) * [LWG#3191](https://wg21.link/lwg3191) * [LWG#3194](https://wg21.link/lwg3194) * [LWG#3196](https://wg21.link/lwg3196) * [LWG#3198](https://wg21.link/lwg3198) * [LWG#3199](https://wg21.link/lwg3199) * [LWG#3200](https://wg21.link/lwg3200) * [LWG#3201](https://wg21.link/lwg3201) * [LWG#3202](https://wg21.link/lwg3202) * [LWG#3206](https://wg21.link/lwg3206) * [LWG#3208](https://wg21.link/lwg3208) * [LWG#3209](https://wg21.link/lwg3209) * [LWG#3212](https://wg21.link/lwg3212) * [LWG#3213](https://wg21.link/lwg3213) * [LWG#3218](https://wg21.link/lwg3218) * [LWG#3221](https://wg21.link/lwg3221) * [LWG#3222](https://wg21.link/lwg3222) * [LWG#3224](https://wg21.link/lwg3224) * [LWG#3225](https://wg21.link/lwg3225) * [LWG#3226](https://wg21.link/lwg3226) * [LWG#3228](https://wg21.link/lwg3228) * [LWG#3230](https://wg21.link/lwg3230) * [LWG#3231](https://wg21.link/lwg3231) * [LWG#3232](https://wg21.link/lwg3232) * [LWG#3233](https://wg21.link/lwg3233) * [LWG#3235](https://wg21.link/lwg3235) * [LWG#3237](https://wg21.link/lwg3237) * [LWG#3238](https://wg21.link/lwg3238) * [LWG#3239](https://wg21.link/lwg3239) * 
[LWG#3241](https://wg21.link/lwg3241) * [LWG#3242](https://wg21.link/lwg3242) * [LWG#3243](https://wg21.link/lwg3243) * [LWG#3244](https://wg21.link/lwg3244) * [LWG#3245](https://wg21.link/lwg3245) * [LWG#3246](https://wg21.link/lwg3246) * [LWG#3247](https://wg21.link/lwg3247) * [LWG#3248](https://wg21.link/lwg3248) * [LWG#3250](https://wg21.link/lwg3250) * [LWG#3251](https://wg21.link/lwg3251) * [LWG#3252](https://wg21.link/lwg3252) * [LWG#3253](https://wg21.link/lwg3253) * [LWG#3254](https://wg21.link/lwg3254) * [LWG#3255](https://wg21.link/lwg3255) * [LWG#3256](https://wg21.link/lwg3256) * [LWG#3257](https://wg21.link/lwg3257) * [LWG#3259](https://wg21.link/lwg3259) * [LWG#3260](https://wg21.link/lwg3260) * [LWG#3262](https://wg21.link/lwg3262) * [LWG#3264](https://wg21.link/lwg3264) * [LWG#3266](https://wg21.link/lwg3266) * [LWG#3269](https://wg21.link/lwg3269) * [LWG#3270](https://wg21.link/lwg3270) * [LWG#3272](https://wg21.link/lwg3272) * [LWG#3273](https://wg21.link/lwg3273) * [LWG#3274](https://wg21.link/lwg3274) * [LWG#3276](https://wg21.link/lwg3276) * [LWG#3277](https://wg21.link/lwg3277) * [LWG#3278](https://wg21.link/lwg3278) * [LWG#3279](https://wg21.link/lwg3279) * [LWG#3280](https://wg21.link/lwg3280) * [LWG#3281](https://wg21.link/lwg3281) * [LWG#3282](https://wg21.link/lwg3282) * [LWG#3284](https://wg21.link/lwg3284) * [LWG#3285](https://wg21.link/lwg3285) * [LWG#3286](https://wg21.link/lwg3286) * [LWG#3290](https://wg21.link/lwg3290) * [LWG#3291](https://wg21.link/lwg3291) * [LWG#3292](https://wg21.link/lwg3292) * [LWG#3294](https://wg21.link/lwg3294) * [LWG#3295](https://wg21.link/lwg3295) * [LWG#3296](https://wg21.link/lwg3296) * [LWG#3299](https://wg21.link/lwg3299) * [LWG#3300](https://wg21.link/lwg3300) * [LWG#3301](https://wg21.link/lwg3301) * [LWG#3302](https://wg21.link/lwg3302) * [LWG#3303](https://wg21.link/lwg3303) * [LWG#3304](https://wg21.link/lwg3304) * [LWG#3307](https://wg21.link/lwg3307) * [LWG#3310](https://wg21.link/lwg3310) * [LWG#3313](https://wg21.link/lwg3313) * [LWG#3314](https://wg21.link/lwg3314) * [LWG#3315](https://wg21.link/lwg3315) * [LWG#3316](https://wg21.link/lwg3316) * [LWG#3317](https://wg21.link/lwg3317) * [LWG#3318](https://wg21.link/lwg3318) * [LWG#3319](https://wg21.link/lwg3319) * [LWG#3320](https://wg21.link/lwg3320) * [LWG#3321](https://wg21.link/lwg3321) * [LWG#3322](https://wg21.link/lwg3322) * [LWG#3323](https://wg21.link/lwg3323) * [LWG#3324](https://wg21.link/lwg3324) * [LWG#3325](https://wg21.link/lwg3325) * [LWG#3326](https://wg21.link/lwg3326) * [LWG#3327](https://wg21.link/lwg3327) * [LWG#3328](https://wg21.link/lwg3328) * [LWG#3329](https://wg21.link/lwg3329) * [LWG#3330](https://wg21.link/lwg3330) * [LWG#3331](https://wg21.link/lwg3331) * [LWG#3332](https://wg21.link/lwg3332) * [LWG#3334](https://wg21.link/lwg3334) * [LWG#3335](https://wg21.link/lwg3335) * [LWG#3336](https://wg21.link/lwg3336) * [LWG#3338](https://wg21.link/lwg3338) * [LWG#3340](https://wg21.link/lwg3340) * [LWG#3345](https://wg21.link/lwg3345) * [LWG#3346](https://wg21.link/lwg3346) * [LWG#3347](https://wg21.link/lwg3347) * [LWG#3348](https://wg21.link/lwg3348) * [LWG#3349](https://wg21.link/lwg3349) * [LWG#3350](https://wg21.link/lwg3350) * [LWG#3351](https://wg21.link/lwg3351) * [LWG#3352](https://wg21.link/lwg3352) * [LWG#3354](https://wg21.link/lwg3354) * [LWG#3355](https://wg21.link/lwg3355) * [LWG#3356](https://wg21.link/lwg3356) * [LWG#3358](https://wg21.link/lwg3358) * [LWG#3359](https://wg21.link/lwg3359) * 
[LWG#3360](https://wg21.link/lwg3360) * [LWG#3362](https://wg21.link/lwg3362) * [LWG#3363](https://wg21.link/lwg3363) * [LWG#3364](https://wg21.link/lwg3364) * [LWG#3367](https://wg21.link/lwg3367) * [LWG#3368](https://wg21.link/lwg3368) * [LWG#3369](https://wg21.link/lwg3369) * [LWG#3371](https://wg21.link/lwg3371) * [LWG#3372](https://wg21.link/lwg3372) * [LWG#3373](https://wg21.link/lwg3373) * [LWG#3374](https://wg21.link/lwg3374) * [LWG#3375](https://wg21.link/lwg3375) * [LWG#3377](https://wg21.link/lwg3377) * [LWG#3379](https://wg21.link/lwg3379) * [LWG#3380](https://wg21.link/lwg3380) * [LWG#3381](https://wg21.link/lwg3381) * [LWG#3382](https://wg21.link/lwg3382) * [LWG#3383](https://wg21.link/lwg3383) * [LWG#3384](https://wg21.link/lwg3384) * [LWG#3385](https://wg21.link/lwg3385) * [LWG#3386](https://wg21.link/lwg3386) * [LWG#3387](https://wg21.link/lwg3387) * [LWG#3388](https://wg21.link/lwg3388) * [LWG#3389](https://wg21.link/lwg3389) * [LWG#3390](https://wg21.link/lwg3390) * [LWG#3393](https://wg21.link/lwg3393) * [LWG#3395](https://wg21.link/lwg3395) * [LWG#3396](https://wg21.link/lwg3396) * [LWG#3397](https://wg21.link/lwg3397) * [LWG#3398](https://wg21.link/lwg3398) | Compiler support ----------------- Main Article: [C++20 compiler support](compiler_support#C.2B.2B20_features "cpp/compiler support"). ### C++20 core language features | C++20 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++ (ex Portland Group/PGI) | Nvidia nvcc | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | Allow [lambda-capture](language/lambda#Lambda_capture "cpp/language/lambda") `[=, this]` | [P0409R2](https://wg21.link/P0409R2) | 8 | 6 | 19.22\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [`__VA_OPT__`](preprocessor/replace#Function-like_macros "cpp/preprocessor/replace") | [P0306R4](https://wg21.link/P0306R4)[P1042R1](https://wg21.link/P1042R1) | 8 (partial)\*10 (partial)\*12 | 9 | 19.25\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Designated initializers](language/aggregate_initialization#Designated_initializers "cpp/language/aggregate initialization") | [P0329R4](https://wg21.link/P0329R4) | 4.7 (partial)\*8 | 3.0 (partial)\*10 | 19.21\* | 12.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [template-parameter-list for generic lambdas](language/lambda#Syntax "cpp/language/lambda") | [P0428R2](https://wg21.link/P0428R2) | 8 | 9 | 19.22\* | 11.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Default member initializers for bit-fields](language/bit_field#Cpp20_Default_member_initializers_for_bit_fields "cpp/language/bit field") | [P0683R1](https://wg21.link/P0683R1) | 8 | 6 | 19.25\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | Initializer list constructors in class template argument deduction | [P0702R1](https://wg21.link/P0702R1) | 8 | 6 | 19.14\* | Yes | 5.0 | 2021.1 | | | | | 20.7 | | | `const&`-qualified pointers to members | [P0704R1](https://wg21.link/P0704R1) | 8 | 6 | 19.0 (2015)\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Concepts](language/constraints "cpp/language/constraints") | [P0734R0](https://wg21.link/P0734R0) | 6(TS only)10 | 10 | 19.23\* (partial)\*19.30\* | 12.0.0\* (partial). | 6.1 | 2021.5 | | | | | 20.11 | | | [Lambdas in unevaluated contexts](language/lambda#Lambdas_in_unevaluated_contexts "cpp/language/lambda") | [P0315R4](https://wg21.link/P0315R4) | 9 | 13 (partial)\*14 (partial)\* | 19.28 (16.8)\* | 13.1.6\* (partial). 
| | | | | | | | | | [Three-way comparison operator](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") | [P0515R3](https://wg21.link/P0515R3) | 10 | 8 (partial)10 | 19.20\* | | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Simplifying implicit lambda capture | [P0588R1](https://wg21.link/P0588R1) | 8 | | 19.24\* | | 5.1 | 2021.1 | | | | | 20.7 | | | [init-statements for range-based for](language/range-for#Syntax "cpp/language/range-for") | [P0614R1](https://wg21.link/P0614R1) | 9 | 8 | 19.25\* | 11.0.0\* | 6.0 | | | | | | 20.11 | | | Default constructible and assignable stateless [lambdas](language/lambda "cpp/language/lambda") | [P0624R2](https://wg21.link/P0624R2) | 9 | 8 | 19.22\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | `const` mismatch with defaulted copy constructor | [P0641R2](https://wg21.link/P0641R2) | 9 | 8 | 19.0 (2015)\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | Access checking on specializations | [P0692R1](https://wg21.link/P0692R1) | Yes | 8 (partial)14 | 19.26\* | 14.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | ADL and function templates that are not visible | [P0846R0](https://wg21.link/P0846R0) | 9 | 9 | 19.21\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Specify when `constexpr` function definitions are [needed for constant evaluation](language/constant_expression#Functions_and_variables_needed_for_constant_evaluation "cpp/language/constant expression") | [P0859R0](https://wg21.link/P0859R0) | 5.2 (partial)\*9 | 8 | 19.27\* (partial)\*19.31\* | 11.0.0\* | | | | | | | | | | Attributes `[[[likely](language/attributes/likely "cpp/language/attributes/likely")]]` and `[[[unlikely](language/attributes/likely "cpp/language/attributes/likely")]]` | [P0479R5](https://wg21.link/P0479R5) | 9 | 12 | 19.26\* | 13.0.0\* | 5.1 | | | | | | 20.7 | | | Make [`typename`](keywords/typename "cpp/keywords/typename") more optional | [P0634R3](https://wg21.link/P0634R3) | 9 | | 19.29 (16.10)\* | | 5.1 | | | | | | 20.7 | | | Pack expansion in lambda init-capture | [P0780R2](https://wg21.link/P0780R2) | 9 | 9 | 19.22\* | 11.0.3\* | 6.1 | | | | | | 20.11 | | | Attribute `[[[no\_unique\_address](language/attributes/no_unique_address "cpp/language/attributes/no unique address")]]` | [P0840R2](https://wg21.link/P0840R2) | 9 | 9 | 19.28 (16.9)\*\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Conditionally Trivial Special Member Functions | [P0848R3](https://wg21.link/P0848R3) | 10 | 16 | 19.28 (16.8)\* | | 6.1 | | | | | | 20.11 | | | DR: Relaxing the [structured bindings](language/structured_binding "cpp/language/structured binding") customization point finding rules | [P0961R1](https://wg21.link/P0961R1) | 8 | 8 | 19.21\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Relaxing the [range-`for` loop](language/range-for "cpp/language/range-for") customization point finding rules | [P0962R1](https://wg21.link/P0962R1) | 8 | 8 | 19.25\* | 11.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Allow structured bindings to accessible members | [P0969R0](https://wg21.link/P0969R0) | 8 | 8 | 19.21\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Destroying `operator delete`](memory/new/operator_delete "cpp/memory/new/operator delete") | [P0722R3](https://wg21.link/P0722R3) | 9 | 6 | 19.27\* | 10.0.0\* | 6.1 | | | | | | 20.11 | | | Class types in [non-type template parameters](language/template_parameters#Non-type_template_parameter "cpp/language/template parameters") | [P0732R2](https://wg21.link/P0732R2) | 9 | 12 (partial) | 19.26\*(partial)\*19.28 (16.9)\* | 
13.0.0\* (partial). | 6.2 | | | | | | | | | Deprecate implicit [capture](language/lambda#Lambda_capture "cpp/language/lambda") of `this` via `[=]` | [P0806R2](https://wg21.link/P0806R2) | 9 | 7 | 19.22\* | 10.0.1\* | 5.1 | | | | | | 20.7 | | | [`explicit(bool)`](language/explicit "cpp/language/explicit") | [P0892R2](https://wg21.link/P0892R2) | 9 | 9 | 19.24\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Integrating [feature-test macros](feature_test "cpp/feature test") | [P0941R2](https://wg21.link/P0941R2) | 5 | 3.4 | 19.15\* (partial)19.20\* | Yes | 5.0 | 2021.1 | | | | | 20.7 | | | Prohibit aggregates with user-declared constructors | [P1008R1](https://wg21.link/P1008R1) | 9 | 8 | 19.20\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | `constexpr` [virtual function](language/virtual "cpp/language/virtual") | [P1064R0](https://wg21.link/P1064R0) | 9 | 9 | 19.28 (16.9)\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Consistency improvements for comparisons | [P1120R0](https://wg21.link/P1120R0) | 10 | 8 (partial)10 | 19.22\* | 12.0.0\* | 5.1 | | | | | | 20.7 | | | [`char8_t`](language/types#char8_t "cpp/language/types") | [P0482R6](https://wg21.link/P0482R6) | 9 | 7\* | 19.22\* | 10.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | [`std::is_constant_evaluated()`](types/is_constant_evaluated "cpp/types/is constant evaluated") | [P0595R2](https://wg21.link/P0595R2) | 9 | 9 | 19.25\* | 11.0.3\* | 5.1 | 19.1 | | | | | | | | `constexpr` `try`-`catch` blocks | [P1002R1](https://wg21.link/P1002R1) | 9 | 8 | 19.25\* | 10.0.1\* | 5.1 | | | | | | 20.7 | | | [Immediate functions](language/consteval "cpp/language/consteval") (`consteval`) | [P1073R3](https://wg21.link/P1073R3) | 10 (partial)\* 11 | 11 (partial)14 (partial)\* | 19.28 (16.8)\*\*(partial)19.29 (16.10)\* | 11.0.3\* (partial). | 5.1 | 2021.1 | | | | | 20.7 | | | [Nested inline namespaces](language/namespace "cpp/language/namespace") | [P1094R2](https://wg21.link/P1094R2) | 9 | 8 | 19.27\* | 10.0.1\* | 5.1 | 2021.1 | | | | | 20.7 | | | Yet another approach for [constrained](language/template_parameters#Type_template_parameter "cpp/language/template parameters") [declarations](language/auto "cpp/language/auto") | [P1141R2](https://wg21.link/P1141R2) | 10 | 10 | 19.26\* (partial)19.28 (16.9)\* | 12.0.5\* | 6.1 | | | | | | 20.11 | | | Signed integers are two's complement | [P1236R1](https://wg21.link/P1236R1) | 9 | 9 | Yes | 11.0.3\* | N/A | N/A | | | | | yes | | | [`dynamic_cast`](language/dynamic_cast "cpp/language/dynamic cast") and polymorphic [`typeid`](language/typeid "cpp/language/typeid") in [constant expressions](language/constant_expression "cpp/language/constant expression") | [P1327R1](https://wg21.link/P1327R1) | 10 | 9 | 19.29 (16.10)\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Changing the active member of a union inside `constexpr` | [P1330R0](https://wg21.link/P1330R0) | 9 | 9 | 19.10\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Coroutines](language/coroutines "cpp/language/coroutines") | [P0912R5](https://wg21.link/P0912R5) | 10 | 8 (partial) | 19.0 (2015)\* (partial)19.10\* (TS only)19.28 (16.8)\* | 10.0.1\* (partial). 
| 5.1 | 2021.1 | | | | | | | | Parenthesized [initialization of aggregates](language/aggregate_initialization "cpp/language/aggregate initialization") | [P0960R3](https://wg21.link/P0960R3) | 10 | | 19.28 (16.8)\* | | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Array size deduction in [`new`-expressions](language/new "cpp/language/new") | [P1009R2](https://wg21.link/P1009R2) | 11 | 9 | 19.27\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | [Modules](language/modules "cpp/language/modules") | [P1103R3](https://wg21.link/P1103R3) | 11 (partial) | 8 (partial) | 19.0 (2015)\* (partial)19.10\* (TS only)19.28 (16.8)\* | 10.0.1\* (partial). | | | | | | | | | | Stronger Unicode requirements | [P1041R4](https://wg21.link/P1041R4)[P1139R2](https://wg21.link/P1139R2) | 10 | Yes | 19.0 (2015)\* (P1041R4)19.26\* (P1139R2) | Yes | N/A | | | | | | | | | `<=> != ==` | [P1185R2](https://wg21.link/P1185R2) | 10 | 10 | 19.22\* | 12.0.0\* | 5.1 | 2021.1 | | | | | 20.7 | | | DR: Explicitly defaulted functions with different exception specifications | [P1286R2](https://wg21.link/P1286R2) | 10 | 9 | 19.28 (16.8)\* | 11.0.3\* | 5.1 | 2021.1 | | | | | 20.7 | | | Lambda capture and storage class specifiers of structured bindings | [P1091R3](https://wg21.link/P1091R3)[P1381R1](https://wg21.link/P1381R1) | 10 | 8 (partial)16 | 19.11\*(P1381R1)19.24\*(P1091R3) | 10.0.1\* (partial). | 5.1 | 2021.1 | | | | | 20.7 | | | Permit conversions to arrays of unknown bound | [P0388R4](https://wg21.link/P0388R4) | 10 | 14 | 19.27\* | 14.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | `constexpr` container operations | [P0784R7](https://wg21.link/P0784R7) | 10 | 10 | 19.28 (16.9)\* | 12.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | Deprecating some uses of [`volatile`](language/cv#Notes "cpp/language/cv") | [P1152R4](https://wg21.link/P1152R4) | 10 | 10 | 19.27\* | 12.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | [`constinit`](language/constinit "cpp/language/constinit") | [P1143R2](https://wg21.link/P1143R2) | 10 | 10 | 19.29 (16.10)\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | [Deprecate comma operator in subscripts](language/operator_other#Built-in_comma_operator "cpp/language/operator other") | [P1161R3](https://wg21.link/P1161R3) | 10 | 9 | 19.25\* | 11.0.3\* | 6.0 | 2021.5 | | | | | 20.11 | | | `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` with message | [P1301R4](https://wg21.link/P1301R4) | 10 | 9 | 19.25\* | 11.0.3\* | 6.0 | 2021.5 | | | | | 20.11 | | | Trivial default initialization in `constexpr` functions | [P1331R2](https://wg21.link/P1331R2) | 10 | 10 | 19.27\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | Unevaluated `asm`-declaration in `constexpr` functions | [P1668R1](https://wg21.link/P1668R1) | 10 | 10 | 19.28 (16.9)\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | [`using enum`](language/enum#Using-enum-declaration "cpp/language/enum") | [P1099R5](https://wg21.link/P1099R5) | 11 | 13 | 19.24\* | 13.1.6\* | 6.3 | | | | | | | | | Synthesizing [three-way comparison](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") for specified comparison category | [P1186R3](https://wg21.link/P1186R3) | 11 | 10 | 19.24\* | 12.0.0\* | 6.0 | 2021.5 | | | | | 20.11 | | | DR: `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` for constructors | [P1771R1](https://wg21.link/P1771R1) | 10 | 9 | 19.24\* | 11.0.3\* | 6.0 | 2021.5 | | | | | 20.11 | | | [Class template argument deduction](language/class_template_argument_deduction "cpp/language/class template 
argument deduction") for alias templates | [P1814R0](https://wg21.link/P1814R0) | 10 | | 19.27\* | | | | | | | | | | | Class template argument deduction for aggregates | [P1816R0](https://wg21.link/P1816R0)[P2082R1](https://wg21.link/P2082R1) | 10(P1816R0)11(P2082R1) | | 19.27\* | | 6.3 | | | | | | | | | DR: [Implicit move](language/return "cpp/language/return") for more local objects and rvalue references | [P1825R0](https://wg21.link/P1825R0) | 11\* | 13 | 19.24\* | 13.1.6\* | 6.0 | 2021.5 | | | | | 20.11 | | | Allow defaulting comparisons by value | [P1946R0](https://wg21.link/P1946R0) | 10 | 10 | 19.25\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | Remove `std::weak_equality` and `std::strong_equality` | [P1959R0](https://wg21.link/P1959R0) | 10 | 10 | 19.25\* | 12.0.0\* | 6.1 | | | | | | 20.11 | | | Inconsistencies with non-type template parameters | [P1907R1](https://wg21.link/P1907R1) | 10 (partial)11 | 12 (partial) | 19.26\* | 13.1.6\* (partial). | 6.2 | | | | | | | | | DR: Pseudo-destructors end object lifetimes | [P0593R6](https://wg21.link/P0593R6) | 11 | 11 | Yes | 12.0.5\* | N/A | N/A | | | | | | | | DR: Converting from `T*` to `bool` should be considered narrowing | [P1957R2](https://wg21.link/P1957R2) | 10\* 11\* | 11 | 19.27\* | 12.0.5\* | 6.1 | | | | | | | | | C++20 feature | Paper(s) | GCC | Clang | MSVC | Apple Clang | EDG eccp | Intel C++ | IBM XLC++ | Sun/Oracle C++ | Embarcadero C++ Builder | Cray | Nvidia HPC C++(ex Portland Group/PGI) | Nvidia nvcc | ### C++20 library features | C++20 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | | [`std::endian`](types/endian "cpp/types/endian") | [P0463R1](https://wg21.link/P0463R1) | 8 | 7 | 19.22\* | 10.0.0\* | | | | | Extending `[std::make\_shared()](memory/shared_ptr/make_shared "cpp/memory/shared ptr/make shared")` to support arrays | [P0674R1](https://wg21.link/P0674R1) | 12 | 15 | 19.27\* | | | | | | [Floating-point atomic](atomic/atomic#Specializations_for_floating-point_types "cpp/atomic/atomic") | [P0020R6](https://wg21.link/P0020R6) | 10 | | 19.22\* | | | | | | [Synchronized buffered](io/basic_syncbuf "cpp/io/basic syncbuf") ([`std::basic_osyncstream`](io/basic_osyncstream "cpp/io/basic osyncstream")) | [P0053R7](https://wg21.link/P0053R7) | 11 | | 19.29 (16.10)\* | | | | | | `constexpr` for [`<algorithm>`](header/algorithm "cpp/header/algorithm") and [`<utility>`](header/utility "cpp/header/utility") | [P0202R3](https://wg21.link/P0202R3) | 10 | 8 (partial)12 | 19.26\* | 10.0.1\* (partial) 13.0.0\* | | | | | More `constexpr` for [`<complex>`](header/complex "cpp/header/complex") | [P0415R1](https://wg21.link/P0415R1) | 9 | 7 (partial) | 19.27\* | 10.0.0\* (partial). 
| | | | | Make `[std::memory\_order](atomic/memory_order "cpp/atomic/memory order")` a scoped enumeration | [P0439R0](https://wg21.link/P0439R0) | 9 | 9 | 19.25\* | 11.0.3\* | | | | | [String](string/basic_string "cpp/string/basic string") [prefix](string/basic_string/starts_with "cpp/string/basic string/starts with") and [suffix](string/basic_string/ends_with "cpp/string/basic string/ends with") checking: [`string`](string/basic_string/starts_with "cpp/string/basic string/starts with")[`(_view)`](string/basic_string_view/starts_with "cpp/string/basic string view/starts with") [`::starts_with`](string/basic_string/starts_with "cpp/string/basic string/starts with")/[`ends_with`](string/basic_string_view/ends_with "cpp/string/basic string view/ends with") | [P0457R2](https://wg21.link/P0457R2) | 9 | 6 | 19.21\* | 10.0.0\* | | | | | Library support for [`operator<=>`](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") [`<compare>`](header/compare "cpp/header/compare") | [P0768R1](https://wg21.link/P0768R1) | 10 | 7 (partial)12 | 19.20\* (partial)19.28 (16.9)\* | 13.0.0\* | | | | | [`std::remove_cvref`](types/remove_cvref "cpp/types/remove cvref") | [P0550R2](https://wg21.link/P0550R2) | 9 | 6 | 19.20\* | 10.0.0\* | | | | | `[[[nodiscard](language/attributes/nodiscard "cpp/language/attributes/nodiscard")]]` in the [standard library](language/attributes/nodiscard#Standard_library "cpp/language/attributes/nodiscard") | [P0600R1](https://wg21.link/P0600R1) | 9 | 7 (partial) | 19.13\* (partial)19.22\* | 10.0.0\* (partial). | | | | | Using `std::move` in [numeric algorithms](numeric "cpp/numeric") | [P0616R0](https://wg21.link/P0616R0) | 9 | 12 | 19.23\* | 13.0.0\* | | | | | [Utility](memory/to_address "cpp/memory/to address") to convert a pointer to a raw pointer | [P0653R2](https://wg21.link/P0653R2) | 8 | 6 | 19.22\* | Yes | | | | | Atomic [`std::shared_ptr`](memory/shared_ptr/atomic2 "cpp/memory/shared ptr/atomic2") and [`std::weak_ptr`](memory/weak_ptr/atomic2 "cpp/memory/weak ptr/atomic2") | [P0718R2](https://wg21.link/P0718R2) | 12 | | 19.27\* | | | | | | [`std::span`](container/span "cpp/container/span") | [P0122R7](https://wg21.link/P0122R7) | 10 | 7 | 19.26\* | 10.0.0\* | | | | | [Calendar](chrono#Calendar "cpp/chrono") and [timezone](chrono#Time_zone "cpp/chrono") | [P0355R7](https://wg21.link/P0355R7) | 11 (partial) | 7 (partial) | 19.29 (16.10)\* | 10.0.0\* (partial). | | | | | [`<version>`](header/version "cpp/header/version") | [P0754R2](https://wg21.link/P0754R2) | 9 | 7 | 19.22\* | 10.0.0\* | | | | | Comparing unordered containers | [P0809R0](https://wg21.link/P0809R0) | Yes | Yes | 16.0\* | Yes | | | | | [ConstexprIterator](named_req/constexpriterator "cpp/named req/ConstexprIterator") requirements | [P0858R0](https://wg21.link/P0858R0) | 9 | 12 | 19.11\* | 13.0.0\* | | | | | `[std::basic\_string::reserve()](string/basic_string/reserve "cpp/string/basic string/reserve")` should not shrink | [P0966R1](https://wg21.link/P0966R1) | 11 | 8 | 19.25\* | 10.0.1\* | | | | | [Atomic Compare-And-Exchange](atomic/atomic/compare_exchange "cpp/atomic/atomic/compare exchange") with padding bits | [P0528R3](https://wg21.link/P0528R3) | | | 19.28 (16.8)\* | | | | | | [`std::atomic_ref`](atomic/atomic_ref "cpp/atomic/atomic ref") | [P0019R8](https://wg21.link/P0019R8) | 10 | | 19.28 (16.8)\* | | | | | | `contains()` member function of associative containers, e.g. 
[`std::map::contains()`](container/map/contains "cpp/container/map/contains") | [P0458R2](https://wg21.link/P0458R2) | 9 | 13 | 19.21\* | 13.1.6\* | | | | | DR: Guaranteed copy elision for [piecewise construction](memory/scoped_allocator_adaptor/construct "cpp/memory/scoped allocator adaptor/construct") | [P0475R1](https://wg21.link/P0475R1) | 9 | Yes | 19.29 (16.10)\* | Yes | | | | | [`std::bit_cast()`](numeric/bit_cast "cpp/numeric/bit cast") | [P0476R2](https://wg21.link/P0476R2) | 11 | 14 | 19.27\* | | | | | | [Integral power-of-2 operations](header/bit "cpp/header/bit"): [`std::bit_ceil()`](numeric/bit_ceil "cpp/numeric/bit ceil"), [`std::bit_floor()`](numeric/bit_floor "cpp/numeric/bit floor"), [`std::bit_width()`](numeric/bit_width "cpp/numeric/bit width"), [`std::has_single_bit()`](numeric/has_single_bit "cpp/numeric/has single bit"). | [P0556R3](https://wg21.link/P0556R3) [P1956R1](https://wg21.link/P1956R1) | 9 (P0556R3)10 (P1956R1) | 9 (P0556R3)12 (P1956R1) | 19.25\* (P0556R3)\*19.27\* (P1956R1)\*19.28 (16.8)\* | 11.0.3\* (P0556R3) 13.0.0\* (P1956R1). | | | | | Improving the return value of erase-like algorithms | [P0646R1](https://wg21.link/P0646R1) | 9 | 10 | 19.21\* | 12.0.0\* | | | | | [`std::destroying_delete`](memory/new/destroying_delete_t "cpp/memory/new/destroying delete t") | [P0722R3](https://wg21.link/P0722R3) | 9 | 9 | 19.27\* | 11.0.3\* | | | | | [`std::is_nothrow_convertible`](types/is_convertible "cpp/types/is convertible") | [P0758R1](https://wg21.link/P0758R1) | 9 | 9 | 19.23\* | 11.0.3\* | | | | | Add [`std::shift_left/right`](algorithm/shift "cpp/algorithm/shift") to [`<algorithm>`](header/algorithm "cpp/header/algorithm") | [P0769R2](https://wg21.link/P0769R2) | 10 | 12 | 19.21\* | 13.0.0\* | | | | | Constexpr for `[std::swap()](algorithm/swap "cpp/algorithm/swap")` and `swap` related functions | [P0879R0](https://wg21.link/P0879R0) | 10 | 13 | 19.26\* | 13.1.6\* | | | | | [`std::type_identity`](types/type_identity "cpp/types/type identity") | [P0887R1](https://wg21.link/P0887R1) | 9 | 8 | 19.21\* | 10.0.1\* | | | | | [Concepts library](concepts "cpp/concepts") | [P0898R3](https://wg21.link/P0898R3) | 10 | 13 | 19.23\* | 13.1.6\* | | | | | `constexpr` [comparison operators](container/array/operator_cmp "cpp/container/array/operator cmp") for `[std::array](container/array "cpp/container/array")` | [P1023R0](https://wg21.link/P1023R0) | 10 | 8 | 19.27\* | 10.0.1\* | | | | | [`std::unwrap_ref_decay` and `std::unwrap_reference`](utility/functional/unwrap_reference "cpp/utility/functional/unwrap reference") | [P0318R1](https://wg21.link/P0318R1) | 9 | 8 | 19.21\* | 10.0.1\* | | | | | [`std::bind_front()`](utility/functional/bind_front "cpp/utility/functional/bind front") | [P0356R5](https://wg21.link/P0356R5) | 9 | 13 | 19.25\* | 13.1.6\* | | | | | `[std::reference\_wrapper](utility/functional/reference_wrapper "cpp/utility/functional/reference wrapper")` for incomplete types | [P0357R3](https://wg21.link/P0357R3) | 9 | 8 | 19.26\* | 10.0.1\* | | | | | Fixing [`operator>>(basic_istream&, CharT*)`](io/basic_istream/operator_gtgt2 "cpp/io/basic istream/operator gtgt2") | [P0487R1](https://wg21.link/P0487R1) | 11 | 8 | 19.23\* | 10.0.1\* | | | | | Library support for [`char8_t`](language/types#char8_t "cpp/language/types") | [P0482R6](https://wg21.link/P0482R6) | 9 | 8 (partial)\* | 19.22\* | 10.0.1\* (partial). 
| | | | | [Utility functions](memory/uses_allocator_construction_args "cpp/memory/uses allocator construction args") to implement [uses-allocator](memory/make_obj_using_allocator "cpp/memory/make obj using allocator") [construction](memory/uninitialized_construct_using_allocator "cpp/memory/uninitialized construct using allocator") | [P0591R4](https://wg21.link/P0591R4) | 9 | | 19.29 (16.10)\* | | | | | | DR: `[std::variant](utility/variant "cpp/utility/variant")` and `[std::optional](utility/optional "cpp/utility/optional")` should propagate copy/move triviality | [P0602R4](https://wg21.link/P0602R4) | 8.3 | 8 | 19.11\* | 10.0.1\* | | | | | A sane `[std::variant](utility/variant "cpp/utility/variant")` converting constructor | [P0608R3](https://wg21.link/P0608R3) | 10 | 9 | 19.29 (16.10)\* | 11.0.3\* | | | | | `[std::function](utility/functional/function "cpp/utility/functional/function")`'s move constructor should be [`noexcept`](language/noexcept "cpp/language/noexcept") | [P0771R1](https://wg21.link/P0771R1) | 7.2 | 6 | 19.22\* | Yes | | | | | The [One](iterator "cpp/iterator") [Ranges](ranges "cpp/ranges") [Proposal](algorithm/ranges "cpp/algorithm/ranges") | [P0896R4](https://wg21.link/P0896R4) | 10 (partial)\* | 13 (partial)15 | 19.29 (16.10)\* | | | | | | Heterogeneous lookup for [unordered containers](container#Unordered_associative_containers "cpp/container") | [P0919R3](https://wg21.link/P0919R3) [P1690R1](https://wg21.link/P1690R1) | 11 | 12 | 19.23\* (P0919R3)19.25\* (P1690R1) | 13.0.0\* | | | | | [`<chrono>`](header/chrono "cpp/header/chrono") `zero()`, `min()`, and `max()` should be [`noexcept`](language/noexcept "cpp/language/noexcept") | [P0972R0](https://wg21.link/P0972R0) | 9 | 8 | 19.14\* | 10.0.1\* | | | | | `constexpr` in `[std::pointer\_traits](memory/pointer_traits "cpp/memory/pointer traits")` | [P1006R1](https://wg21.link/P1006R1) | 9 | 8 | 19.26\* | 10.0.1\* | | | | | [`std::assume_aligned()`](memory/assume_aligned "cpp/memory/assume aligned") | [P1007R3](https://wg21.link/P1007R3) | 9\*11 | 15 | 19.28 (16.9)\* | | | | | | Smart pointer creation with default initialization (e.g. [`make_unique_for_overwrite`](memory/unique_ptr/make_unique "cpp/memory/unique ptr/make unique")) | [P1020R1](https://wg21.link/P1020R1)[P1973R1](https://wg21.link/P1973R1) | 11 (unique\_ptr)12 (shared\_ptr) | | 19.28 (16.9)\* | | | | | | Misc `constexpr` bits | [P1032R1](https://wg21.link/P1032R1) | 10 | 13 | 19.28 (16.8)\* | 13.1.6\* | | | | | Remove comparison operators of [`std::span`](container/span "cpp/container/span") | [P1085R2](https://wg21.link/P1085R2) | 10 | 8 | 19.26\* | 10.0.1\* | | | | | Make stateful allocator propagation more consistent for [`operator+(basic_string)`](string/basic_string/operator_plus_ "cpp/string/basic string/operator+") | [P1165R1](https://wg21.link/P1165R1) | 10 | 15 | 19.26\* | | | | | | Consistent container erasure, e.g. [`std::erase(std::vector)`](container/vector/erase2 "cpp/container/vector/erase2"), or [`std::erase_if(std::map)`](container/map/erase_if "cpp/container/map/erase if") | [P1209R0](https://wg21.link/P1209R0) [P1115R3](https://wg21.link/P1115R3) | 9 (P1209R0)10 (P1115R3) | 8 (P1209R0) 11 (P1115R3) | 19.25\* (P1209R0)19.27\* (P1115R3) | 10.0.1\* (P1209R0) 12.0.5\* (P1115R3). 
| | | | | Standard library header units | [P1502R1](https://wg21.link/P1502R1) | 11 | | 19.29 (16.10)\* | | | | | | [`polymorphic_allocator<>`](memory/polymorphic_allocator "cpp/memory/polymorphic allocator") as a vocabulary type | [P0339R6](https://wg21.link/P0339R6) | 9 | | 19.28 (16.9)\* | | | | | | [`std::execution::unseq`](algorithm/execution_policy_tag "cpp/algorithm/execution policy tag") | [P1001R2](https://wg21.link/P1001R2) | 9 | | 19.28 (16.8)\* | | | | | | [`std::lerp()`](numeric/lerp "cpp/numeric/lerp") and [`std::midpoint()`](numeric/midpoint "cpp/numeric/midpoint") | [P0811R3](https://wg21.link/P0811R3) | 9 | 9 | 19.23\* (partial)19.28 (16.8)\* | 11.0.3\* | | | | | Usability enhancements for [`std::span`](container/span "cpp/container/span") | [P1024R3](https://wg21.link/P1024R3) | 10 | 9\*14 | 19.26\* | 11.0.3\* | | | | | DR: Make [`create_directory()`](filesystem/create_directory "cpp/filesystem/create directory") intuitive | [P1164R1](https://wg21.link/P1164R1) | 8.3 | 12 | 19.20\* | 13.0.0\* | | | | | [`std::ssize()`](iterator/size "cpp/iterator/size") and unsigned extent for [`std::span`](container/span "cpp/container/span") | [P1227R2](https://wg21.link/P1227R2) | 10 | 9 | 19.25\* | 11.0.3\* | | | | | Traits for ([un](types/is_unbounded_array "cpp/types/is unbounded array"))[bounded](types/is_bounded_array "cpp/types/is bounded array") arrays | [P1357R1](https://wg21.link/P1357R1) | 9 | 9 | 19.25\* | 11.0.3\* | | | | | [`std::to_array()`](container/array/to_array "cpp/container/array/to array") | [P0325R4](https://wg21.link/P0325R4) | 10 | 10 | 19.25\* | 12.0.0\* | | | | | Efficient access to `[std::basic\_stringbuf](io/basic_stringbuf "cpp/io/basic stringbuf")`’s buffer | [P0408R7](https://wg21.link/P0408R7) | 11 | | 19.29 (16.10)\* | | | | | | [Layout](types/is_layout_compatible "cpp/types/is layout compatible")-[compatibility](types/is_corresponding_member "cpp/types/is corresponding member") and [pointer](types/is_pointer_interconvertible_base_of "cpp/types/is pointer interconvertible base of")-[interconvertibility](types/is_pointer_interconvertible_with_class "cpp/types/is pointer interconvertible with class") traits | [P0466R5](https://wg21.link/P0466R5) | 12 | | 19.29 (16.10)\*\* | | | | | | [Bit operations](numeric#Bit_manipulation_.28since_C.2B.2B20.29 "cpp/numeric"): `std::` [`rotl()`](numeric/rotl "cpp/numeric/rotl"), [`rotr()`](numeric/rotr "cpp/numeric/rotr"), [`countl_zero()`](numeric/countl_zero "cpp/numeric/countl zero"), [`countl_one()`](numeric/countl_one "cpp/numeric/countl one"), [`countr_zero()`](numeric/countr_zero "cpp/numeric/countr zero"), [`countr_one()`](numeric/countr_one "cpp/numeric/countr one"), [`popcount()`](numeric/popcount "cpp/numeric/popcount"). 
| [P0553R4](https://wg21.link/P0553R4) | 9 | 9 | 19.25\*\*19.28 (16.8)\* | 11.0.3\* | | | | | [Mathematical constants](numeric/constants "cpp/numeric/constants") | [P0631R8](https://wg21.link/P0631R8) | 10 | 11 | 19.25\* | 12.0.5\* | | | | | [Text formatting](utility/format "cpp/utility/format") | [P0645R10](https://wg21.link/P0645R10) | | 14\* | 19.29 (16.10)\* | | | | | | [`std::stop_token`](thread/stop_token "cpp/thread/stop token") and [`std::jthread`](thread/jthread "cpp/thread/jthread") | [P0660R10](https://wg21.link/P0660R10) | 10 | | 19.28 (16.9)\* | | | | | | `constexpr` `[std::allocator](memory/allocator "cpp/memory/allocator")` and related utilities | [P0784R7](https://wg21.link/P0784R7) | 10 | 12 | 19.29 (16.10)\* | 13.0.0\* | | | | | `constexpr` `[std::string](string/basic_string "cpp/string/basic string")` | [P0980R1](https://wg21.link/P0980R1) | 12 | 15 | 19.29 (16.10)\*19.30\*\* | | | | | | `constexpr` `[std::vector](container/vector "cpp/container/vector")` | [P1004R2](https://wg21.link/P1004R2) | 12 | 15 | 19.29 (16.10)\*19.30\*\* | | | | | | Input [range adaptors](ranges "cpp/ranges") | [P1035R7](https://wg21.link/P1035R7) | 10 | | 19.29 (16.10)\* | | | | | | `constexpr` `[std::invoke()](utility/functional/invoke "cpp/utility/functional/invoke")` and related utilities | [P1065R2](https://wg21.link/P1065R2) | 10 | 12 | 19.28 (16.8)\* | 13.0.0\* | | | | | Atomic waiting and notifying, [`std::counting_semaphore`](thread/counting_semaphore "cpp/thread/counting semaphore"), [`std::latch`](thread/latch "cpp/thread/latch") and [`std::barrier`](thread/barrier "cpp/thread/barrier") | [P1135R6](https://wg21.link/P1135R6) | 11 | 11 | 19.28 (16.8)\* | 13.1.6\* | | | | | [`std::source_location`](utility/source_location "cpp/utility/source location") | [P1208R6](https://wg21.link/P1208R6) | 11 | 15 (partial)\* | 19.29 (16.10)\* | | | | | | Adding [`<=>`](language/operator_comparison#Three-way_comparison "cpp/language/operator comparison") to the standard library | [P1614R2](https://wg21.link/P1614R2) | 10 | 14 (partial)\* | 19.29 (16.10)\* | 13.1.6\* (partial). | | | | | `constexpr` default constructor of `[std::atomic](atomic/atomic "cpp/atomic/atomic")` and `[std::atomic\_flag](atomic/atomic_flag "cpp/atomic/atomic flag")` | [P0883R2](https://wg21.link/P0883R2) | 10 | 13 | 19.26\* | 13.1.6\* | | | | | `constexpr` for [numeric algorithms](numeric#Numeric_operations "cpp/numeric") | [P1645R1](https://wg21.link/P1645R1) | 10 | 12 | 19.26\* | 13.0.0\* | | | | | [Safe integral comparisons](utility#Integer_comparison_functions "cpp/utility") | [P0586R2](https://wg21.link/P0586R2) | 10 | 13 | 19.27\* | 13.1.6\* | | | | | C++20 feature | Paper(s) | GCC libstdc++ | Clang libc++ | MSVC STL | Apple Clang | Sun/Oracle C++Standard Library | Embarcadero C++ BuilderStandard Library | Cray C++Standard Library | *\** - hover over a cell with the version number to see notes. ### External links * [Working C++20 examples](https://github.com/makelinux/examples/blob/HEAD/cpp/20.cpp)
cpp std::ranges::size std::ranges::size ================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr auto size = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr auto size( T&& t ); ``` | | (since C++20) | Calculates the number of elements in `t` in constant time. Let `t` be an object of type `T`. A call to `ranges::size` is expression-equivalent to: 1. `[std::extent\_v](http://en.cppreference.com/w/cpp/types/extent)<T>`, if `T` is an array type with a known bound. 2. Otherwise, `t.size()` converted to its [decayed type](../types/decay "cpp/types/decay"), if `[ranges::disable\_sized\_range](http://en.cppreference.com/w/cpp/ranges/sized_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `false`, and the converted expression is valid and has an [integer-like](../iterator/weakly_incrementable#Integer-like_types "cpp/iterator/weakly incrementable") type. 3. Otherwise, `size(t)` converted to its [decayed type](../types/decay "cpp/types/decay"), if `[ranges::disable\_sized\_range](http://en.cppreference.com/w/cpp/ranges/sized_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `false`, and the converted expression is valid and has an [integer-like](../iterator/weakly_incrementable#Integer-like_types "cpp/iterator/weakly incrementable") type, where the [overload resolution](../language/overload_resolution "cpp/language/overload resolution") is performed with the following candidates: * `void size(auto&) = delete;` * `void size(const auto&) = delete;` 4. Otherwise, `/\*to-unsigned-like\*/([ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(t) - [ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(t))`, if `T` models `[ranges::forward\_range](http://en.cppreference.com/w/cpp/ranges/forward_range)` and `[ranges::sentinel\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<T>` models `[std::sized\_sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sized_sentinel_for)<[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<T>>`, where `/*to-unsigned-like*/` denotes an explicit conversion to an [unsigned-integer-like](../iterator/weakly_incrementable#Integer-like_types "cpp/iterator/weakly incrementable") type. In all other cases, a call to `ranges::size` is ill-formed, which can result in [substitution failure](../language/sfinae "cpp/language/sfinae") when `ranges::size(t)` appears in the immediate context of a template instantiation. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). 
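The dispatch rules above can be made concrete with a small sketch. The type names `MemberSized` and `AdlSized` below are hypothetical, chosen only to illustrate cases (1)–(3); this is a minimal example assuming a C++20 standard library, not part of the reference text:

```
#include <cstddef>
#include <iostream>
#include <ranges>

// Case (2): a hypothetical type with a member size().
struct MemberSized
{
    std::size_t size() const { return 3; }
};

// Case (3): a hypothetical type whose size() is a non-member found by ADL.
struct AdlSized {};
std::size_t size(const AdlSized&) { return 5; }

int main()
{
    int arr[4] = {};                                        // case (1): array with known bound
    std::cout << std::ranges::size(arr) << '\n';            // 4, from std::extent_v

    std::cout << std::ranges::size(MemberSized{}) << '\n';  // 3, from t.size()
    std::cout << std::ranges::size(AdlSized{}) << '\n';     // 5, from size(t) found by ADL
}
```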
### Customization point objects The name `ranges::size` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_size\_fn*`. All instances of `*\_\_size\_fn*` are equal. The effects of invoking different instances of type `*\_\_size\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::size` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::size` above, `*\_\_size\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__size_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __size_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__size_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __size_fn&, Args...>`. Otherwise, no function call operator of `*\_\_size\_fn*` participates in overload resolution. ### Notes Whenever `ranges::size(e)` is valid for an expression `e`, the return type is integer-like. The C++20 standard requires that if the underlying `size` function call returns a prvalue, the return value is move-constructed from the materialized temporary object. All implementations directly return the prvalue instead. The requirement is corrected by the post-C++20 proposal [P0849R8](https://wg21.link/P0849R8) to match the implementations. 
### Example ``` #include <iostream> #include <ranges> #include <type_traits> #include <vector> int main() { auto v = std::vector<int>{}; std::cout << "ranges::size(v) == " << std::ranges::size(v) << '\n'; auto il = {7}; std::cout << "ranges::size(il) == " << std::ranges::size(il) << '\n'; int array[] = {4, 5}; // array has a known bound std::cout << "ranges::size(array) == " << std::ranges::size(array) << '\n'; std::cout << std::boolalpha << "is_signed: " << std::is_signed_v<decltype(std::ranges::size(v))> << '\n'; } ``` Output: ``` ranges::size(v) == 0 ranges::size(il) == 1 ranges::size(array) == 2 is_signed: false ``` ### See also | | | | --- | --- | | [ranges::ssize](ssize "cpp/ranges/ssize") (C++20) | returns a signed integer equal to the size of a range (customization point object) | | [ranges::sized\_range](sized_range "cpp/ranges/sized range") (C++20) | specifies that a range knows its size in constant time (concept) | | [sizessize](../iterator/size "cpp/iterator/size") (C++17)(C++20) | returns the size of a container or array (function template) | cpp std::ranges::crbegin std::ranges::crbegin ==================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ crbegin = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::input_or_output_iterator auto crbegin( T&& t ); ``` | | (since C++20) | Returns an iterator to the first element of the const-qualified argument that is treated as a reversed sequence. ![range-rbegin-rend.svg]() Let `CT` be. 1. `const [std::remove\_reference\_t](http://en.cppreference.com/w/cpp/types/remove_reference)<T>&` if the argument is a lvalue (i.e. `T` is an lvalue reference type), 2. `const T` otherwise, a call to `ranges::crbegin` is expression-equivalent to `[ranges::rbegin](http://en.cppreference.com/w/cpp/ranges/rbegin)(static\_cast<CT&&>(t))`. The return type models `[std::input\_or\_output\_iterator](../iterator/input_or_output_iterator "cpp/iterator/input or output iterator")` in both cases. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::crbegin` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_crbegin\_fn*`. All instances of `*\_\_crbegin\_fn*` are equal. The effects of invoking different instances of type `*\_\_crbegin\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::crbegin` can be copied freely and its copies can be used interchangeably. 
Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::crbegin` above, `*\_\_crbegin\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__crbegin_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __crbegin_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__crbegin_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __crbegin_fn&, Args...>`. Otherwise, no function call operator of `*\_\_crbegin\_fn*` participates in overload resolution. ### Notes If the argument is an rvalue (i.e. `T` is an object type) and `[ranges::enable\_borrowed\_range](http://en.cppreference.com/w/cpp/ranges/borrowed_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `false`, the call to `ranges::crbegin` is ill-formed, which also results in substitution failure. The return type models `std::input_or_output_iterator` in all cases. ### Example ``` #include <iostream> #include <vector> #include <iterator> #include <span> int main() { std::vector<int> v = { 3, 1, 4 }; auto vi = std::ranges::crbegin(v); std::cout << *vi << '\n'; int a[] = { -5, 10, 15 }; auto ai = std::ranges::crbegin(a); std::cout << *ai << '\n'; // auto x_x = std::ranges::crbegin(std::vector<int>{6,6,6}); // ill-formed: the argument is an rvalue (see Notes ↑) auto si = std::ranges::crbegin(std::span{a}); // OK: static_assert(std::ranges::enable_borrowed_range< std::remove_cv_t<decltype(std::span{a})>>); std::cout << *si << '\n'; } ``` Output: ``` 4 15 15 ``` ### See also | | | | --- | --- | | [ranges::rbegin](rbegin "cpp/ranges/rbegin") (C++20) | returns a reverse iterator to a range (customization point object) | | [rbegincrbegin](../iterator/rbegin "cpp/iterator/rbegin") (C++14) | returns a reverse iterator to the beginning of a container or array (function template) | cpp std::ranges::views::take_while, std::ranges::take_while_view std::ranges::views::take\_while, std::ranges::take\_while\_view =============================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::view V, class Pred > requires ranges::input_range<V> && std::is_object_v<Pred> && std::indirect_unary_predicate<const Pred, ranges::iterator_t<V>> class take_while_view : public ranges::view_interface<take_while_view<V, Pred>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /*unspecified*/ take_while = /*unspecified*/; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< ranges::viewable_range R, class Pred > requires /* see below */ constexpr ranges::view auto take_while( R&& r, Pred&& pred ); ``` | | (since C++20) | | ``` template< class Pred > constexpr /*range adaptor closure*/ take_while( Pred&& pred ); ``` | | (since C++20) | 1) A range adaptor that represents [`view`](view "cpp/ranges/view") of the elements from an underlying sequence, starting at the beginning and ending at the first element for which the predicate returns false. 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::take_while(e, f)` is *expression-equivalent* to `take_while_view(e, f)` for any suitable subexpressions `e` and `f`. 
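A minimal sketch of this equivalence, using a plain `std::vector` as an assumed underlying range:

```
#include <iostream>
#include <ranges>
#include <vector>

int main()
{
    std::vector<int> v{1, 2, 3, 4, 5};
    auto pred = [](int x) { return x < 4; };

    // The adaptor-object form and the explicit class-template form
    // produce equivalent views:
    auto a = std::views::take_while(v, pred);
    auto b = std::ranges::take_while_view(v, pred);

    // Both keep the random-access traversal of the underlying vector.
    static_assert(std::ranges::random_access_range<decltype(a)>);
    static_assert(std::ranges::random_access_range<decltype(b)>);

    for (int x : a) std::cout << x << ' '; // 1 2 3
    for (int x : b) std::cout << x << ' '; // 1 2 3
    std::cout << '\n';
}
```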
`take_while_view` models the concepts [`contiguous_range`](contiguous_range "cpp/ranges/contiguous range"), [`random_access_range`](random_access_range "cpp/ranges/random access range"), [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), [`forward_range`](forward_range "cpp/ranges/forward range"), and [`input_range`](input_range "cpp/ranges/input range") when the underlying view `V` models respective concepts. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Member functions | | | | --- | --- | | [(constructor)](take_while_view/take_while_view "cpp/ranges/take while view/take while view") (C++20) | constructs a `take_while_view` (public member function) | | [base](take_while_view/base "cpp/ranges/take while view/base") (C++20) | returns a copy of the underlying (adapted) view (public member function) | | [pred](take_while_view/pred "cpp/ranges/take while view/pred") (C++20) | returns a reference to the stored predicate (public member function) | | [begin](take_while_view/begin "cpp/ranges/take while view/begin") (C++20) | returns an iterator to the beginning (public member function) | | [end](take_while_view/end "cpp/ranges/take while view/end") (C++20) | returns a sentinel representing the end (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [data](view_interface/data "cpp/ranges/view interface/data") (C++20) | Gets the address of derived view's data. Provided if its iterator type satisfies [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator"). (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). 
(public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](take_while_view/deduction_guides "cpp/ranges/take while view/deduction guides") ### Nested classes | | | | --- | --- | | [*sentinel*](take_while_view/sentinel "cpp/ranges/take while view/sentinel") | the sentinel type (exposition-only member class template) | ### Example ``` #include <ranges> #include <iostream> int main() { for (int year : std::views::iota(2017) | std::views::take_while([](int y) { return y <= 2020; })) { std::cout << year << ' '; } std::cout << '\n'; const char idea[] {"Today is yesterday's tomorrow!.."}; for (char x : std::ranges::take_while_view(idea, [](char c) { return c != '.'; })) { std::cout << x; } std::cout << '\n'; } ``` Output: ``` 2017 2018 2019 2020 Today is yesterday's tomorrow! ``` ### See also | | | | --- | --- | | [ranges::take\_viewviews::take](take_view "cpp/ranges/take view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the first N elements of another [`view`](view "cpp/ranges/view") (class template) (range adaptor object) | | [ranges::drop\_while\_viewviews::drop\_while](drop_while_view "cpp/ranges/drop while view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the elements of another [`view`](view "cpp/ranges/view"), skipping the initial subsequence of elements until the first element where the predicate returns false (class template) (range adaptor object) | cpp std::ranges::views::istream, std::ranges::basic_istream_view, std::ranges::istream_view, std::ranges::wistream_view std::ranges::views::istream, std::ranges::basic\_istream\_view, std::ranges::istream\_view, std::ranges::wistream\_view ======================================================================================================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< std::movable Val, class CharT, class Traits = std::char_traits<CharT> > requires std::default_initializable<Val> && /*stream-extractable*/<Val,CharT,Traits> class basic_istream_view : public ranges::view_interface<basic_istream_view<Val,CharT,Traits>> ``` | (1) | (since C++20) | | Helper templates | | | | ``` template< class Val > using istream_view = ranges::basic_istream_view<Val, char>; ``` | (2) | (since C++20) | | ``` template< class Val > using wistream_view = ranges::basic_istream_view<Val, wchar_t>; ``` | (3) | (since C++20) | | Customization point objects | | | | ``` namespace views { template< class T > inline constexpr /*unspecified*/ istream = /*unspecified*/; } ``` | (4) | (since C++20) | | Helper concepts | | | | ``` template< class Val, class CharT, class Traits > concept /*stream-extractable*/ = requires(std::basic_istream<CharT,Traits>& is, Val& t) { is >> t; }; ``` | (5) | (since C++20) | 1) A range factory that generates a sequence of elements by repeatedly calling `operator>>`. 2-3) Convenience alias templates for character types `char` and `wchar_t`. 4) `views::istream<T>(e)` is *expression-equivalent* to (has the same effect as) `ranges::basic_istream_view<T, typename U::char_type, typename U::traits_type>(e)` for any suitable subexpressions `e`, where `U` is `[std::remove\_reference\_t](http://en.cppreference.com/w/cpp/types/remove_reference)<decltype(e)>`. 
The program is ill-formed if `U` is not both publicly and unambiguously derived from `[std::basic\_istream](http://en.cppreference.com/w/cpp/io/basic_istream)<typename U::char\_type, typename U::traits\_type>`, which may result in a [substitution failure](../language/sfinae "cpp/language/sfinae"). 5) The exposition-only concept `/*stream-extractable*/<Val,CharT,Traits>` is satisfied when lvalue of `Val` can be extracted from lvalue of `[std::basic\_istream](http://en.cppreference.com/w/cpp/io/basic_istream)<CharT,Traits>`. The iterator type of `basic_istream_view` is move-only: it does not meet the [LegacyIterator](../named_req/iterator "cpp/named req/Iterator") requirements, and thus does not work with pre-C++20 [algorithms](../algorithm "cpp/algorithm"). ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `views::istream<T>` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_istream\_fn*<T>`. All instances of `*\_\_istream\_fn*<T>` are equal. The effects of invoking different instances of type `*\_\_istream\_fn*<T>` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `views::istream<T>` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `views::istream<T>` above, `*\_\_istream\_fn*<T>` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__istream_fn<T>, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __istream_fn<T>, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__istream_fn<T>&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __istream_fn<T>&, Args...>`. Otherwise, no function call operator of `*\_\_istream\_fn*<T>` participates in overload resolution. ### Member functions | | | | --- | --- | | **(constructor)** (C++20) | constructs a `basic_istream_view` (public member function) | | **begin** (C++20) | returns an iterator (public member function) | | **end** (C++20) | returns `[std::default\_sentinel](../iterator/default_sentinel "cpp/iterator/default sentinel")` (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | (none) | Although `basic_istream_view` is derived from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")`, it cannot use any of inherited member functions. 
| std::ranges::basic\_istream\_view::basic\_istream\_view -------------------------------------------------------- | | | | | --- | --- | --- | | ``` constexpr explicit basic_istream_view( std::basic_istream<CharT, Traits>& stream ); ``` | | (since C++20) | Initializes the stored pointer to stream with `[std::addressof](http://en.cppreference.com/w/cpp/memory/addressof)(stream)`, and value-initializes the stored value of `Val`. std::ranges::basic\_istream\_view::begin ----------------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto begin(); ``` | | (since C++20) | Equivalent to `*stream_ >> value_; return /*iterator*/{*this};`, where `stream_` is the stored pointer to stream and `value_` is the stored value of `Val`. std::ranges::basic\_istream\_view::end --------------------------------------- | | | | | --- | --- | --- | | ``` constexpr std::default_sentinel_t end() const noexcept; ``` | | (since C++20) | Equivalent to `return [std::default\_sentinel](http://en.cppreference.com/w/cpp/iterator/default_sentinel);`. ### Nested classes | | | | --- | --- | | [*iterator*](basic_istream_view/iterator "cpp/ranges/basic istream view/iterator") (C++20) | the iterator type of `basic_istream_view`, the name is exposition-only (exposition-only member class) | ### Example ``` #include <algorithm> #include <iomanip> #include <iostream> #include <iterator> #include <ranges> #include <sstream> #include <string> int main() { auto words = std::istringstream{"today is yesterday’s tomorrow"}; for (const auto& s: std::ranges::istream_view<std::string>(words)) { std::cout << std::quoted(s, '/') << ' '; } std::cout << '\n'; auto floats = std::istringstream{"1.1 2.2\t3.3\v4.4\f55\n66\r7.7 8.8"}; std::ranges::copy( std::ranges::istream_view<float>(floats), std::ostream_iterator<float>{std::cout, ", "}); std::cout << '\n'; } ``` Output: ``` /today/ /is/ /yesterday’s/ /tomorrow/ 1.1, 2.2, 3.3, 4.4, 55, 66, 7.7, 8.8, ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2325R3](https://wg21.link/P2325R3) | C++20 | default constructor was provided as [`view`](view "cpp/ranges/view")must be [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable") | removed along with the requirement | | [LWG 3568](https://cplusplus.github.io/LWG/issue3568) | C++20 | P2325R3 accidentally made the stored value default-initialized | restored to value-initialization | | [P2432R1](https://wg21.link/P2432R1) | C++20 | `ranges::istream_view` was a function template anddid not follow the naming convention | made an alias template;customization point objects added | ### See also | | | | --- | --- | | [istream\_iterator](../iterator/istream_iterator "cpp/iterator/istream iterator") | input iterator that reads from `[std::basic\_istream](../io/basic_istream "cpp/io/basic istream")` (class template) |
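As an additional illustration of the *stream-extractable* requirement in (5), the following sketch uses a hypothetical `Point` type (introduced only for this example) whose `operator>>` makes it readable through `views::istream`:

```
#include <iostream>
#include <ranges>
#include <sstream>

// Hypothetical type, defined only for this sketch.
struct Point
{
    double x{}, y{};
    friend std::istream& operator>>(std::istream& is, Point& p)
    {
        return is >> p.x >> p.y;
    }
};

int main()
{
    auto input = std::istringstream{"0 0  1 2  3.5 4"};
    // Point is default-initializable and stream-extractable,
    // so views::istream<Point> can read it element by element.
    for (const Point& p : std::views::istream<Point>(input))
        std::cout << '(' << p.x << ", " << p.y << ") ";
    std::cout << '\n';
}
```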
cpp std::ranges::view_interface std::ranges::view\_interface ============================ | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class D > requires std::is_class_v<D> && std::same_as<D, std::remove_cv_t<D>> class view_interface; ``` | | (since C++20) | `std::ranges::view_interface` is a helper class template for defining a view interface. `view_interface` is typically used with [CRTP](../language/crtp "cpp/language/crtp"): ``` class my_view : public std::ranges::view_interface<my_view> { public: auto begin() const { /*...*/ } auto end() const { /*...*/ } // empty() is provided if begin() returns a forward iterator // and end() returns a sentinel for it. }; ``` ### Member functions | | | | --- | --- | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function) | | [data](view_interface/data "cpp/ranges/view interface/data") (C++20) | Gets the address of derived view's data. Provided if its iterator type satisfies [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator"). (public member function) | | [size](view_interface/size "cpp/ranges/view interface/size") (C++20) | Returns the number of elements in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range") and its sentinel and iterator type satisfy [`sized_sentinel_for`](../iterator/sized_sentinel_for "cpp/iterator/sized sentinel for"). (public member function) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function) | ### Example ``` #include <ranges> #include <vector> #include <iostream> template <class T, class A> class VectorView : public std::ranges::view_interface<VectorView<T, A>> { public: VectorView() = default; VectorView(const std::vector<T, A>& vec) : m_begin(vec.cbegin()), m_end(vec.cend()) {} auto begin() const { return m_begin; } auto end() const { return m_end; } private: typename std::vector<T, A>::const_iterator m_begin{}, m_end{}; }; int main() { std::vector<int> v = {1, 4, 9, 16}; VectorView view_over_v{v}; // We can iterate with begin() and end(). for (int n : view_over_v) { std::cout << n << ' '; } std::cout << '\n'; // We get operator[] for free when inheriting from view_interface // since we satisfy the random_access_range concept. 
for (std::ptrdiff_t i = 0; i < view_over_v.size(); i++) { std::cout << "v[" << i << "] = " << view_over_v[i] << '\n'; } } ``` Output: ``` 1 4 9 16 v[0] = 1 v[1] = 4 v[2] = 9 v[3] = 16 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3549](https://cplusplus.github.io/LWG/issue3549) | C++20 | `view_interface` was required to be derived from `view_base`,which sometimes required multiple `view_base` subobjects in a view | inheritance removed | ### See also | | | | --- | --- | | [ranges::subrange](subrange "cpp/ranges/subrange") (C++20) | combines an iterator-sentinel pair into a [`view`](view "cpp/ranges/view") (class template) | cpp std::ranges::random_access_range std::ranges::random\_access\_range ================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class T > concept random_access_range = ranges::bidirectional_range<T> && std::random_access_iterator<ranges::iterator_t<T>>; ``` | | (since C++20) | The `random_access_range` concept is a refinement of [`range`](range "cpp/ranges/range") for which `ranges::begin` returns a model of [`random_access_iterator`](../iterator/random_access_iterator "cpp/iterator/random access iterator"). ### Example ``` #include <ranges> #include <vector> #include <array> #include <deque> #include <valarray> #include <list> #include <set> template<typename T> concept RAR = std::ranges::random_access_range<T>; int main() { int a[4]; static_assert( RAR<std::vector<int>> and RAR<std::vector<bool>> and RAR<std::deque<int>> and RAR<std::valarray<int>> and RAR<decltype(a)> and not RAR<std::list<int>> and not RAR<std::set<int>> and RAR<std::array<std::list<int>,42>> ); } ``` ### See also | | | | --- | --- | | [ranges::sized\_range](sized_range "cpp/ranges/sized range") (C++20) | specifies that a range knows its size in constant time (concept) | | [ranges::contiguous\_range](contiguous_range "cpp/ranges/contiguous range") (C++20) | specifies a range whose iterator type satisfies [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator") (concept) | cpp std::ranges::views::as_rvalue, std::ranges::as_rvalue_view std::ranges::views::as\_rvalue, std::ranges::as\_rvalue\_view ============================================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::view V > requires ranges::input_range<V> class as_rvalue_view : public ranges::view_interface<as_rvalue_view<V>> ``` | (1) | (since C++23) | | ``` namespace views { inline constexpr /* unspecified */ as_rvalue = /* unspecified */; } ``` | (2) | (since C++23) | | Call signature | | | | ``` template< ranges::viewable_range R > requires /* see below */ constexpr ranges::view auto as_rvalue( R&& r ); ``` | | (since C++23) | 1) A range adaptor that represents a view of underlying [`view`](view "cpp/ranges/view") whose elements are rvalues. 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). Let `e` be a subexpression and let `T` be `decltype((e))`. 
Then the expression `views::as_rvalue(e)` is *expression-equivalent* to: * `[views::all](http://en.cppreference.com/w/cpp/ranges/all_view)(e)`, if it is a well-formed expression and `[std::same\_as](http://en.cppreference.com/w/cpp/concepts/same_as)<[ranges::range\_rvalue\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<T>, [ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<T>>` is `true`; * `as_rvalue_view{e}` otherwise. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Data members Typical implementations of `as_rvalue_view` only hold one non-static data member: * the underlying view of type `V` (shown here as `*base\_*`, the name is exposition only). ### Member functions | | | | --- | --- | | (constructor) (C++23) | constructs an `as_rvalue_view` (public member function) | | base (C++23) | returns the underlying view `V` (public member function) | | begin (C++23) | returns the beginning iterator of the `as_rvalue_view` (public member function) | | end (C++23) | returns the end iterator of the `as_rvalue_view` (public member function) | | size (C++23) | returns the size of the view if it is bounded (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). 
(public member function of `std::ranges::view_interface<D>`) | std::ranges::as\_rvalue\_view::as\_rvalue\_view ------------------------------------------------ | | | | | --- | --- | --- | | ``` as_rvalue_view() requires std::default_initializable<V> = default; ``` | (1) | (since C++23) | | ``` constexpr explicit as_rvalue_view(V base); ``` | (2) | (since C++23) | 1) Value-initializes `*base\_*` via its default member initializer (`= V()`). 2) Initializes `*base\_*` with `std::move(base)`. ### Parameters | | | | | --- | --- | --- | | base | - | a view | std::ranges::as\_rvalue\_view::base ------------------------------------ | | | | | --- | --- | --- | | ``` constexpr V base() const& requires std::copy_constructible<V>; ``` | (1) | (since C++23) | | ``` constexpr V base() &&; ``` | (2) | (since C++23) | Returns the underlying view. 1) Copy-constructs the result from the underlying view. Equivalent to `return base_;`. 2) Move-constructs the result from the underlying view. Equivalent to `return std::move(base_);`. std::ranges::as\_rvalue\_view::begin ------------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto begin() requires (!/*simple-view*/<V>); ``` | (1) | (since C++23) | | ``` constexpr auto begin() const requires ranges::range<const V>; ``` | (2) | (since C++23) | 1-2) Returns `[std::move\_iterator](http://en.cppreference.com/w/cpp/iterator/move_iterator)([ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(base_))`. std::ranges::as\_rvalue\_view::end ----------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto end() requires (!/*simple-view*/<V>); ``` | (1) | (since C++23) | | ``` constexpr auto end() const requires ranges::range<const V>; ``` | (2) | (since C++23) | 1) Returns `[std::move\_iterator](http://en.cppreference.com/w/cpp/iterator/move_iterator)([ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(base_))` if `V` models [`common_range`](common_range "cpp/ranges/common range"), otherwise `[std::move\_sentinel](http://en.cppreference.com/w/cpp/iterator/move_sentinel)([ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(base_))`. 2) Returns `[std::move\_iterator](http://en.cppreference.com/w/cpp/iterator/move_iterator)([ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(base_))` if `const V` models [`common_range`](common_range "cpp/ranges/common range"), otherwise `[std::move\_sentinel](http://en.cppreference.com/w/cpp/iterator/move_sentinel)([ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(base_))`. std::ranges::as\_rvalue\_view::size ------------------------------------ | | | | | --- | --- | --- | | ``` constexpr auto size() requires ranges::sized_range<V> { return ranges::size(base_); } ``` | (1) | (since C++23) | | ``` constexpr auto size() const requires ranges::sized_range<const V> { return ranges::size(base_); } ``` | (2) | (since C++23) | Returns the size of the view if the view is bounded. 
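As a usage sketch (assuming `std::string` elements, not taken from the library documentation): dereferencing the adapted view yields rvalue references, so elements can be moved rather than copied out of the underlying range:

```
#include <iostream>
#include <ranges>
#include <string>
#include <vector>

int main()
{
    std::vector<std::string> words{"the", "quick", "brown", "fox"};
    std::vector<std::string> moved_to;

    // Dereferencing the adapted view yields std::string&&,
    // so each element is moved, not copied, into moved_to.
    for (std::string&& s : words | std::views::as_rvalue)
        moved_to.push_back(std::move(s));

    std::cout << "moved " << moved_to.size() << " strings\n";
    // The source strings are left in a valid but unspecified (typically empty) state.
}
```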
### Deduction guides | | | | | --- | --- | --- | | ``` template<class R> as_rvalue_view(R&&) -> as_rvalue_view<views::all_t<R>>; ``` | | (since C++23) | ### Helper templates | | | | | --- | --- | --- | | ``` template<class T> inline constexpr bool enable_borrowed_range<std::ranges::as_rvalue_view<T>> = std::ranges::enable_borrowed_range<T>; ``` | | (since C++23) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `as_rvalue_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") when the underlying view satisfies it. ### Example ### See also | | | | --- | --- | | [move\_iterator](../iterator/move_iterator "cpp/iterator/move iterator") (C++11) | iterator adaptor which dereferences to an rvalue reference (class template) | | [move\_sentinel](../iterator/move_sentinel "cpp/iterator/move sentinel") (C++20) | sentinel adaptor for use with `[std::move\_iterator](../iterator/move_iterator "cpp/iterator/move iterator")` (class template) | cpp std::ranges::views::split, std::ranges::split_view std::ranges::views::split, std::ranges::split\_view =================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::forward_range V, ranges::forward_range Pattern > requires ranges::view<V> && ranges::view<Pattern> && std::indirectly_comparable<ranges::iterator_t<V>, ranges::iterator_t<Pattern>, ranges::equal_to> class split_view : public ranges::view_interface<split_view<V, Pattern>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /*unspecified*/ split = /*unspecified*/; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< ranges::viewable_range R, class Pattern > requires /* see below */ constexpr ranges::view auto split( R&& r, Pattern&& pattern ); ``` | | (since C++20) | | ``` template< class Pattern > constexpr /*range adaptor closure*/ split( Pattern&& pattern ); ``` | | (since C++20) | 1) `split_view` takes a [`view`](view "cpp/ranges/view") and a delimiter, and splits the [`view`](view "cpp/ranges/view") into subranges on the delimiter. 2) [Ranges adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::split(e, p)` is *expression-equivalent* to `split_view(e, p)` for any suitable subexpressions `e` and `p`. `split_view` models the concepts [`forward_range`](forward_range "cpp/ranges/forward range"), and [`common_range`](common_range "cpp/ranges/common range") when the underlying [`view`](view "cpp/ranges/view") `V` models respective concepts. 
The inner range (`[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<split_view>`) is a `[ranges::subrange](http://en.cppreference.com/w/cpp/ranges/subrange)<[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<V>>`, which models [`common_range`](common_range "cpp/ranges/common range"), models [`sized_range`](sized_range "cpp/ranges/sized range") when `[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<V>` models `[std::sized\_sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sized_sentinel_for)<[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<V>>`, and models [`contiguous_range`](contiguous_range "cpp/ranges/contiguous range"), [`random_access_range`](random_access_range "cpp/ranges/random access range"), [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), and [`forward_range`](forward_range "cpp/ranges/forward range") when `V` models respective concepts. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Data members Typical implementations of `split_view` hold three non-static data members: * the underlying [`view`](view "cpp/ranges/view") of type `V` (shown here as `*base\_*` for exposition only), and * the pattern (shown here as `*pattern\_*` for exposition only) that is used as a delimiter to split the underlying [`view`](view "cpp/ranges/view"). * an object equivalent to `[std::optional](http://en.cppreference.com/w/cpp/utility/optional)<[ranges::subrange](http://en.cppreference.com/w/cpp/ranges/subrange)<[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<V>>>` (shown here as `*cached\_begin\_*` for exposition only) that caches the result of a first call to [`begin()`](split_view/begin "cpp/ranges/split view/begin"). ### Member functions | | | | --- | --- | | [(constructor)](split_view/split_view "cpp/ranges/split view/split view") (C++20) | constructs a `split_view` (public member function) | | [base](split_view/base "cpp/ranges/split view/base") (C++20) | returns a copy of the underlying (adapted) view (public member function) | | [begin](split_view/begin "cpp/ranges/split view/begin") (C++20) | returns an iterator to the beginning (public member function) | | [end](split_view/end "cpp/ranges/split view/end") (C++20) | returns an iterator or a sentinel to the end (public member function) | | [*find\_next*](split_view/find_next "cpp/ranges/split view/find next") (C++20) | searches for the next occurrence of the pattern (exposition-only member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). 
(public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | ### Nested classes | | | | --- | --- | | [*iterator*](split_view/iterator "cpp/ranges/split view/iterator") (C++20) | the iterator type (exposition-only member class) | | [*sentinel*](split_view/sentinel "cpp/ranges/split view/sentinel") (C++20) | the sentinel type (exposition-only member class) | ### [Deduction guides](split_view/deduction_guides "cpp/ranges/split view/deduction guides") ### Notes Before [P2210R2](https://wg21.link/P2210R2), `split_view` used a *lazy* mechanism for splitting, and thus could not keep the bidirectional, random access, or contiguous properties of the underlying view, or make the iterator type of the inner range same as that of the underlying view. Consequently, it is redesigned by [P2210R2](https://wg21.link/P2210R2), and the lazy mechanism is moved to `lazy_split_view`. ### Example A link to check the example: [wandbox](https://wandbox.org/permlink/pbMnbXteVLwQ5wnb). ``` #include <iostream> #include <iomanip> #include <ranges> #include <string_view> int main() { constexpr std::string_view words{"Hello-_-C++-_-20-_-!"}; constexpr std::string_view delim{"-_-"}; for (const auto word : std::views::split(words, delim)) { std::cout << std::quoted(std::string_view(word.begin(), word.end())) << ' '; } } ``` Output: ``` "Hello" "C++" "20" "!" ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2210R2](https://wg21.link/P2210R2) | C++20 | the old `split_view` was too lazy to be easily used | it is redesigned | ### See also | | | | --- | --- | | [ranges::lazy\_split\_viewviews::lazy\_split](lazy_split_view "cpp/ranges/lazy split view") (C++20) | a [`view`](view "cpp/ranges/view") over the subranges obtained from splitting another [`view`](view "cpp/ranges/view") using a delimiter (class template) (range adaptor object) | | [ranges::join\_viewviews::join](join_view "cpp/ranges/join view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the sequence obtained from flattening a [`view`](view "cpp/ranges/view") of [`range`s](range "cpp/ranges/range") (class template) (range adaptor object) |
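A further sketch of the inner-range properties described above, splitting a contiguous `std::string_view` on a single-element delimiter (a simplified variation, not the article's own example):

```
#include <iostream>
#include <ranges>
#include <string_view>

int main()
{
    constexpr std::string_view sentence{"split views keep contiguity"};

    for (auto word : std::views::split(sentence, ' '))
    {
        // Each inner range is a ranges::subrange over string_view's iterators,
        // so it is itself contiguous and sized.
        static_assert(std::ranges::contiguous_range<decltype(word)>);
        static_assert(std::ranges::sized_range<decltype(word)>);
        std::cout << std::string_view(word.data(), word.size()) << '|';
    }
    std::cout << '\n';
}
```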
cpp std::ranges::views::zip_transform, std::ranges::zip_transform_view std::ranges::views::zip\_transform, std::ranges::zip\_transform\_view ===================================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< std::copy_constructible F, ranges::input_range... Views > requires (ranges::view<Views> && ...) && (sizeof...(Views) > 0) && std::is_object_v<F> && std::regular_invocable< F&, ranges::range_reference_t<Views>...> && /*can-reference*/<std::invoke_result_t< F&, ranges::range_reference_t<Views>...>> class zip_transform_view : public ranges::view_interface<zip_transform_view<F, Views...>> ``` | (1) | (since C++23) | | ``` namespace views { inline constexpr /*unspecified*/ zip_transform = /*unspecified*/; } ``` | (2) | (since C++23) | | Call signature | | | | ``` template< class F, ranges::viewable_range... Rs > requires /* see below */ constexpr auto zip_transform( F&& f, Rs&&... rs ); ``` | | (since C++23) | 1) `zip_transform_view` is a range adaptor that takes an invocable object and one or more [`view`s](view "cpp/ranges/view"), and produces a [`view`](view "cpp/ranges/view") whose `*i*`th element is the result of applying the invocable object to the `*i*`th elements of all views. A type `T` models the exposition-only concept `*can-reference*` if and only if `T&` is a valid type. 2) `views::zip_transform` is a customization point object. When calling with one argument `f`, let `FD` be `[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<decltype(f)>`, if: * `FD` models [`copy_constructible`](../concepts/copy_constructible "cpp/concepts/copy constructible"), * `FD&` models [`regular_invocable`](../concepts/invocable "cpp/concepts/invocable"), and * `[std::invoke\_result\_t](http://en.cppreference.com/w/cpp/types/result_of)<FD&>` is an object type, then `views::zip_transform(f)` is *expression-equivalent* to `((void)f, auto([views::empty](http://en.cppreference.com/w/cpp/ranges/empty_view)<[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<[std::invoke\_result\_t](http://en.cppreference.com/w/cpp/types/result_of)<FD&>>>))`. Otherwise, the call to `views::zip_transform` is ill-formed. When calling with more than one arguments `f` and `rs...`, `views::zip_transform(f, rs...)` is. *expression-equivalent* to `ranges::zip_transform_view(f, rs...)`. `zip_transform_view` models the concepts [`random_access_range`](random_access_range "cpp/ranges/random access range"), [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), [`forward_range`](forward_range "cpp/ranges/forward range"), [`input_range`](input_range "cpp/ranges/input range"), [`common_range`](common_range "cpp/ranges/common range"), and [`sized_range`](sized_range "cpp/ranges/sized range") when the underlying `ranges::zip_view<Views...>` models respective concepts. ### Customization point objects The name `views::zip_transform` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_zip\_transform\_fn*`. All instances of `*\_\_zip\_transform\_fn*` are equal. 
The effects of invoking different instances of type `*\_\_zip\_transform\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `views::zip_transform` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `views::zip_transform` above, `*\_\_zip\_transform\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__zip_transform_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __zip_transform_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__zip_transform_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __zip_transform_fn&, Args...>`. Otherwise, no function call operator of `*\_\_zip\_transform\_fn*` participates in overload resolution. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Data members Typical implementations of `zip_transform_view` hold two non-static data members: * an underlying view object of type [`*InnerView*`](#Member_types) (shown here as `*zip\_*` for exposition only), and * a wrapped invocable object of type `[copyable-box](copyable_wrapper "cpp/ranges/copyable wrapper")<F>` (shown here as `*fun\_*` for exposition only). ### Member functions | | | | --- | --- | | [(constructor)](zip_transform_view/zip_transform_view "cpp/ranges/zip transform view/zip transform view") (C++23) | constructs a `zip_transform_view` (public member function) | | [begin](zip_transform_view/begin "cpp/ranges/zip transform view/begin") (C++23) | returns an iterator to the beginning (public member function) | | [end](zip_transform_view/end "cpp/ranges/zip transform view/end") (C++23) | returns an iterator or a sentinel to the end (public member function) | | [size](zip_transform_view/size "cpp/ranges/zip transform view/size") (C++23) | returns the number of elements. Provided only if each underlying (adapted) range satisfies [`sized_range`](sized_range "cpp/ranges/sized range"). (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. 
(public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](zip_transform_view/deduction_guides "cpp/ranges/zip transform view/deduction guides") ### Member types | Member type | Definition | | --- | --- | | `*InnerView*` (private) | `ranges::zip_view<Views...>`. The name is for exposition only. | | `*ziperator*` (private) | * `[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<const InnerView>` if `Const` is `true`, otherwise * `[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<InnerView>`. The name is for exposition only. | | `*zentinel*` (private) | * `[ranges::sentinel\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<const InnerView>` if `Const` is `true`, otherwise * `[ranges::sentinel\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<InnerView>`. The name is for exposition only. 
| ### Nested classes | | | | --- | --- | | [*iterator*](zip_transform_view/iterator "cpp/ranges/zip transform view/iterator") (C++23) | the iterator type (exposition-only member class template) | | [*sentinel*](zip_transform_view/sentinel "cpp/ranges/zip transform view/sentinel") (C++23) | the sentinel type used when the underlying `zip_view` is not a [`common_range`](common_range "cpp/ranges/common range") (exposition-only member class template) | ### Notes | [Feature-test](../utility/feature_test "cpp/utility/feature test") macro | Value | Std | | --- | --- | --- | | [`__cpp_lib_ranges_zip`](../feature_test#Library_features "cpp/feature test") | `202110L` | (C++23) | ### Example ``` #include <list> #include <array> #include <ranges> #include <vector> #include <iostream> void print(auto const rem, auto const& r) { for (std::cout << rem; auto const& e : r) std::cout << e << ' '; std::cout << '\n'; } int main() { auto v1 = std::vector<float>{1, 2, 3}; auto v2 = std::list<short>{1, 2, 3, 4}; auto v3 = std::to_array({1, 2, 3, 4, 5}); auto add = [](auto a, auto b, auto c) { return a + b + c; }; auto sum = std::views::zip_transform(add, v1, v2, v3); print("v1: ", v1); print("v2: ", v2); print("v3: ", v3); print("sum: ", sum); } ``` Output: ``` v1: 1 2 3 v2: 1 2 3 4 v3: 1 2 3 4 5 sum: 3 6 9 ``` ### See also | | | | --- | --- | | [ranges::zip\_viewviews::zip](zip_view "cpp/ranges/zip view") (C++23) | a [`view`](view "cpp/ranges/view") consisting of tuples of references to corresponding elements of the adapted views (class template) (customization point object) | | [ranges::transform\_viewviews::transform](transform_view "cpp/ranges/transform view") (C++20) | a [`view`](view "cpp/ranges/view") of a sequence that applies a transformation function to each element (class template) (range adaptor object) | | [ranges::elements\_viewviews::elements](elements_view "cpp/ranges/elements view") (C++20) | takes a [`view`](view "cpp/ranges/view") consisting of tuple-like values and a number N and produces a [`view`](view "cpp/ranges/view") of N'th element of each tuple (class template) (range adaptor object) | cpp std::ranges::owning_view std::ranges::owning\_view ========================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template<ranges::range R> requires std::movable<R> && (!/*is-initializer-list*/<R>) class owning_view : public ranges::view_interface<owning_view<R>> ``` | | (since C++20) | `owning_view` is a [`view`](view "cpp/ranges/view") that has unique ownership of a [`range`](range "cpp/ranges/range"). It is move only and stores that `range` within it. The constant `/*is-initializer-list*/<R>` in the requires-clause is `true` if and only if `[std::remove\_cvref\_t](http://en.cppreference.com/w/cpp/types/remove_cvref)<R>` is a specialization of `[std::initializer\_list](../utility/initializer_list "cpp/utility/initializer list")`. ### Data members Typical implementations of `owning_view` have only one non-static data member: the underlying range of type `R`. The member is shown as `*r\_*` here (the name is exposition-only). 
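In practice `owning_view` is rarely named directly: it is what `views::all` (and therefore the piped range adaptors) produces for a movable rvalue range that is not already a view. A minimal sketch of that behaviour:

```
#include <concepts>
#include <iostream>
#include <ranges>
#include <vector>

int main()
{
    // Adapting an rvalue container: views::all moves the vector into an
    // owning_view, which keeps the elements alive for the view's lifetime.
    auto v = std::views::all(std::vector<int>{1, 2, 3});
    static_assert(std::same_as<decltype(v),
                               std::ranges::owning_view<std::vector<int>>>);
    std::cout << "v.size() == " << v.size() << '\n'; // 3

    // The same happens when an rvalue range enters an adaptor pipeline.
    auto evens = std::vector<int>{1, 2, 3, 4}
               | std::views::filter([](int x) { return x % 2 == 0; });
    std::cout << "first even: " << *evens.begin() << '\n'; // 2
}
```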
### Member functions | | | | --- | --- | | (constructor) (C++20) | constructs a `owning_view` by value-initializing or move-constructing the stored range (public member function) | | operator= (C++20) | move-assigns the stored range (public member function) | | base (C++20) | returns a reference to the stored range (public member function) | | begin (C++20) | returns the beginning iterator of the stored range (public member function) | | end (C++20) | returns the sentinel of the stored range (public member function) | | empty (C++20) | checks whether the stored range is empty (public member function) | | size (C++20) | returns the size of the stored [`sized_range`](sized_range "cpp/ranges/sized range") (public member function) | | data (C++20) | returns the pointer to the beginning of the stored [`contiguous_range`](contiguous_range "cpp/ranges/contiguous range") (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::owning\_view::owning\_view ---------------------------------------- | | | | | --- | --- | --- | | ``` owning_view() requires std::default_initializable<R> = default; ``` | (1) | (since C++20) | | ``` owning_view( owning_view&& other ) = default; ``` | (2) | (since C++20) | | ``` constexpr owning_view( R&& t ); ``` | (3) | (since C++20) | | ``` owning_view( const owning_view& ) = delete; ``` | (4) | (since C++20) | 1) Default constructor. Value-initializes the stored range by its default member initializer (`= R()`). 2) Move constructor. Move constructs the stored range from that of `other`. 3) Move constructs the stored range from `t`. 4) Copy constructor is deleted. `owning_view` is move-only. ### Parameters | | | | | --- | --- | --- | | other | - | another `owning_view` to move from | | t | - | range to move from | std::ranges::owning\_view::operator= ------------------------------------- | | | | | --- | --- | --- | | ``` owning_view& operator=( owning_view&& other ) = default; ``` | (1) | (since C++20) | | ``` owning_view& operator=( const owning_view& ) = delete; ``` | (2) | (since C++20) | 1) Move assignment operator. Move assigns the stored range from that of `other`. 2) Copy assignment operator is deleted. `owning_view` is move-only. 
### Parameters | | | | | --- | --- | --- | | other | - | another `owning_view` to move from | ### Return value `*this`. std::ranges::owning\_view::base -------------------------------- | | | | | --- | --- | --- | | ``` constexpr R& base() & noexcept; ``` | (1) | (since C++20) | | ``` constexpr const R& base() const & noexcept; ``` | (2) | (since C++20) | | ``` constexpr R&& base() && noexcept; ``` | (3) | (since C++20) | | ``` constexpr const R&& base() const && noexcept; ``` | (4) | (since C++20) | Returns a reference to the stored range, keeping value category and const-qualification. 1-2) Equivalent to `return r_;`. 3-4) Equivalent to `return std::move(r_);`. std::ranges::owning\_view::begin --------------------------------- | | | | | --- | --- | --- | | ``` constexpr ranges::iterator_t<R> begin(); ``` | (1) | (since C++20) | | ``` constexpr auto begin() const requires ranges::range<const R>; ``` | (2) | (since C++20) | Equivalent to `return [ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(r_);`. std::ranges::owning\_view::end ------------------------------- | | | | | --- | --- | --- | | ``` constexpr ranges::sentinel_t<R> end(); ``` | (1) | (since C++20) | | ``` constexpr auto end() const requires ranges::range<const R>; ``` | (2) | (since C++20) | Equivalent to `return [ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(r_);`. std::ranges::owning\_view::empty --------------------------------- | | | | | --- | --- | --- | | ``` constexpr bool empty() requires requires { ranges::empty(r_); }; ``` | (1) | (since C++20) | | ``` constexpr bool empty() const requires requires { ranges::empty(r_); }; ``` | (2) | (since C++20) | Equivalent to `return [ranges::empty](http://en.cppreference.com/w/cpp/ranges/empty)(r_);`. std::ranges::owning\_view::size -------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto size() requires ranges::sized_range<R>; ``` | (1) | (since C++20) | | ``` constexpr auto size() const requires ranges::sized_range<const R>; ``` | (2) | (since C++20) | Equivalent to `return [ranges::size](http://en.cppreference.com/w/cpp/ranges/size)(r_);`. std::ranges::owning\_view::data -------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto data() requires ranges::contiguous_range<R>; ``` | (1) | (since C++20) | | ``` constexpr auto data() const requires ranges::contiguous_range<const R>; ``` | (2) | (since C++20) | Equivalent to `return [ranges::data](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/data)(r_);`. ### Helper templates | | | | | --- | --- | --- | | ``` template<class T> inline constexpr bool enable_borrowed_range<std::ranges::owning_view<T>> = std::ranges::enable_borrowed_range<T>; ``` | | (since C++20) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `owning_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") when the underlying range satisfies it. 
### Example ``` #include <ranges> #include <string> #include <cassert> #include <iostream> int main() { using namespace std::literals; std::ranges::owning_view ov{ "cosmos"s }; // the deduced type of R is std::string; // `ov` is the only owner of this string std::cout << std::boolalpha << "call empty() : " << ov.empty() << '\n' << "call size() : " << ov.size() << '\n' << "call front() : " << ov.front() << '\n' // same as *(ov.begin()) << "call back() : " << ov.back() << '\n' // equals to *(ov.end()-1) << "call data() : " << ov.data() << '\n' << "call base() : " << ov.base().size() << '\n' // ~> ov.size() << "sizeof(ov) : " << sizeof(ov) << '\n' // typically equals to sizeof(R) << "range-for : \""; for (const char c: ov) std::cout << c; std::cout << "\"\n"; std::ranges::owning_view<std::string> ov2; assert(ov2.empty()); // ov2 = ov; // compile-time error: copy assignment operator is deleted ov2 = std::move(ov); // OK assert(ov.empty()); assert(ov2.size() == 6); } ``` Possible output: ``` call empty() : false call size() : 6 call front() : c call back() : s call data() : cosmos call base() : 6 sizeof(ov) : 32 range-for : "cosmos" ``` ### See also | | | | --- | --- | | [ranges::ref\_view](ref_view "cpp/ranges/ref view") (C++20) | a [`view`](view "cpp/ranges/view") of the elements of some other [`range`](range "cpp/ranges/range") (class template) | | [views::all\_tviews::all](all_view "cpp/ranges/all view") (C++20) | a [`view`](view "cpp/ranges/view") that includes all elements of a [`range`](range "cpp/ranges/range") (alias template) (range adaptor object) |
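Beyond direct construction as in the example above, `owning_view` usually shows up implicitly: after the adoption of P2415R2, `views::all` (and therefore any range adaptor pipeline) wraps an rvalue container that is neither a `view` nor a `borrowed_range` in an `owning_view`. A hedged sketch, assuming a standard library that implements P2415R2 (the lambda and the values are purely illustrative):

```
#include <iostream>
#include <ranges>
#include <type_traits>
#include <vector>

int main()
{
    // The rvalue vector is wrapped in an owning_view by views::all,
    // so the pipeline takes ownership of the temporary.
    auto squares = std::vector{1, 2, 3, 4}
                 | std::views::transform([](int i) { return i * i; });

    for (int i : squares)
        std::cout << i << ' '; // prints: 1 4 9 16
    std::cout << '\n';

    static_assert(std::is_same_v<
        std::views::all_t<std::vector<int>&&>,
        std::ranges::owning_view<std::vector<int>>>);
}
```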
cpp std::ranges::rbegin std::ranges::rbegin =================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ rbegin = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::input_or_output_iterator auto rbegin( T&& t ); ``` | | (since C++20) | Returns an iterator to the last element of the argument. ![range-rbegin-rend.svg]() Let `t` be an object of type `T`. If the argument is an lvalue or `[ranges::enable\_borrowed\_range](http://en.cppreference.com/w/cpp/ranges/borrowed_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `true`, then a call to `ranges::rbegin` is expression-equivalent to: 1. `t.rbegin()` converted to its [decayed type](../types/decay "cpp/types/decay"), if that expression with conversion is valid, and its converted type models `[std::input\_or\_output\_iterator](../iterator/input_or_output_iterator "cpp/iterator/input or output iterator")`. 2. Otherwise, `rbegin(t)` converted to its [decayed type](../types/decay "cpp/types/decay"), if `T` is a class or enumeration type, the aforementioned unqualified call with conversion is valid, its converted type models `[std::input\_or\_output\_iterator](../iterator/input_or_output_iterator "cpp/iterator/input or output iterator")`, where the [overload resolution](../language/overload_resolution "cpp/language/overload resolution") is performed with the following candidates: * `void rbegin(auto&) = delete;` * `void rbegin(const auto&) = delete;` * any declarations of `rbegin` found by [argument-dependent lookup](../language/adl "cpp/language/adl"). 3. Otherwise, `[std::make\_reverse\_iterator](http://en.cppreference.com/w/cpp/iterator/make_reverse_iterator)([ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(t))` if both `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(t)` and `[ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(t)` are valid expressions, have the same type, and that type models `[std::bidirectional\_iterator](../iterator/bidirectional_iterator "cpp/iterator/bidirectional iterator")`. In all other cases, a call to `ranges::rbegin` is ill-formed, which can result in [substitution failure](../language/sfinae "cpp/language/sfinae") when `ranges::rbegin(t)` appears in the immediate context of a template instantiation. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::rbegin` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_rbegin\_fn*`. All instances of `*\_\_rbegin\_fn*` are equal. 
The effects of invoking different instances of type `*\_\_rbegin\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::rbegin` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::rbegin` above, `*\_\_rbegin\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__rbegin_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __rbegin_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__rbegin_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __rbegin_fn&, Args...>`. Otherwise, no function call operator of `*\_\_rbegin\_fn*` participates in overload resolution. ### Notes If the argument is an rvalue (i.e. `T` is an object type) and `[ranges::enable\_borrowed\_range](http://en.cppreference.com/w/cpp/ranges/borrowed_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `false`, the call to `ranges::rbegin` is ill-formed, which also results in substitution failure. The return type models `[std::input\_or\_output\_iterator](../iterator/input_or_output_iterator "cpp/iterator/input or output iterator")` in all cases. The C++20 standard requires that if the underlying `rbegin` function call returns a prvalue, the return value is move-constructed from the materialized temporary object. All implementations directly return the prvalue instead. The requirement is corrected by the post-C++20 proposal [P0849R8](https://wg21.link/P0849R8) to match the implementations. ### Example ``` #include <iostream> #include <vector> #include <ranges> #include <span> int main() { std::vector<int> v = { 3, 1, 4 }; auto vi = std::ranges::rbegin(v); std::cout << *vi << '\n'; *vi = 42; // OK int a[] = { -5, 10, 15 }; auto ai = std::ranges::rbegin(a); std::cout << *ai << '\n'; *ai = 42; // OK // auto x_x = std::ranges::rbegin(std::vector{6,6,6}); // ill-formed: the argument is an rvalue (see Notes ↑) auto si = std::ranges::rbegin(std::span{a}); // OK: static_assert(std::ranges::enable_borrowed_range< std::remove_cv_t<decltype(std::span{a})>>); *si = 42; // OK } ``` Output: ``` 4 15 ``` ### See also | | | | --- | --- | | [ranges::crbegin](crbegin "cpp/ranges/crbegin") (C++20) | returns a reverse iterator to a read-only range (customization point object) | | [rbegincrbegin](../iterator/rbegin "cpp/iterator/rbegin") (C++14) | returns a reverse iterator to the beginning of a container or array (function template) | cpp std::ranges::views::keys, std::ranges::keys_view std::ranges::views::keys, std::ranges::keys\_view ================================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class R > using keys_view = ranges::elements_view<R, 0>; ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr auto keys = ranges::elements<0>; } ``` | (2) | (since C++20) | Takes a [`view`](view "cpp/ranges/view") of *tuple-like* values (e.g. 
`[std::tuple](../utility/tuple "cpp/utility/tuple")` or `[std::pair](../utility/pair "cpp/utility/pair")`), and produces a view with a *value-type* of the *first* element of the adapted view's value-type. 1) An alias for `[ranges::elements\_view](http://en.cppreference.com/w/cpp/ranges/elements_view)<R, 0>`. 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::keys(e)` is *expression-equivalent* to `keys_view<[views::all\_t](http://en.cppreference.com/w/cpp/ranges/all_view)<decltype((e))>>{e}` for any suitable subexpression `e`. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Notes `keys_view` can be useful for extracting *keys* from associative containers, e.g. `for (auto const& key : std::views::keys(map)) { /*...*/ }`. ### Example Displays values for each type of [quark](https://en.wikipedia.org/wiki/Quark "enwiki:Quark") in particle physics. ``` #include <iomanip> #include <iostream> #include <ranges> #include <utility> #include <vector> #include <string> int main() { const std::vector<std::pair<std::string, double>> quark_mass{ // MeV/c² {"up", 2.3}, {"down", 4.8}, {"charm", 1275}, {"strange", 95}, {"top", 173'210}, {"bottom", 4'180}, }; std::cout << "quark name: │ "; for (std::string const& name : std::views::keys(quark_mass)) std::cout << std::setw(9) << name << " │ "; std::cout << "\n" "mass MeV/c²: │ "; for (const double mass : std::views::values(quark_mass)) std::cout << std::setw(9) << mass << " │ "; std::cout << '\n'; } ``` Output: ``` quark name: │ up │ down │ charm │ strange │ top │ bottom │ mass MeV/c²: │ 2.3 │ 4.8 │ 1275 │ 95 │ 173210 │ 4180 │ ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3563](https://cplusplus.github.io/LWG/issue3563) | C++20 | `keys_view` is unable to participate in CTAD due to its use of `views::all_t` | `views::all_t` removed | ### See also | | | | --- | --- | | [ranges::values\_view, views::values](values_view "cpp/ranges/values view") (C++20) | takes a [`view`](view "cpp/ranges/view") consisting of pair-like values and produces a [`view`](view "cpp/ranges/view") of the second elements of each pair (class template) (range adaptor object) | | [ranges::elements\_view, views::elements](elements_view "cpp/ranges/elements view") (C++20) | takes a [`view`](view "cpp/ranges/view") consisting of tuple-like values and a number N and produces a [`view`](view "cpp/ranges/view") of the N'th element of each tuple (class template) (range adaptor object) | | [slice](../numeric/valarray/slice "cpp/numeric/valarray/slice") | BLAS-like slice of a valarray: starting index, length, stride (class) | cpp std::ranges::bidirectional_range std::ranges::bidirectional\_range ================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class T > concept bidirectional_range = ranges::forward_range<T> && std::bidirectional_iterator<ranges::iterator_t<T>>; ``` | | (since C++20) | The `bidirectional_range` concept is a refinement of [`range`](range "cpp/ranges/range") for which `ranges::begin` returns a model of [`bidirectional_iterator`](../iterator/bidirectional_iterator "cpp/iterator/bidirectional iterator"). ### Example ``` #include <ranges> #include <list> #include <forward_list> #include <set> #include <unordered_set> int main() { static_assert( std::ranges::bidirectional_range<std::set<int>> and not std::ranges::bidirectional_range<std::unordered_set<int>> and std::ranges::bidirectional_range<std::list<int>> and not std::ranges::bidirectional_range<std::forward_list<int>> ); } ``` ### See also | | | | --- | --- | | [ranges::forward\_range](forward_range "cpp/ranges/forward range") (C++20) | specifies a range whose iterator type satisfies [`forward_iterator`](../iterator/forward_iterator "cpp/iterator/forward iterator") (concept) | | [ranges::random\_access\_range](random_access_range "cpp/ranges/random access range") (C++20) | specifies a range whose iterator type satisfies [`random_access_iterator`](../iterator/random_access_iterator "cpp/iterator/random access iterator") (concept) | cpp std::from_range, std::from_range_t std::from\_range, std::from\_range\_t ===================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` struct from_range_t { explicit from_range_t() = default; }; ``` | | (since C++23) | | ``` inline constexpr std::from_range_t from_range {}; ``` | | (since C++23) | `std::from_range` is a disambiguation tag that can be passed to the constructors of suitable containers to indicate that the contained member is range constructed. The corresponding type `std::from_range_t` can be used in the constructor's parameter list to match the intended tag. 
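This page has no standalone example; the following is a small hedged sketch (it assumes a C++23 standard library whose containers already provide the `from_range_t` constructors, and the chosen range is purely illustrative) of how the tag is typically used:

```
#include <iostream>
#include <ranges>
#include <vector>

int main()
{
    auto odd = [](int i) { return i % 2 != 0; };

    // Construct the vector directly from a filtered range using the tag.
    std::vector<int> v(std::from_range,
                       std::views::iota(1, 10) | std::views::filter(odd));

    for (int i : v)
        std::cout << i << ' '; // prints: 1 3 5 7 9
    std::cout << '\n';
}
```

`std::ranges::to` (see below) offers the same functionality in pipeline form.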
### See also | | | | --- | --- | | [ranges::to](to "cpp/ranges/to") (C++23) | constructs a non-view range from another range (function template) | cpp std::ranges::cdata std::ranges::cdata ================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ cdata = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::remove_reference_t<ranges::range_reference_t</*CT*/>>* cdata( T&& t ); ``` | | (since C++20) | Returns a pointer to the first element of a contiguous range denoted by a const-qualified argument. Let `CT` be. 1. `const [std::remove\_reference\_t](http://en.cppreference.com/w/cpp/types/remove_reference)<T>&` if the argument is an lvalue (i.e. `T` is an lvalue reference type), 2. `const T` otherwise, a call to `ranges::cdata` is expression-equivalent to `[ranges::data](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/data)(static\_cast<CT&&>(t))`. If `[ranges::cdata](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/cdata)(t)` is valid, then it returns a pointer to a object. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::cdata` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_cdata\_fn*`. All instances of `*\_\_cdata\_fn*` are equal. The effects of invoking different instances of type `*\_\_cdata\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::cdata` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::cdata` above, `*\_\_cdata\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__cdata_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __cdata_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__cdata_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __cdata_fn&, Args...>`. Otherwise, no function call operator of `*\_\_cdata\_fn*` participates in overload resolution. 
### Example ``` #include <cstring> #include <iostream> #include <ranges> #include <string> int main() { std::string src {"hello world!\n"}; // std::ranges::cdata(src)[0] = 'H'; // error, src.data() is treated as read-only std::ranges::data(src)[0] = 'H'; // OK, src.data() is a non-const storage char dst[20]; // storage for a C-style string std::strcpy(dst, std::ranges::cdata(src)); // [data(src), data(src) + size(src)] is guaranteed to be an NTBS std::cout << dst; } ``` Output: ``` Hello world! ``` ### See also | | | | --- | --- | | [ranges::data](data "cpp/ranges/data") (C++20) | obtains a pointer to the beginning of a contiguous range (customization point object) | | [data](../iterator/data "cpp/iterator/data") (C++17) | obtains the pointer to the underlying array (function template) | cpp std::ranges::subrange_kind std::ranges::subrange\_kind =========================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` enum class subrange_kind : bool { unsized, sized }; ``` | | (since C++20) | Specifies if a `[std::ranges::subrange](subrange "cpp/ranges/subrange")` models `[std::ranges::sized\_range](sized_range "cpp/ranges/sized range")` or not. | Constant | Explanation | | --- | --- | | `unsized` | specifies that the `subrange` does not model [`sized_range`](sized_range "cpp/ranges/sized range") | | `sized` | specifies that the `subrange` does model [`sized_range`](sized_range "cpp/ranges/sized range") | cpp std::ranges::views::lazy_split, std::ranges::lazy_split_view std::ranges::views::lazy\_split, std::ranges::lazy\_split\_view =============================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::input_range V, ranges::forward_range Pattern > requires ranges::view<V> && ranges::view<Pattern> && std::indirectly_comparable<ranges::iterator_t<V>, ranges::iterator_t<Pattern>, ranges::equal_to> && (ranges::forward_range<V> || /*tiny_range*/<Pattern>) class lazy_split_view : public ranges::view_interface<lazy_split_view<V, Pattern>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /*unspecified*/ lazy_split = /*unspecified*/; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< ranges::viewable_range R, class Pattern > requires /* see below */ constexpr ranges::view auto lazy_split( R&& r, Pattern&& pattern ); ``` | | (since C++20) | | ``` template< class Pattern > constexpr /*range adaptor closure*/ lazy_split( Pattern&& pattern ); ``` | | (since C++20) | | Helper concepts | | | | ``` template< class R > concept /*tiny_range*/ = // exposition only ranges::sized_range<R> && requires /*is-statically-constexpr-sized*/<R> && (std::remove_reference_t<R>::size() <= 1); ``` | (3) | (since C++20) | 1) `lazy_split_view` takes a [`view`](view "cpp/ranges/view") and a delimiter, and splits the [`view`](view "cpp/ranges/view") into subranges on the delimiter. Two major scenarios are supported: * The view is an [`input_range`](input_range "cpp/ranges/input range"), the delimiter is a single element (wrapped in a [`single_view`](single_view "cpp/ranges/single view")). * The view is a [`forward_range`](forward_range "cpp/ranges/forward range"), the delimiter is a [`view`](view "cpp/ranges/view") of elements. 2) A [range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::lazy_split(e, f)` is *expression-equivalent* to `lazy_split_view(e, f)`. 
3) The exposition-only concept `/*tiny_range*/<Pattern>` is satisfied if `Pattern` satisfies [`sized_range`](sized_range "cpp/ranges/sized range"), `Pattern::size()` is a constant expression and suitable as a template non-type argument, and the value of `Pattern::size()` is less than or equal to `1`. Notably, [`empty_view`](empty_view "cpp/ranges/empty view") and [`single_view`](single_view "cpp/ranges/single view") satisfy this concept. `lazy_split_view` models the concepts [`forward_range`](forward_range "cpp/ranges/forward range") and [`input_range`](input_range "cpp/ranges/input range") when the underlying [`view`](view "cpp/ranges/view") `V` models respective concepts, and models [`common_range`](common_range "cpp/ranges/common range") when `V` models both [`forward_range`](forward_range "cpp/ranges/forward range") and [`common_range`](common_range "cpp/ranges/common range"). The inner range (`[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<lazy_split_view>`) models the concepts [`forward_range`](forward_range "cpp/ranges/forward range") and [`input_range`](input_range "cpp/ranges/input range") when the underlying [`view`](view "cpp/ranges/view") `V` models respective concepts. It does not model [`common_range`](common_range "cpp/ranges/common range"), and cannot be used with algorithms that expect a [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") or higher. ### Data members Typical implementations of `lazy_split_view` hold two or three non-static data members: * the underlying [`view`](view "cpp/ranges/view") of type `V` (shown here as `*base\_*` for exposition only), * the pattern (shown here as `*pattern\_*` for exposition only) that is used as a delimiter to split the underlying [`view`](view "cpp/ranges/view"), and * the caching object (shown here as `*current\_*` for exposition only) of the `[std::optional](../utility/optional "cpp/utility/optional")`-*like* exposition-only type `/\*non\_propagating\_cache\*/<[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<V>>`. The `*current\_*` caches the result of calls to [`begin()`](lazy_split_view/begin "cpp/ranges/lazy split view/begin"). Present only if `V` does not satisfy [`forward_range`](forward_range "cpp/ranges/forward range"). ### Member functions | | | | --- | --- | | [(constructor)](lazy_split_view/lazy_split_view "cpp/ranges/lazy split view/lazy split view") (C++20) | constructs a `lazy_split_view` (public member function) | | [base](lazy_split_view/base "cpp/ranges/lazy split view/base") (C++20) | returns a copy of the underlying (adapted) view (public member function) | | [begin](lazy_split_view/begin "cpp/ranges/lazy split view/begin") (C++20) | returns an iterator to the beginning (public member function) | | [end](lazy_split_view/end "cpp/ranges/lazy split view/end") (C++20) | returns an iterator or a sentinel to the end (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. 
Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | ### Nested classes | | | | --- | --- | | [*outer\_iterator*](lazy_split_view/outer_iterator "cpp/ranges/lazy split view/outer iterator") (C++20) | the iterator type (exposition-only member class template) | | [*inner\_iterator*](lazy_split_view/inner_iterator "cpp/ranges/lazy split view/inner iterator") (C++20) | the iterator type of the inner range (exposition-only member class template) | ### [Deduction guides](lazy_split_view/deduction_guides "cpp/ranges/lazy split view/deduction guides") ### Notes `lazy_split_view` is introduced by the post-C++20 defect report [P2210R2](https://wg21.link/P2210R2). It has the same lazy mechanism as that of the old `split_view` before change. ### Example ``` #include <algorithm> #include <iostream> #include <ranges> #include <string_view> // P2210R2: a temporary patch until online g++ >= 12 #define lazy_split_view split_view #define lazy_split split auto print = [](auto const& view) { // `view` is a std::views::lazy_split_view::/*outer_iterator*/::value_type for (std::cout << "{ "; const auto element : view) std::cout << element << ' '; std::cout << "} "; }; int main() { constexpr static auto source = { 0, 1,0, 2,3,0, 4,5,6,0, 7,8,9 }; constexpr int delimiter {0}; constexpr std::ranges::lazy_split_view outer_view{source, delimiter}; std::cout << "splits[" << std::ranges::distance(outer_view) << "]: "; for (auto const& inner_view: outer_view) print(inner_view); constexpr std::string_view hello { "Hello C++ 20 !" }; std::cout << "\n" "substrings: "; std::ranges::for_each(hello | std::views::lazy_split(' '), print); constexpr std::string_view text { "Hello-+-C++-+-20-+-!" }; constexpr std::string_view delim { "-+-" }; std::cout << "\n" "substrings: "; std::ranges::for_each(text | std::views::lazy_split(delim), print); } ``` Output: ``` splits[5]: { } { 1 } { 2 3 } { 4 5 6 } { 7 8 9 } substrings: { H e l l o } { C + + } { 2 0 } { ! } substrings: { H e l l o } { C + + } { 2 0 } { ! } ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2210R2](https://wg21.link/P2210R2) | C++20 | the old `split_view` was too lazy to be easily used | moves its functionality to `lazy_split_view` | ### See also | | | | --- | --- | | [ranges::split\_viewviews::split](split_view "cpp/ranges/split view") (C++20) | a [`view`](view "cpp/ranges/view") over the subranges obtained from splitting another [`view`](view "cpp/ranges/view") using a delimiter (class template) (range adaptor object) | | [ranges::join\_viewviews::join](join_view "cpp/ranges/join view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the sequence obtained from flattening a [`view`](view "cpp/ranges/view") of [`range`s](range "cpp/ranges/range") (class template) (range adaptor object) |
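As a supplement to the first scenario listed above (an `input_range` split on a single element), here is a hedged sketch, assuming `views::istream` is available (C++20 after P2432R1), of `lazy_split` applied to a pure input range, which the eager `split_view` cannot handle:

```
#include <iostream>
#include <ranges>
#include <sstream>

int main()
{
    // A stream of ints is only an input_range: it can be traversed once.
    // lazy_split still works here because the single-element delimiter
    // satisfies the exposition-only /*tiny_range*/ concept.
    std::istringstream stream{"1 2 0 3 4 0 5"};

    for (auto const& group : std::views::istream<int>(stream)
                           | std::views::lazy_split(0))
    {
        std::cout << "{ ";
        for (int x : group)
            std::cout << x << ' ';
        std::cout << "} ";
    }
    std::cout << '\n'; // prints: { 1 2 } { 3 4 } { 5 }
}
```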
cpp std::ranges::ref_view std::ranges::ref\_view ====================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template<ranges::range R> requires std::is_object_v<R> class ref_view : public ranges::view_interface<ref_view<R>> ``` | | (since C++20) | `ref_view` is a [`view`](view "cpp/ranges/view") of the elements of some other [`range`](range "cpp/ranges/range"). It wraps a reference to that `range`. ### Data members `ref_view` behaves as if it only stores a `R*` member subobject pointing to the referenced range. The member is shown as `*r\_*` here (the name is exposition-only). ### Member functions | | | | --- | --- | | (constructor) (C++20) | constructs a `ref_view` that references to the given range (public member function) | | base (C++20) | returns the references to the referenced range (public member function) | | begin (C++20) | returns the beginning iterator of the referenced range (public member function) | | end (C++20) | returns the sentinel of the referenced range (public member function) | | empty (C++20) | checks whether the referenced range is empty (public member function) | | size (C++20) | returns the size of the referenced [`sized_range`](sized_range "cpp/ranges/sized range") (public member function) | | data (C++20) | returns the pointer to the beginning of the referenced [`contiguous_range`](contiguous_range "cpp/ranges/contiguous range") (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::ref\_view::ref\_view ---------------------------------- | | | | | --- | --- | --- | | ``` template</*different-from*/<ref_view> T> requires std::convertible_to<T, R&> && requires { _FUN(std::declval<T>()); } constexpr ref_view(T&& t); ``` | | (since C++20) | Initializes `*r\_*` with `[std::addressof](http://en.cppreference.com/w/cpp/memory/addressof)(static\_cast<R&>([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t)))`. 
`/*different-from*/<T, U>` is satisfied if and only if `[std::remove\_cvref\_t](http://en.cppreference.com/w/cpp/types/remove_cvref)<T>` and `[std::remove\_cvref\_t](http://en.cppreference.com/w/cpp/types/remove_cvref)<U>` are not the same type, and overloads of `*\_FUN*` are declared as `void _FUN(R&); void _FUN(R&&) = delete;`. ### Parameters | | | | | --- | --- | --- | | t | - | range to reference | std::ranges::ref\_view::base ----------------------------- | | | | | --- | --- | --- | | ``` constexpr R& base() const; ``` | | (since C++20) | Equivalent to `return *r_;` std::ranges::ref\_view::begin ------------------------------ | | | | | --- | --- | --- | | ``` constexpr ranges::iterator_t<R> begin() const; ``` | | (since C++20) | Equivalent to `return [ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(\*r_);` std::ranges::ref\_view::end ---------------------------- | | | | | --- | --- | --- | | ``` constexpr ranges::sentinel_t<R> end() const; ``` | | (since C++20) | Equivalent to `return [ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(\*r_);` std::ranges::ref\_view::empty ------------------------------ | | | | | --- | --- | --- | | ``` constexpr bool empty() const requires requires { ranges::empty(*r_); }; ``` | | (since C++20) | Equivalent to `return [ranges::empty](http://en.cppreference.com/w/cpp/ranges/empty)(\*r_);` std::ranges::ref\_view::size ----------------------------- | | | | | --- | --- | --- | | ``` constexpr auto size() const requires ranges::sized_range<R> { return ranges::size(*r_); } ``` | | (since C++20) | std::ranges::ref\_view::data ----------------------------- | | | | | --- | --- | --- | | ``` constexpr auto data() const requires ranges::contiguous_range<R> { return ranges::data(*r_); } ``` | | (since C++20) | ### Deduction guides | | | | | --- | --- | --- | | ``` template<class R> ref_view(R&) -> ref_view<R>; ``` | | (since C++20) | ### Helper templates | | | | | --- | --- | --- | | ``` template<class T> inline constexpr bool enable_borrowed_range<ranges::ref_view<T>> = true; ``` | | (since C++20) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `ref_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"). ### Example ``` #include <ranges> #include <iostream> int main() { const std::string s{"cosmos"}; const std::ranges::take_view tv{s, 3}; const std::ranges::ref_view rv{tv}; std::cout << std::boolalpha << "call empty() : " << rv.empty() << '\n' << "call size() : " << rv.size() << '\n' << "call begin() : " << *rv.begin() << '\n' << "call end() : " << *(rv.end()-1) << '\n' << "call data() : " << rv.data() << '\n' << "call base() : " << rv.base().size() << '\n' // ~> tv.size() << "range-for : "; for (const auto c: rv) { std::cout << c; } std::cout << '\n'; } ``` Output: ``` call empty() : false call size() : 3 call begin() : c call end() : s call data() : cosmos call base() : 3 range-for : cos ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2325R3](https://wg21.link/P2325R3) | C++20 | default constructor was provided as [`view`](view "cpp/ranges/view")must be [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable") | removed along with the requirement | ### See also | | | | --- | --- | | [reference\_wrapper](../utility/functional/reference_wrapper "cpp/utility/functional/reference wrapper") (C++11) | [CopyConstructible](../named_req/copyconstructible "cpp/named req/CopyConstructible") and [CopyAssignable](../named_req/copyassignable "cpp/named req/CopyAssignable") reference wrapper (class template) | | [ranges::owning\_view](owning_view "cpp/ranges/owning view") (C++20) | a [`view`](view "cpp/ranges/view") with unique ownership of some [`range`](range "cpp/ranges/range") (class template) | | [views::all\_tviews::all](all_view "cpp/ranges/all view") (C++20) | a [`view`](view "cpp/ranges/view") that includes all elements of a [`range`](range "cpp/ranges/range") (alias template) (range adaptor object) | cpp std::ranges::views::adjacent, std::ranges::adjacent_view std::ranges::views::adjacent, std::ranges::adjacent\_view ========================================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::forward_range V, std::size_t N > requires ranges::view<V> && (N > 0) class adjacent_view : public ranges::view_interface<adjacent_view<V, N>> ``` | (1) | (since C++23) | | ``` namespace views { template< size_t N > inline constexpr /* unspecified */ adjacent = /* unspecified */ ; } ``` | (2) | (since C++23) | | ``` namespace views { inline constexpr auto pairwise = adjacent<2>; } ``` | (3) | (since C++23) | | Call signature | | | | ``` template< ranges::viewable_range R > requires /* see below */ constexpr ranges::view auto adjacent<N>( R&& r ); ``` | | (since C++23) | 1) `adjacent_view` is a range adaptor that takes a [`view`](view "cpp/ranges/view"), and produces a [`view`](view "cpp/ranges/view") whose `*i*`th element (a "window") is a tuple-like value that holds `*N*` references to the elements of the original view, from `i`th up to `i + N - 1`th inclusively. Let `*S*` be the size of the original view. Then the size of produced view is: * `*S - N + 1*`, if `*S >= N*`, * `​0​` otherwise, and the resulting view is empty. 2) The name `views::adjacent<N>` denotes a [ranges adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). Given a subexpression `e` and a constant expression `N`, the expression `views::adjacent<N>(e)` is *expression-equivalent* to * `((void)e, auto([views::empty](http://en.cppreference.com/w/cpp/ranges/empty_view)<tuple<>>))` if `N` is equal to `​0​`, * `adjacent_view<[views::all\_t](http://en.cppreference.com/w/cpp/ranges/all_view)<decltype((e))>, N>(e)` otherwise. `adjacent_view` always models [`forward_range`](forward_range "cpp/ranges/forward range"), and models [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), [`random_access_range`](random_access_range "cpp/ranges/random access range"), or [`sized_range`](sized_range "cpp/ranges/sized range") if adapted [`view`](view "cpp/ranges/view") type models the corresponding concept. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. 
* `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Data members Typical implementations of `adjacent_view` hold only one non-static data member `*base\_*` of type `V`. The name is for exposition only. ### Member functions | | | | --- | --- | | [(constructor)](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_view/adjacent_view&action=edit&redlink=1 "cpp/ranges/adjacent view/adjacent view (page does not exist)") (C++23) | constructs a `adjacent_view` (public member function) | | [begin](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_view/begin&action=edit&redlink=1 "cpp/ranges/adjacent view/begin (page does not exist)") (C++23) | returns an iterator to the beginning (public member function) | | [end](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_view/end&action=edit&redlink=1 "cpp/ranges/adjacent view/end (page does not exist)") (C++23) | returns an iterator or a sentinel to the end (public member function) | | [size](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_view/size&action=edit&redlink=1 "cpp/ranges/adjacent view/size (page does not exist)") (C++23) | returns the number of elements. Provided only if the underlying (adapted) range satisfies [`sized_range`](sized_range "cpp/ranges/sized range"). (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | ### Deduction guides (none). 
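The example further below uses `views::adjacent<3>`; for completeness, here is a small sketch (assuming a C++23 standard library that ships `views::adjacent`) of the `views::pairwise` alias declared in the synopsis:

```
#include <array>
#include <iostream>
#include <ranges>

int main()
{
    constexpr std::array v{1, 2, 3, 4};

    // views::pairwise is exactly views::adjacent<2>: every element of the
    // produced view is a tuple of references to two neighbouring elements.
    for (auto const& [a, b] : v | std::views::pairwise)
        std::cout << '(' << a << ' ' << b << ") ";
    std::cout << '\n'; // prints: (1 2) (2 3) (3 4)
}
```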
### Nested classes | | | | --- | --- | | [*iterator*](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_view/iterator&action=edit&redlink=1 "cpp/ranges/adjacent view/iterator (page does not exist)") (C++23) | the iterator type (exposition-only member class template) | | [*sentinel*](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_view/sentinel&action=edit&redlink=1 "cpp/ranges/adjacent view/sentinel (page does not exist)") (C++23) | the sentinel type used when `adjacent_view` is not a [`common_range`](common_range "cpp/ranges/common range") (exposition-only member class template) | ### Helper templates | | | | | --- | --- | --- | | ``` template< class V, size_t N > inline constexpr bool enable_borrowed_range<adjacent_view<V, N>> = enable_borrowed_range<V>; ``` | | (since C++23) | This specialization of [`ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes **`adjacent_view`** satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") when the underlying view satisfies it. ### Notes There is a similarity between `ranges::adjacent_view` and `ranges::slide_view` — they both produce a "sliding window" of size `N`, and, given a [`view`](view "cpp/ranges/view") of size `*S*`, they both have the same size: `S - N + 1`. The differences between these [`view`](view "cpp/ranges/view") adaptors are: | View adaptor | `value_type` | The window size `*N*` is | | --- | --- | --- | | `ranges::adjacent_view` | tuple-like object | a template parameter | | `ranges::slide_view` | a [`range`](range "cpp/ranges/range") | a runtime parameter | | [Feature-test](../utility/feature_test "cpp/utility/feature test") macro | Value | Std | | --- | --- | --- | | [`__cpp_lib_ranges_zip`](../feature_test#Library_features "cpp/feature test") | `202110L` | (C++23) | ### Example ``` #include <array> #include <tuple> #include <ranges> #include <string> #include <cstddef> #include <iostream> int main() { constexpr std::size_t window{3}; constexpr std::array v {1, 2, 3, 4, 5, 6}; std::cout << "v = [1 2 3 4 5 6]\n"; for (int i{}; auto const e: v | std::views::adjacent<window>) { std::cout << "e = " << std::string(2 * i++, ' ') << '[' << std::get<0>(e) << ' ' << std::get<1>(e) << ' ' << std::get<2>(e) << "]\n"; } } ``` Output: ``` v = [1 2 3 4 5 6] e = [1 2 3] e = [2 3 4] e = [3 4 5] e = [4 5 6] ``` ### See also | | | | --- | --- | | [ranges::adjacent\_transform\_view, views::adjacent\_transform](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_transform_view&action=edit&redlink=1 "cpp/ranges/adjacent transform view (page does not exist)") (C++23) | a [`view`](view "cpp/ranges/view") consisting of tuples of results of application of a transformation function to adjacent elements of the adapted view (class template) (range adaptor object) | | [ranges::slide\_view, views::slide](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/slide_view&action=edit&redlink=1 "cpp/ranges/slide view (page does not exist)") (C++23) | a [`view`](view "cpp/ranges/view") whose Mth element is a [`view`](view "cpp/ranges/view") over the Mth through (M + N - 1)th elements of another [`view`](view "cpp/ranges/view") (class template) (range adaptor object) | | [ranges::chunk\_view, views::chunk](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/chunk_view&action=edit&redlink=1 "cpp/ranges/chunk view (page does not exist)") (C++23) | a range of [`view`s](view "cpp/ranges/view") that are `N`-sized non-overlapping successive 
chunks of the elements of another [`view`](view "cpp/ranges/view") (class template) (range adaptor object) | | [ranges::stride\_viewviews::stride](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/stride_view&action=edit&redlink=1 "cpp/ranges/stride view (page does not exist)") (C++23) | a [`view`](view "cpp/ranges/view") consisting of elements of another [`view`](view "cpp/ranges/view"), advancing over N elements at a time (class template) (range adaptor object) | cpp std::ranges::output_range std::ranges::output\_range ========================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template<class R, class T> concept output_range = ranges::range<R> && std::output_iterator<ranges::iterator_t<R>, T>; ``` | | (since C++20) | The `output_range` concept is a refinement of [`range`](range "cpp/ranges/range") for which `ranges::begin` returns a model of [`output_iterator`](../iterator/output_iterator "cpp/iterator/output iterator"). cpp std::ranges::input_range std::ranges::input\_range ========================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class T > concept input_range = ranges::range<T> && std::input_iterator<ranges::iterator_t<T>>; ``` | | (since C++20) | The `input_range` concept is a refinement of [`range`](range "cpp/ranges/range") for which `ranges::begin` returns a model of [`input_iterator`](../iterator/input_iterator "cpp/iterator/input iterator"). cpp std::ranges::subrange std::ranges::subrange ===================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< std::input_or_output_iterator I, std::sentinel_for<I> S = I, ranges::subrange_kind K = std::sized_sentinel_for<S, I> ? ranges::subrange_kind::sized : ranges::subrange_kind::unsized > requires (K == ranges::subrange_kind::sized || !std::sized_sentinel_for<S, I>) class subrange : public ranges::view_interface<subrange<I, S, K>> ``` | | (since C++20) | The `subrange` class template combines together an iterator and a sentinel into a single [`view`](view "cpp/ranges/view"). Additionally, the subrange is a [`sized_range`](sized_range "cpp/ranges/sized range") whenever the final template parameter is `subrange_kind​::​sized` (which happens when `[std::sized\_sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sized_sentinel_for)<S, I>` is satisfied or when size is passed explicitly as a constructor argument). The size record is needed to be stored if and only if `[std::sized\_sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sized_sentinel_for)<S, I>` is `false` and `K` is `subrange_kind::sized`. 
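A brief sketch of the sizing rule described above (illustrative only; `std::forward_list` is chosen because its iterators cannot compute a distance on their own): passing the size to the constructor selects the sized kind and forces the `subrange` to store the size record itself.

```
#include <forward_list>
#include <iostream>
#include <ranges>

int main()
{
    std::forward_list<int> fl{1, 2, 3, 4, 5};

    // forward_list iterators do not satisfy sized_sentinel_for, so the
    // deduced subrange_kind here is unsized and no size record is kept.
    std::ranges::subrange unsized{fl.begin(), fl.end()};
    static_assert(not std::ranges::sized_range<decltype(unsized)>);

    // Passing the size explicitly selects subrange_kind::sized; the
    // subrange then stores the size record itself.
    std::ranges::subrange sized{fl.begin(), fl.end(), 5};
    static_assert(std::ranges::sized_range<decltype(sized)>);
    std::cout << sized.size() << '\n'; // prints: 5
}
```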
### Member functions | | | | --- | --- | | [(constructor)](subrange/subrange "cpp/ranges/subrange/subrange") (C++20) | creates a new `subrange` (public member function) | | [operator PairLike](subrange/operator_pairlike "cpp/ranges/subrange/operator PairLike") (C++20) | converts the `subrange` to a pair-like type (public member function) | | Observers | | [begin](subrange/begin "cpp/ranges/subrange/begin") (C++20) | obtains the iterator (public member function) | | [end](subrange/end "cpp/ranges/subrange/end") (C++20) | obtains the sentinel (public member function) | | [empty](subrange/empty "cpp/ranges/subrange/empty") (C++20) | checks whether the `subrange` is empty (public member function) | | [size](subrange/size "cpp/ranges/subrange/size") (C++20) | obtains the size of the `subrange` (public member function) | | Iterator operations | | [advance](subrange/advance "cpp/ranges/subrange/advance") (C++20) | advances the iterator by given distance (public member function) | | [prev](subrange/prev "cpp/ranges/subrange/prev") (C++20) | obtains a copy of the `subrange` with its iterator decremented by a given distance (public member function) | | [next](subrange/next "cpp/ranges/subrange/next") (C++20) | obtains a copy of the `subrange` with its iterator advanced by a given distance (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [data](view_interface/data "cpp/ranges/view interface/data") (C++20) | Gets the address of derived view's data. Provided if its iterator type satisfies [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator"). (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). 
(public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](subrange/deduction_guides "cpp/ranges/subrange/deduction guides") ### Non-member functions | | | | --- | --- | | [get(std::ranges::subrange)](subrange/get "cpp/ranges/subrange/get") (C++20) | obtains iterator or sentinel from a `std::ranges::subrange` (function template) | ### Helper types | | | | --- | --- | | [ranges::subrange\_kind](subrange_kind "cpp/ranges/subrange kind") (C++20) | specifies whether a `std::ranges::subrange` models `[std::ranges::sized\_range](sized_range "cpp/ranges/sized range")` (enum) | | [std::tuple\_size<std::ranges::subrange>](subrange/tuple_size "cpp/ranges/subrange/tuple size") (C++20) | obtains the number of components of a `std::ranges::subrange` (class template specialization) | | [std::tuple\_element<std::ranges::subrange>](subrange/tuple_element "cpp/ranges/subrange/tuple element") (C++20) | obtains the type of the iterator or the sentinel of a `std::ranges::subrange` (class template specialization) | ### Helper templates | | | | | --- | --- | --- | | ``` template<class I, class S, ranges::subrange_kind K> inline constexpr bool enable_borrowed_range<ranges::subrange<I, S, K>> = true; ``` | | | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `subrange` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"). ### Example ``` #include <iostream> #include <map> #include <ranges> #include <string_view> template <class V> void mutate(V& v) { v += 'A' - 'a'; } template <class K, class V> void mutate_map_values(std::multimap<K, V>& m, K k) { auto [first, last] = m.equal_range(k); for (auto& [_, v] : std::ranges::subrange(first, last)) { mutate(v); } } int main() { auto print = [](std::string_view rem, auto const& mm) { std::cout << rem << "{ "; for (const auto& [k, v] : mm) std::cout << "{" << k << ",'" << v << "'} "; std::cout << "}\n"; }; std::multimap<int,char> mm{ {4,'a'}, {3,'-'}, {4,'b'}, {5,'-'}, {4,'c'} }; print("Before: ", mm); mutate_map_values(mm, 4); print("After: ", mm); } ``` Output: ``` Before: { {3,'-'} {4,'a'} {4,'b'} {4,'c'} {5,'-'} } After: { {3,'-'} {4,'A'} {4,'B'} {4,'C'} {5,'-'} } ``` ### See also | | | | --- | --- | | [ranges::view\_interface](view_interface "cpp/ranges/view interface") (C++20) | helper class template for defining a [`view`](view "cpp/ranges/view"), using the [curiously recurring template pattern](../language/crtp "cpp/language/crtp") (class template) |
cpp std::ranges::constant_range std::ranges::constant\_range ============================ | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class T > concept constant_range = ranges::input_range<T> && /*constant-iterator*/<ranges::iterator_t<T>>; ``` | (1) | (since C++23) | | Helper concepts | | | | ``` template< class T > concept /*constant-iterator*/ = std::input_iterator<T> && std::same_as<std::iter_const_reference_t<T>, std::iter_reference_t<T>>; ``` | (2) | (since C++23) | 1) The `constant_range` concept is a refinement of [`range`](range "cpp/ranges/range") for which `ranges::begin` returns a [constant iterator](../iterator#Iterator_categories "cpp/iterator"). 2) The exposition-only concept `/*constant-iterator*/<T>` is satisfied when the result of the indirection operation of the input iterator is its const reference type which implies read-only. ### Example ``` #include <ranges> #include <vector> #include <string_view> #include <span> int main() { static_assert(not std::ranges::constant_range<std::vector<int>> and std::ranges::constant_range<const std::vector<int>> and std::ranges::constant_range<std::string_view> and not std::ranges::constant_range<std::span<int>> and not std::ranges::constant_range<const std::span<int>> and std::ranges::constant_range<std::span<const int>>); } ``` cpp std::ranges::views::filter, std::ranges::filter_view std::ranges::views::filter, std::ranges::filter\_view ===================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::input_range V, std::indirect_unary_predicate<ranges::iterator_t<V>> Pred > requires ranges::view<V> && std::is_object_v<Pred> class filter_view : public ranges::view_interface<filter_view<V, Pred>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /* unspecified */ filter = /* unspecified */; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< ranges::viewable_range R, class Pred > requires /* see below */ constexpr ranges::view auto filter( R&& r, Pred&& pred ); ``` | | (since C++20) | | ``` template< class Pred > constexpr /*range adaptor closure*/ filter( Pred&& pred ); ``` | | (since C++20) | 1) A range adaptor that represents [`view`](view "cpp/ranges/view") of an underlying sequence without the elements that fail to satisfy a predicate. 2) [Ranges adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::filter(e, p)` is *expression-equivalent* to `filter_view(e, p)` for any suitable subexpressions `e` and `p`. `filter_view` models the concepts [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), [`forward_range`](forward_range "cpp/ranges/forward range"), [`input_range`](input_range "cpp/ranges/input range"), and [`common_range`](common_range "cpp/ranges/common range") when the underlying [`view`](view "cpp/ranges/view") `V` models respective concepts. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). 
### Data members Typical implementations of `filter_view` hold two or three non-static data members: * the underlying [`view`](view "cpp/ranges/view") of type `V` (shown here as `*base\_*` for exposition only), * a wrapper that wraps the predicate used to filter out elements of `*base\_*` of type `/*copyable-box*/<Pred>` (shown here as `*pred\_*` for exposition only), where [`copyable-box`](copyable_wrapper "cpp/ranges/copyable wrapper") is a wrapper class template that always satisfies [`copyable`](../concepts/copyable "cpp/concepts/copyable"), * an object of `[std::optional](../utility/optional "cpp/utility/optional")`-like type (shown here as `*begin\_*` for exposition only) that caches an iterator to the first element of `*base\_*` that satisfies the `*pred\_*`. The `*begin\_*` may be present only if `filter_view` models [`forward_range`](forward_range "cpp/ranges/forward range"). ### Member functions | | | | --- | --- | | (constructor) (C++20) | constructs a `filter_view` (public member function) | | base (C++20) | returns the underlying view `V` (public member function) | | pred (C++20) | returns a reference to the predicate stored within `filter_view` (public member function) | | begin (C++20) | returns the beginning iterator of the `filter_view` (public member function) | | end (C++20) | returns the sentinel of the `filter_view` (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::filter\_view::filter\_view ---------------------------------------- | | | | | --- | --- | --- | | ``` filter_view() requires std::default_initializable<V> && std::default_initializable<Pred> = default; ``` | (1) | (since C++20) | | ``` constexpr filter_view( V base, Pred pred ); ``` | (2) | (since C++20) | 1) Value-initializes `*base\_*` via its default member initializer (`= V()`) and default-initializes `*pred\_*` (which value-initializes the contained `Pred`). 2) Initializes `*base\_*` with `std::move(base)` and initializes `*pred\_*` with `std::move(pred)`. 
### Parameters | | | | | --- | --- | --- | | base | - | range to filter | | pred | - | predicate to filter out elements | std::ranges::filter\_view::base -------------------------------- | | | | | --- | --- | --- | | ``` constexpr V base() const& requires std::copy_constructible<V>; ``` | (1) | (since C++20) | | ``` constexpr V base() &&; ``` | (2) | (since C++20) | 1) Equivalent to `return base_;`. 2) Equivalent to `return std::move(base_);`. std::ranges::filter\_view::pred -------------------------------- | | | | | --- | --- | --- | | ``` constexpr const Pred& pred() const; ``` | | (since C++20) | Returns a reference to the contained `Pred` object. The behavior is undefined if `*pred\_*` does not contain a value. std::ranges::filter\_view::begin --------------------------------- | | | | | --- | --- | --- | | ``` constexpr /*iterator*/ begin(); ``` | | (since C++20) | In order to provide the amortized constant time complexity required by the [`range`](range "cpp/ranges/range") concept, this function caches the result within the `filter_view` object for use on subsequent calls. Equivalent to: ``` if constexpr (!ranges::forward_range<V>) { return /*iterator*/{*this, ranges::find_if(base_, std::ref(*pred_))}; } else { if (!begin_.has_value()) begin_ = ranges::find_if(base_, std::ref(*pred_)); // caching return /*iterator*/{*this, begin_.value()}; } ``` The behavior is undefined if `*pred\_*` does not contain a value. std::ranges::filter\_view::end ------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto end() { if constexpr (ranges::common_range<V>) return /*iterator*/{*this, ranges::end(base_)}; else return /*sentinel*/{*this}; } ``` | | (since C++20) | ### Deduction guides | | | | | --- | --- | --- | | ``` template< class R, class Pred > filter_view( R&&, Pred ) -> filter_view<views::all_t<R>, Pred>; ``` | | (since C++20) | ### Nested classes | | | | --- | --- | | [*iterator*](filter_view/iterator "cpp/ranges/filter view/iterator") (C++20) | the iterator type of `filter_view` (exposition-only member class) | | [*sentinel*](filter_view/sentinel "cpp/ranges/filter view/sentinel") (C++20) | the sentinel type of `filter_view` when the underlying view is not a [`common_range`](common_range "cpp/ranges/common range") (exposition-only member class) | ### Example ``` #include <iostream> #include <ranges> int main() { auto even = [](int i) { return 0 == i % 2; }; auto square = [](int i) { return i * i; }; for (int i : std::views::iota(0, 6) | std::views::filter(even) | std::views::transform(square)) { std::cout << i << ' '; } } ``` Output: ``` 0 4 16 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards.
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2325R3](https://wg21.link/P2325R3) | C++20 | if `Pred` is not [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable"), the default constructor constructs a `filter_view` which does not contain a `Pred` | the `filter_view` is also not [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable") | ### See also | | | | --- | --- | | [ranges::take\_while\_viewviews::take\_while](take_while_view "cpp/ranges/take while view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the initial elements of another [`view`](view "cpp/ranges/view"), until the first element on which a predicate returns false (class template) (range adaptor object) | cpp std::ranges::views::single, std::ranges::single_view std::ranges::views::single, std::ranges::single\_view ===================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< std::copy_constructible T > requires std::is_object_v<T> class single_view : public ranges::view_interface<single_view<T>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /*unspecified*/ single = /*unspecified*/; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr /*see below*/ single( T&& t ); ``` | | (since C++20) | 1) Produces a [`view`](view "cpp/ranges/view") that contains exactly one element of a specified value. 2) The expression `views::single(e)` is *expression-equivalent* to `single_view<[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<decltype((e))>>(e)` for any suitable subexpression `e`. Typical implementations of `single_view` store a single member of type `[*copyable-box*](copyable_wrapper "cpp/ranges/copyable wrapper")<T>`. For description purposes, this member is hereinafter called `*value\_*`. The lifetime of the element is bound to the parent `single_view`. Copying `single_view` makes a copy of the element. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `views::single` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_single\_fn*`. All instances of `*\_\_single\_fn*` are equal. The effects of invoking different instances of type `*\_\_single\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `views::single` can be copied freely and its copies can be used interchangeably.
Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `views::single` above, `*\_\_single\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__single_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __single_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__single_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __single_fn&, Args...>`. Otherwise, no function call operator of `*\_\_single\_fn*` participates in overload resolution. ### Member functions | | | | --- | --- | | **(constructor)** (C++20) | constructs a `single_view` (public member function) | | **begin** (C++20) | returns a pointer to the element (public member function) | | **end** (C++20) | returns a pointer past the element (public member function) | | **size** [static] (C++20) | returns 1 (one) (public static member function) | | **data** (C++20) | returns a pointer to the element (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::single\_view::single\_view ---------------------------------------- | | | | | --- | --- | --- | | ``` single_view() requires std::default_initializable<T> = default; ``` | (1) | (since C++20) | | ``` constexpr explicit single_view( const T& t ); ``` | (2) | (since C++20) | | ``` constexpr explicit single_view( T&& t ); ``` | (3) | (since C++20) | | ``` template< class... Args > requires std::constructible_from<T, Args...> constexpr explicit single_view( std::in_place_t, Args&&... args ); ``` | (4) | (since C++20) | Constructs a `single_view`. 1) Default initializes `value_`, which value-initializes its contained value. 2) Initializes `value_` with `t`. 3) Initializes `value_` with `std::move(t)`. 
4) Initializes `value_` as if by `value_{[std::in\_place](http://en.cppreference.com/w/cpp/utility/in_place), [std::forward](http://en.cppreference.com/w/cpp/utility/forward)<Args>(args)...}`. std::ranges::single\_view::begin --------------------------------- | | | | | --- | --- | --- | | ``` constexpr T* begin() noexcept; constexpr const T* begin() const noexcept; ``` | | (since C++20) | Equivalent to `return data();`. std::ranges::single\_view::end ------------------------------- | | | | | --- | --- | --- | | ``` constexpr T* end() noexcept; constexpr const T* end() const noexcept; ``` | | (since C++20) | Equivalent to `return data() + 1;`. std::ranges::single\_view::size -------------------------------- | | | | | --- | --- | --- | | ``` static constexpr std::size_t size() noexcept; ``` | | (since C++20) | Equivalent to `return 1;`. This is a static function. This makes `single_view` model `/*tiny-range*/` as required by [`split_view`](split_view "cpp/ranges/split view"). std::ranges::single\_view::data -------------------------------- | | | | | --- | --- | --- | | ``` constexpr T* data() noexcept; constexpr const T* data() const noexcept; ``` | | (since C++20) | Returns a pointer to the contained value of `value_`. The behavior is undefined if `value_` does not contains a value. ### Deduction guides | | | | | --- | --- | --- | | ``` template<class T> single_view(T) -> single_view<T>; ``` | | (since C++20) | ### Notes For a `single_view`, the inherited `empty` member function always returns `false`, and the inherited `operator bool` conversion function always returns `true`. ### Example ``` #include <iostream> #include <iomanip> #include <ranges> #include <string> #include <tuple> int main() { constexpr std::ranges::single_view sv1{3.1415}; // uses (const T&) constructor static_assert(sv1); static_assert(not sv1.empty()); std::cout << "1) *sv1.data(): " << *sv1.data() << '\n' << "2) *sv1.begin(): " << *sv1.begin() << '\n' << "3) sv1.size(): " << sv1.size() << '\n' << "4) distance: " << std::distance(sv1.begin(), sv1.end()) << '\n'; std::string str{"C++20"}; std::cout << "5) str = " << std::quoted(str) << '\n'; std::ranges::single_view sv2{std::move(str)}; // uses (T&&) constructor std::cout << "6) *sv2.data(): " << quoted(*sv2.data()) << '\n' << "7) str = " << quoted(str) << '\n'; std::ranges::single_view<std::tuple<int, double, std::string>> sv3{std::in_place, 42, 3.14, "😄"}; // uses (std::in_place_t, Args&&... args) std::cout << "8) sv3 holds a tuple: { " << std::get<0>(sv3[0]) << ", " << std::get<1>(sv3[0]) << ", " << std::get<2>(sv3[0]) << " }\n"; } ``` Output: ``` 1) *sv1.data(): 3.1415 2) *sv1.begin(): 3.1415 3) sv1.size(): 1 4) distance: 1 5) str = "C++20" 6) *sv2.data(): "C++20" 7) str = "" 8) sv3 holds a tuple: { 42, 3.14, 😄 } ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3428](https://cplusplus.github.io/LWG/issue3428) | C++20 | `single_view` was convertible from `[std::in\_place\_t](../utility/in_place "cpp/utility/in place")` | the constructor is made explicit | | [P2367R0](https://wg21.link/P2367R0) | C++20 | deduction guides for `single_view` failed to decay the argument; `views::single` copied but did not wrap a `single_view` | a decaying guide provided; made always wrapping | ### See also | | | | --- | --- | | [ranges::empty\_viewviews::empty](empty_view "cpp/ranges/empty view") (C++20) | an empty [`view`](view "cpp/ranges/view") with no elements (class template) (variable template) | | [ranges::split\_viewviews::split](split_view "cpp/ranges/split view") (C++20) | a [`view`](view "cpp/ranges/view") over the subranges obtained from splitting another [`view`](view "cpp/ranges/view") using a delimiter (class template) (range adaptor object) |
cpp std::ranges::views::common, std::ranges::common_view std::ranges::views::common, std::ranges::common\_view ===================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::view V > requires (!ranges::common_range<V> && std::copyable<ranges::iterator_t<V>>) class common_view : public ranges::view_interface<common_view<V>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /* unspecified */ common = /* unspecified */; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< ranges::viewable_range R > requires /* see below */ constexpr ranges::view auto common( R&& r ); ``` | | (since C++20) | 1) Adapts a given [`view`](view "cpp/ranges/view") with different types for iterator/sentinel pair into a [`view`](view "cpp/ranges/view") that is also a [`common_range`](common_range "cpp/ranges/common range"). A `common_view` always has the same iterator/sentinel type. 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). Let `e` be a subexpression. Then the expression `views::common(e)` is *expression-equivalent* to: * `[views::all](http://en.cppreference.com/w/cpp/ranges/all_view)(e)`, if it is a well-formed expression and `decltype((e))` models [`common_range`](common_range "cpp/ranges/common range"); * `common_view{e}` otherwise. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Data members Typical implementations of `common_view` only hold one non-static data member: the underlying view of type `V`. ### Member functions | | | | --- | --- | | [(constructor)](common_view/common_view "cpp/ranges/common view/common view") (C++20) | constructs a `common_view` (public member function) | | [base](common_view/base "cpp/ranges/common view/base") (C++20) | returns a copy of the underlying (adapted) view (public member function) | | [begin](common_view/begin "cpp/ranges/common view/begin") (C++20) | returns an iterator to the beginning (public member function) | | [end](common_view/end "cpp/ranges/common view/end") (C++20) | returns an iterator to the end (public member function) | | [size](common_view/size "cpp/ranges/common view/size") (C++20) | returns the number of elements. Provided only if the underlying (adapted) range satisfies [`sized_range`](sized_range "cpp/ranges/sized range"). (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. 
(public member function of `std::ranges::view_interface<D>`) | | [data](view_interface/data "cpp/ranges/view interface/data") (C++20) | Gets the address of derived view's data. Provided if its iterator type satisfies [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator"). (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](common_view/deduction_guides "cpp/ranges/common view/deduction guides") ### Helper templates | | | | | --- | --- | --- | | ``` template<class T> inline constexpr bool enable_borrowed_range<std::ranges::common_view<T>> = std::ranges::enable_borrowed_range<T>; ``` | | (since C++20) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `common_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") when the underlying view satisfies it. ### Notes `common_view` can be useful for working with legacy algorithms that expect the iterator and sentinel are of the same type. 
### Example ``` #include <iostream> #include <iterator> #include <numeric> #include <ranges> #include <list> int main() { auto v1 = { 1, 2, 3, 4, 5 }; auto i1 = std::counted_iterator{v1.begin(), std::ssize(v1)}; auto r1 = std::ranges::subrange{i1, std::default_sentinel}; // auto e1 = std::accumulate(r1.begin(), r1.end(), 0); // error: "common range" required auto c1 = std::ranges::common_view{r1}; std::cout << "accumulate: " << std::accumulate(c1.begin(), c1.end(), 0) << '\n'; // inherited from ranges::view_interface: std::cout << "c1.front(): " << c1.front() << '\n'; std::cout << "c1.back(): " << c1.back() << '\n'; std::cout << "c1.data(): " << c1.data() << '\n'; std::cout << "c1[0]: " << c1[0] << '\n'; auto v2 = std::list{ 1, 2, 3, 4, 5 }; auto i2 = std::counted_iterator{v2.begin(), std::ssize(v2)}; auto r2 = std::ranges::subrange{i2, std::default_sentinel}; // auto e2 = std::accumulate(r2.begin(), r2.end(), 0); // error: "common range" required auto c2 = std::ranges::common_view{r2}; std::cout << "accumulate: " << std::accumulate(c2.begin(), c2.end(), 0) << '\n'; // inherited from ranges::view_interface: std::cout << "c2.front(): " << c2.front() << '\n'; // auto e3 = c2.back(); // error: "contiguous range" required // auto e4 = c2.data(); // error: "contiguous range" required // auto e5 = c2[0]; // error: "contiguous range" required } ``` Possible output: ``` accumulate: 15 c1.front(): 1 c1.back(): 5 c1.data(): 0x400ec0 c1[0]: 1 accumulate: 15 c2.front(): 1 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3494](https://cplusplus.github.io/LWG/issue3494) | C++20 | `common_view` was never a `borrowed_range` | it is a `borrowed_range` if its underlying view is | ### See also | | | | --- | --- | | [ranges::common\_range](common_range "cpp/ranges/common range") (C++20) | specifies that a range has identical iterator and sentinel types (concept) | | [common\_iterator](../iterator/common_iterator "cpp/iterator/common iterator") (C++20) | adapts an iterator type and its sentinel into a common iterator type (class template) | cpp std::ranges::contiguous_range std::ranges::contiguous\_range ============================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class T > concept contiguous_range = ranges::random_access_range<T> && std::contiguous_iterator<ranges::iterator_t<T>> && requires(T& t) { { ranges::data(t) } -> std::same_as<std::add_pointer_t<ranges::range_reference_t<T>>>; }; ``` | | (since C++20) | The `contiguous_range` concept is a refinement of [`range`](range "cpp/ranges/range") for which `ranges::begin` returns a model of [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator") and the customization point `[ranges::data](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/data)` is usable. ### Semantic requirements `T` models `contiguous_range` only if given an expression `e` such that `decltype((e))` is `T&`, `[std::to\_address](http://en.cppreference.com/w/cpp/memory/to_address)([ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(e)) == [ranges::data](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/data)(e)`. 
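As a minimal sketch of the semantic requirement above (assuming a C++20 standard library), for a contiguous range such as `std::vector` the pointer obtained from the beginning iterator and `ranges::data` denote the same element:

```
#include <memory>
#include <ranges>
#include <vector>

int main()
{
    std::vector<int> v{1, 2, 3};
    // std::to_address(ranges::begin(v)) and ranges::data(v) compare equal
    return std::to_address(std::ranges::begin(v)) == std::ranges::data(v) ? 0 : 1;
}
```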
### Example ``` #include <ranges> #include <vector> #include <array> #include <deque> #include <valarray> #include <list> #include <set> template<typename T> concept CR = std::ranges::contiguous_range<T>; int main() { int a[4]; static_assert( CR<std::vector<int>> and not CR<std::vector<bool>> and not CR<std::deque<int>> and CR<std::valarray<int>> and CR<decltype(a)> and not CR<std::list<int>> and not CR<std::set<int>> and CR<std::array<std::list<int>,42>> ); } ``` ### See also | | | | --- | --- | | [ranges::sized\_range](sized_range "cpp/ranges/sized range") (C++20) | specifies that a range knows its size in constant time (concept) | | [ranges::random\_access\_range](random_access_range "cpp/ranges/random access range") (C++20) | specifies a range whose iterator type satisfies [`random_access_iterator`](../iterator/random_access_iterator "cpp/iterator/random access iterator") (concept) | cpp std::ranges::views::empty, std::ranges::empty_view std::ranges::views::empty, std::ranges::empty\_view =================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template<class T> requires std::is_object_v<T> class empty_view : public ranges::view_interface<empty_view<T>> ``` | (1) | (since C++20) | | ``` namespace views { template<class T> inline constexpr empty_view<T> empty{}; } ``` | (2) | (since C++20) | 1) A range factory that produces a [`view`](view "cpp/ranges/view") of no elements of a particular type. 2) Variable template for `empty_view`. ### Member functions | | | | --- | --- | | begin [static] (C++20) | returns `nullptr` (public static member function) | | end [static] (C++20) | returns `nullptr` (public static member function) | | data [static] (C++20) | returns `nullptr` (public static member function) | | size [static] (C++20) | returns `​0​` (zero) (public static member function) | | empty [static] (C++20) | returns `true` (public static member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::empty\_view::begin -------------------------------- | | | | | --- | --- | --- | | ``` static constexpr T* begin() noexcept { return nullptr; } ``` | | (since C++20) | `empty_view` does not reference any element. 
std::ranges::empty\_view::end ------------------------------ | | | | | --- | --- | --- | | ``` static constexpr T* end() noexcept { return nullptr; } ``` | | (since C++20) | `empty_view` does not reference any element. std::ranges::empty\_view::data ------------------------------- | | | | | --- | --- | --- | | ``` static constexpr T* data() noexcept { return nullptr; } ``` | | (since C++20) | `empty_view` does not reference any element. std::ranges::empty\_view::size ------------------------------- | | | | | --- | --- | --- | | ``` static constexpr std::size_t size() noexcept { return 0; } ``` | | (since C++20) | `empty_view` is always empty. std::ranges::empty\_view::empty -------------------------------- | | | | | --- | --- | --- | | ``` static constexpr bool empty() noexcept { return true; } ``` | | (since C++20) | `empty_view` is always empty. ### Helper templates | | | | | --- | --- | --- | | ``` template<class T> inline constexpr bool enable_borrowed_range<ranges::empty_view<T>> = true; ``` | | (since C++20) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `empty_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"). ### Notes Although `empty_view` obtains `front`, `back`, and `operator[]` member functions from `view_interface`, calls to them always result in undefined behavior since an `empty_view` is always empty. The inherited `operator bool` conversion function always returns `false`. ### Example ``` #include <ranges> int main() { std::ranges::empty_view<long> e; static_assert(std::ranges::empty(e)); static_assert(0 == e.size()); static_assert(nullptr == e.data()); static_assert(nullptr == e.begin()); static_assert(nullptr == e.end()); } ``` ### See also | | | | --- | --- | | [ranges::single\_viewviews::single](single_view "cpp/ranges/single view") (C++20) | a [`view`](view "cpp/ranges/view") that contains a single element of a specified value (class template) (customization point object) | | [views::all\_tviews::all](all_view "cpp/ranges/all view") (C++20) | a [`view`](view "cpp/ranges/view") that includes all elements of a [`range`](range "cpp/ranges/range") (alias template) (range adaptor object) | | [ranges::ref\_view](ref_view "cpp/ranges/ref view") (C++20) | a [`view`](view "cpp/ranges/view") of the elements of some other [`range`](range "cpp/ranges/range") (class template) | cpp std::ranges::to std::ranges::to =============== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class C, ranges::input_range R, class... Args > requires (!ranges::view<C>) constexpr C to( R&& r, Args&&... args ); ``` | (1) | (since C++23) | | ``` template< template< class... > class C, ranges::input_range R, class... Args > constexpr auto to( R&& r, Args&&... args ); ``` | (2) | (since C++23) | | ``` template< class C, class... Args > requires (!ranges::view<C>) constexpr /*range adaptor closure*/ to( Args&&... args ); ``` | (3) | (since C++23) | | ``` template< template< class... > class C, class... Args > constexpr /*range adaptor closure*/ to( Args&&... 
args ); ``` | (4) | (since C++23) | | Helper templates | | | | ``` template< class Container > constexpr bool /*reservable-container*/ = ranges::sized_range<Container> && requires (Container& c, ranges::range_size_t<Container> n) { c.reserve(n); { c.capacity() } -> std::same_as<decltype(n)>; { c.max_size() } -> std::same_as<decltype(n)>; }; ``` | (5) | (since C++23) | | ``` template< class Container, class Reference > constexpr bool /*container-insertable*/ = requires (Container& c, Reference&& ref) { requires (requires { c.push_back(std::forward<Reference>(ref)); } || requires { c.insert(c.end(), std::forward<Reference>(ref)); }); }; ``` | (6) | (since C++23) | | ``` template< class Reference, class C > constexpr auto /*container-inserter*/(C& c) { if constexpr (requires { c.push_back(std::declval<Reference>()); }) return std::back_inserter(c); else return std::inserter(c, c.end()); } ``` | (7) | (since C++23) | | ``` template< class R, class T > concept /*container-compatible-range*/ = ranges::input_range<R> && std::convertible_to<ranges::range_reference_t<R>, T>; ``` | (8) | (since C++23) | The overloads of the range conversion function construct a new non-view range object from a source range as its first argument by calling a constructor taking a range, a `std::from_range_t` tagged ranged constructor, a constructor taking an iterator-sentinel pair, or by back inserting each element of the source range into the arguments-constructed object. 1) Constructs an object of type `C` from the elements of `r` in the following: a) If `[std::convertible\_to](http://en.cppreference.com/w/cpp/concepts/convertible_to)<[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<R>, [ranges::range\_value\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<R>>` is `true`: 1) Constructing a non-view range object as if [direct-initializing](../language/direct_initialization "cpp/language/direct initialization") (but not direct-list-initializing) an object of type `C` from the source range `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<R>(r)` and the rest of the functional arguments `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<Args>(args)...` if `[std::constructible\_from](http://en.cppreference.com/w/cpp/concepts/constructible_from)<C, R, Args...>` is `true`. 2) Otherwise, constructing a non-view range object as if [direct-initializing](../language/direct_initialization "cpp/language/direct initialization") (but not direct-list-initializing) an object of type `C` from additional disambiguation tag `std::from_range`, the source range `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<R>(r)` and the rest of the functional arguments `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<Args>(args)...` if `[std::constructible\_from](http://en.cppreference.com/w/cpp/concepts/constructible_from)<C, std::from\_range\_t, R, Args...>` is `true`. 3) Otherwise, constructing a non-view range object as if [direct-initializing](../language/direct_initialization "cpp/language/direct initialization") (but not direct-list-initializing) an object of type `C` from the iterator-sentinel pair (`[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(r)` as iterator and `[ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(r)` as sentinel where iterator and sentinel has the same type. 
In other words, the source range must be a common range), and the rest of function arguments `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<Args>(args)...` if all of the conditions below are `true`: * `[ranges::common\_range](http://en.cppreference.com/w/cpp/ranges/common_range)<R>` * `[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<R>` satisfies [LegacyInputIterator](../named_req/inputiterator "cpp/named req/InputIterator") * `[std::constructible\_from](http://en.cppreference.com/w/cpp/concepts/constructible_from)<C, [ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<R>, [ranges::sentinel\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<R>, Args...>` 4) Otherwise, constructing a non-view range object as if [direct-initializing](../language/direct_initialization "cpp/language/direct initialization") (but not direct-list-initializing) an object of type `C` from the rest of the function arguments `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<Args>(args)...` with the following equivalent call below after the construction: | | | | | --- | --- | --- | | ``` if constexpr (ranges::sized_range<R> && /*reservable-container*/<C>) c.reserve(ranges::size(r)); ranges::copy(r, /*container-inserter*/<ranges::range_reference_t<R>>(c)); ``` | | | If the `R` satisfies [`sized_range`](sized_range "cpp/ranges/sized range") and `C` satisfies `/*reservable-container*/`, the constructed object `c` of type `C` is able to reserve storage with the initial storage size `[ranges::size](http://en.cppreference.com/w/cpp/ranges/size)(r)` to prevent additional allocations during inserting new elements. Each range reference element of `r` is back inserted to `c` through `ranges::copy` with back inserter adaptor. The operations above are valid if both of the conditions below are `true`: * `[std::constructible\_from](http://en.cppreference.com/w/cpp/concepts/constructible_from)<C, Args...>` * `/\*container-insertable\*/<C, [ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<R>>` b) Otherwise, the return expression is equivalent to: | | | | | --- | --- | --- | | ``` to<C>(r | views::transform([](auto&& elem) { return to<ranges::range_value_t<R>>(std::forward<decltype(elem)>(elem)); }, std::forward<Args>(args)...) ``` | | | Which allows nested range constructions within the range if `[ranges::input\_range](http://en.cppreference.com/w/cpp/ranges/input_range)<[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<R>>` is `true`. Otherwise, the program is ill-formed. 2) Constructs an object of deduced type from the elements of `r`. 
Let `/*input-iterator*/` be an exposition only type that satisfies [LegacyInputIterator](../named_req/inputiterator "cpp/named req/InputIterator"): | | | | | --- | --- | --- | | ``` struct /*input-iterator*/ { // exposition only using iterator_category = std::input_iterator_tag; using value_type = ranges::range_value_t<R>; using difference_type = std::ptrdiff_t; using pointer = std::add_pointer_t<ranges::range_reference_t<R>>; using reference = ranges::range_reference_t<R>; reference operator*() const; // not defined pointer operator->() const; // not defined /*input-iterator*/& operator++(); // not defined /*input-iterator*/ operator++(int); // not defined bool operator==(const /*input-iterator*/&) const; // not defined }; ``` | | | Let `/*DEDUCE-EXPR*/` be defined as follows: * `C([std::declval](http://en.cppreference.com/w/cpp/utility/declval)<R>(), [std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...)`, if that expression is valid. * Otherwise, `C(std::from\_range, [std::declval](http://en.cppreference.com/w/cpp/utility/declval)<R>(), [std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...)`, if that expression is valid. * Otherwise, `C([std::declval](http://en.cppreference.com/w/cpp/utility/declval)</\*input-iterator\*/>(), [std::declval](http://en.cppreference.com/w/cpp/utility/declval)</\*input-iterator\*/>(), [std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...)`, if that expression is valid. * Otherwise, the program is ill-formed. The call is equivalent to `to<decltype(/\*DEDUCE-EXPR\*/)>([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<R>(r), [std::forward](http://en.cppreference.com/w/cpp/utility/forward)<Args>(args)...)`. 3-4) Returns a perfect forwarding call wrapper that is also a [range adaptor closure object](../ranges#Range_adaptor_closure_objects "cpp/ranges"). 5) The exposition-only variable template `/*reservable-container*/<Container>` is `true` if it satisfies `[ranges::sized\_range](http://en.cppreference.com/w/cpp/ranges/sized_range)` and is eligible to be reservable. 6) The exposition-only variable template `/*container-insertable*/<Container, Reference>` is `true` if `Container` is back insertable by a member function call `push_back` or `insert`. 7) The exposition-only function template `/*container-inserter*/` returns an output iterator of type `std::back_insert_iterator` if member function `push_back` is available, otherwise the type is `std::insert_iterator`. 8) The exposition-only concept `/*container-compatible-range*/` is used in the definition of containers in constructing an input range `R` and its range reference type must be convertible to `T`. ### Parameters | | | | | --- | --- | --- | | r | - | a source range object | | args | - | list of the arguments to (1-2) construct a range or (3-4) bind to the last parameters of range adaptor closure object. | ### Return value 1-2) a constructed non-view range object 3-4) a range adaptor closure object of unspecified type, with the following properties: ranges::to return type ----------------------- The return type is derived from `ranges::range_adaptor_closure</*return-type*/>`. 
#### Member objects The returned object behaves as if it has no target object, and an `[std::tuple](../utility/tuple "cpp/utility/tuple")` object `tup` constructed with `[std::tuple](http://en.cppreference.com/w/cpp/utility/tuple)<[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<Args>...>([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<Args>(args)...)`, except that the returned object's assignment behavior is unspecified and the names are for exposition only. #### Constructors The return type of `ranges::to` (3-4) behaves as if its copy/move constructors perform a memberwise copy/move. It is [CopyConstructible](../named_req/copyconstructible "cpp/named req/CopyConstructible") if all of its member objects (specified above) are CopyConstructible, and is [MoveConstructible](../named_req/moveconstructible "cpp/named req/MoveConstructible") otherwise. #### Member function `operator()` Given an object `G` obtained from an earlier call to `ranges::to</* see below */>(args...)`, when a glvalue `g` designating `G` is invoked in a function call expression `g(r)`, an invocation of the stored object takes place, as if by: * `ranges::to</\* see below \*/>(r, [std::get](http://en.cppreference.com/w/cpp/utility/variant/get)<Ns>(g.tup)...)`, where + `r` is a source range object that must satisfy [`input_range`](input_range "cpp/ranges/input range") + `Ns` is an integer pack `0, 1, ..., (sizeof...(Args) - 1)` + `g.tup` is an lvalue in the call expression if `g` is an lvalue in the call expression, and is an rvalue otherwise. Thus `std::move(g)(r)` can move the bound arguments into the call, where `g(r)` would copy. + The specified template argument is (3) `C` or (4) the deduced type from a class template `C` that must not satisfy [`view`](view "cpp/ranges/view"). The program is ill-formed if `g` has volatile-qualified type. ### Exceptions Only throws if construction of a range object throws. ### Notes Inserting elements into the container may involve copying, which can be less efficient than moving, because dereferencing the source iterator yields lvalue references. Users can opt in to `views::as_rvalue` to adapt the range so that dereferencing always yields rvalue references, which allows the elements to be moved instead. | [Feature-test](../utility/feature_test "cpp/utility/feature test") macro | Value | Std | | --- | --- | --- | | [`__cpp_lib_ranges_to_container`](../feature_test#Library_features "cpp/feature test") | `202202L` | (C++23) | ### Example
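A minimal usage sketch, assuming a standard library that implements `__cpp_lib_ranges_to_container` (C++23):

```
#include <iostream>
#include <ranges>
#include <vector>

int main()
{
    auto even = [](int i) { return 0 == i % 2; };

    // overloads (1)/(3): convert into an explicitly named container type,
    // here used as a range adaptor closure at the end of a pipeline
    auto v = std::views::iota(0, 10)
           | std::views::filter(even)
           | std::ranges::to<std::vector<int>>();

    // overload (2): the element type is deduced from the source range
    auto w = std::ranges::to<std::vector>(v);

    for (int i : w)
        std::cout << i << ' ';
    std::cout << '\n';
}
```

Output:

```
0 2 4 6 8
```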
cpp std::ranges::views::join, std::ranges::join_view std::ranges::views::join, std::ranges::join\_view ================================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::input_range V > requires ranges::view<V> && ranges::input_range<ranges::range_reference_t<V>> class join_view : public ranges::view_interface<join_view<V>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /* unspecified */ join = /* unspecified */; } ``` | (2) | (since C++20) | 1) A range adaptor that represents [`view`](view "cpp/ranges/view") consisting of the sequence obtained from flattening a view of ranges. 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::join(e)` is *expression-equivalent* to `join_view<[views::all\_t](http://en.cppreference.com/w/cpp/ranges/all_view)<decltype((e))>>{e}` for any suitable subexpressions `e`. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Member functions | | | | --- | --- | | [(constructor)](join_view/join_view "cpp/ranges/join view/join view") (C++20) | constructs a `join_view` (public member function) | | [base](join_view/base "cpp/ranges/join view/base") (C++20) | returns a copy of the underlying (adapted) view (public member function) | | [begin](join_view/begin "cpp/ranges/join view/begin") (C++20) | returns an iterator to the beginning (public member function) | | [end](join_view/end "cpp/ranges/join view/end") (C++20) | returns an iterator or a sentinel to the end (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). 
(public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](join_view/deduction_guides "cpp/ranges/join view/deduction guides") ### Nested classes | | | | --- | --- | | [*iterator*](join_view/iterator "cpp/ranges/join view/iterator") (C++20) | the iterator type (exposition-only member class template) | | [*sentinel*](join_view/sentinel "cpp/ranges/join view/sentinel") (C++20) | the sentinel type (exposition-only member class template) | ### Notes Before [P2328R1](https://wg21.link/P2328R1) was adopted, the inner range type (`[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>`) cannot be a container type (but can be reference to container). For example, it was not allowed to join a [`transform_view`](transform_view "cpp/ranges/transform view") of `[std::string](../string/basic_string "cpp/string/basic string")` prvalue. ``` struct Person { int age; std::string name; }; auto f(std::vector<Person>& v) { // return v | std::views::transform([](auto& p) { return p.name; }) // | std::views::join; // error before P2328R1 return v | std::views::transform([](auto& p) -> std::string& { return p.name; }) | std::views::join; // OK } ``` ### Example ``` #include <iostream> #include <ranges> #include <string_view> #include <vector> int main() { using namespace std::literals; const auto bits = { "https:"sv, "//"sv, "cppreference"sv, "."sv, "com"sv }; for (char const c : bits | std::views::join) std::cout << c; std::cout << '\n'; const std::vector<std::vector<int>> v{ {1,2}, {3,4,5}, {6}, {7,8,9} }; auto jv = std::ranges::join_view(v); for (int const e : jv) std::cout << e << ' '; std::cout << '\n'; } ``` Output: ``` https://cppreference.com 1 2 3 4 5 6 7 8 9 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3474](https://cplusplus.github.io/LWG/issue3474) | C++20 | `views::join(e)` returns a copy of `e` when `e` is a `join_view` | returns a nested `join_view` | | [P2328R1](https://wg21.link/P2328R1) | C++20 | non-view [`range`](range "cpp/ranges/range") prvalues can't be joined by `join_view` | made joinable | ### See also | | | | --- | --- | | [ranges::join\_with\_viewviews::join\_with](join_with_view "cpp/ranges/join with view") (C++23) | a [`view`](view "cpp/ranges/view") consisting of the sequence obtained from flattening a view of ranges, with the delimiter in between elements (class template) (range adaptor object) | | [ranges::split\_viewviews::split](split_view "cpp/ranges/split view") (C++20) | a [`view`](view "cpp/ranges/view") over the subranges obtained from splitting another [`view`](view "cpp/ranges/view") using a delimiter (class template) (range adaptor object) | | [ranges::lazy\_split\_viewviews::lazy\_split](lazy_split_view "cpp/ranges/lazy split view") (C++20) | a [`view`](view "cpp/ranges/view") over the subranges obtained from splitting another [`view`](view "cpp/ranges/view") using a delimiter (class template) (range adaptor object) | cpp std::ranges::borrowed_iterator_t, std::ranges::borrowed_subrange_t std::ranges::borrowed\_iterator\_t, std::ranges::borrowed\_subrange\_t ====================================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::range R > using borrowed_iterator_t = /* see below */; ``` | (1) | (since C++20) | | ``` template< ranges::range R > using borrowed_subrange_t = /* see below */; ``` | (2) | (since C++20) | 1) `std::[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<R>` if `R` models [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"), `[std::ranges::dangling](dangling "cpp/ranges/dangling")` otherwise. 2) `std::[ranges::subrange](http://en.cppreference.com/w/cpp/ranges/subrange)<std::[ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<R>>` if `R` models [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"), `[std::ranges::dangling](dangling "cpp/ranges/dangling")` otherwise. These two alias templates are used by some [constrained algorithms](../algorithm/ranges "cpp/algorithm/ranges") to avoid returning potentially dangling iterators or views. 
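For illustration, a short sketch (assuming a C++20 standard library) of how these aliases appear in the return type of a constrained algorithm such as `ranges::find`:

```
#include <algorithm>
#include <ranges>
#include <type_traits>
#include <vector>

std::vector<int> make() { return {1, 2, 3}; }

int main()
{
    std::vector<int> v{1, 2, 3};

    // an lvalue range is always borrowed, so a real iterator is returned
    [[maybe_unused]] auto it = std::ranges::find(v, 2);
    static_assert(std::is_same_v<decltype(it), std::vector<int>::iterator>);

    // an rvalue vector is not a borrowed_range: an iterator into it would
    // dangle, so std::ranges::dangling is returned instead
    [[maybe_unused]] auto gap = std::ranges::find(make(), 2);
    static_assert(std::is_same_v<decltype(gap), std::ranges::dangling>);
}
```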
### Possible implementation | First version | | --- | | ``` template< std::ranges::range R > using borrowed_iterator_t = std::conditional_t<std::ranges::borrowed_range<R>, std::ranges::iterator_t<R>, std::ranges::dangling>; ``` | | Second version | | ``` template< std::ranges::range R > using borrowed_subrange_t = std::conditional_t<std::ranges::borrowed_range<R>, std::ranges::subrange<std::ranges::iterator_t<R>>, std::ranges::dangling>; ``` | ### See also | | | | --- | --- | | [ranges::dangling](dangling "cpp/ranges/dangling") (C++20) | a placeholder type indicating that an iterator or a `subrange` should not be returned since it would be dangling (class) | cpp std::ranges::views::drop_while, std::ranges::drop_while_view std::ranges::views::drop\_while, std::ranges::drop\_while\_view =============================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::view V, class Pred > requires ranges::input_range<V> && std::is_object_v<Pred> && std::indirect_unary_predicate<const Pred, ranges::iterator_t<V>> class drop_while_view : public ranges::view_interface<drop_while_view<V, Pred>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /*unspecified*/ drop_while = /*unspecified*/; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< ranges::viewable_range R, class Pred > requires /* see below */ constexpr ranges::view auto drop_while( R&& r, Pred&& pred ); ``` | | (since C++20) | | ``` template< class Pred > constexpr /*range adaptor closure*/ drop_while( Pred&& pred ); ``` | | (since C++20) | 1) A range adaptor that represents [`view`](view "cpp/ranges/view") of elements from an underlying sequence, beginning at the first element for which the predicate returns `false`. 2) [Range adaptor objects](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::drop_while(e, f)` is *expression-equivalent* to `drop_while_view(e, f)` for any suitable subexpressions `e` and `f`. `drop_while_view` models the concepts [`contiguous_range`](contiguous_range "cpp/ranges/contiguous range"), [`random_access_range`](random_access_range "cpp/ranges/random access range"), [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), [`forward_range`](forward_range "cpp/ranges/forward range"), [`input_range`](input_range "cpp/ranges/input range"), and [`common_range`](common_range "cpp/ranges/common range") when the underlying view `V` models respective concepts. It also models [`sized_range`](sized_range "cpp/ranges/sized range") if `[ranges::forward\_range](http://en.cppreference.com/w/cpp/ranges/forward_range)<V>` and `[std::sized\_sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sized_sentinel_for)<[ranges::sentinel\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<V>, [ranges::iterator\_t](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/iterator_t)<V>>` are modeled. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). 
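A brief sketch (assuming a C++20 standard library) of the concept propagation described above, using a contiguous, sized underlying range:

```
#include <ranges>
#include <vector>

int main()
{
    std::vector<int> v{-1, -2, 3, 4};
    auto rest = v | std::views::drop_while([](int i) { return i < 0; });

    // vector is contiguous and its iterator/sentinel pair is sized,
    // so the adapted view keeps these properties
    static_assert(std::ranges::contiguous_range<decltype(rest)>);
    static_assert(std::ranges::sized_range<decltype(rest)>);

    return rest.size() == 2 ? 0 : 1;
}
```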
### Member functions | | | | --- | --- | | [(constructor)](drop_while_view/drop_while_view "cpp/ranges/drop while view/drop while view") (C++20) | constructs a `drop_while_view` (public member function) | | [base](drop_while_view/base "cpp/ranges/drop while view/base") (C++20) | returns a copy of the underlying (adapted) view (public member function) | | [pred](drop_while_view/pred "cpp/ranges/drop while view/pred") (C++20) | returns a reference to the stored predicate (public member function) | | [begin](drop_while_view/begin "cpp/ranges/drop while view/begin") (C++20) | returns an iterator to the beginning (public member function) | | [end](drop_while_view/end "cpp/ranges/drop while view/end") (C++20) | returns an iterator or a sentinel to the end (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [data](view_interface/data "cpp/ranges/view interface/data") (C++20) | Gets the address of derived view's data. Provided if its iterator type satisfies [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator"). (public member function of `std::ranges::view_interface<D>`) | | [size](view_interface/size "cpp/ranges/view interface/size") (C++20) | Returns the number of elements in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range") and its sentinel and iterator type satisfy [`sized_sentinel_for`](../iterator/sized_sentinel_for "cpp/iterator/sized sentinel for"). (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). 
(public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](drop_while_view/deduction_guides "cpp/ranges/drop while view/deduction guides") ### Helper templates | | | | | --- | --- | --- | | ``` template<class T, class Pred> inline constexpr bool enable_borrowed_range<std::ranges::drop_while_view<T, Pred>> = std::ranges::enable_borrowed_range<T>; ``` | | (since C++20) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `drop_while_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") when the underlying view satisfies it. ### Notes In order to provide the amortized constant time complexity required by the [`range`](range "cpp/ranges/range") concept, the result of [`begin`](drop_while_view/begin "cpp/ranges/drop while view/begin") is cached within the `drop_while_view` object. If the underlying range is modified after the first call to `begin()`, subsequent uses of the `drop_while_view` object might have unintuitive behavior. ### Example ``` #include <iomanip> #include <iostream> #include <ranges> #include <string> #include <string_view> using std::operator""sv; [[nodiscard]] constexpr bool is_space(char ch) noexcept { switch (ch) { case ' ': case '\t': case '\n': case '\v': case '\r': case '\f': return true; } return false; }; [[nodiscard]] constexpr std::string_view trim_left(std::string_view const in) noexcept { return in | std::views::drop_while(is_space); } [[nodiscard]] std::string trim(std::string_view const in) { auto view = in | std::views::drop_while(is_space) | std::views::reverse | std::views::drop_while(is_space) | std::views::reverse ; return {view.begin(), view.end()}; } int main() { static_assert(trim_left(" \n C++23") == "C++23"sv); const auto s = trim(" \f\n\t\r\vHello, C++20!\f\n\t\r\v "); std::cout << "s = " << std::quoted(s) << '\n'; static constexpr auto v = {0, 1, 2, 3, 4, 5}; for (int n : v | std::views::drop_while([](int i) { return i < 3; })) { std::cout << n << ' '; } } ``` Output: ``` s = "Hello, C++20!" 3 4 5 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3494](https://cplusplus.github.io/LWG/issue3494) | C++20 | `drop_while_view` was never a `borrowed_range` | it is a `borrowed_range` if its underlying view is | ### See also | | | | --- | --- | | [ranges::drop\_viewviews::drop](drop_view "cpp/ranges/drop view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of elements of another [`view`](view "cpp/ranges/view"), skipping the first N elements (class template) (range adaptor object) | cpp std::ranges::data std::ranges::data ================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ data = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::remove_reference_t<ranges::range_reference_t<T>>* data( T&& t ); ``` | | (since C++20) | Returns a pointer to the first element of a contiguous range. 
If the argument is an lvalue or `[ranges::enable\_borrowed\_range](http://en.cppreference.com/w/cpp/ranges/borrowed_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `true`, a call to `ranges::data` is expression-equivalent to: 1. `[std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t).data()` converted to its [decayed type](../types/decay "cpp/types/decay"), if that converted expression is valid, and its return type is a pointer to an object type. 2. Otherwise, `[std::to\_address](http://en.cppreference.com/w/cpp/memory/to_address)([ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t)))`, if `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t))` is valid and returns a type that models `[std::contiguous\_iterator](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator")`. If `[std::remove\_all\_extents\_t](http://en.cppreference.com/w/cpp/types/remove_all_extents)<[std::remove\_reference\_t](http://en.cppreference.com/w/cpp/types/remove_reference)<T>>` is incomplete, then `[ranges::data](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/data)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t))` is ill-formed, no diagnostic required. In all other cases, a call to `ranges::data` is ill-formed, which can result in [substitution failure](../language/sfinae "cpp/language/sfinae") when `[ranges::data](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/data)(e)` appears in the immediate context of a template instantiation. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::data` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_data\_fn*`. All instances of `*\_\_data\_fn*` are equal. The effects of invoking different instances of type `*\_\_data\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::data` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::data` above, `*\_\_data\_fn*` models . 
* `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__data_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __data_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__data_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __data_fn&, Args...>`. Otherwise, no function call operator of `*\_\_data\_fn*` participates in overload resolution. ### Notes If the argument is an rvalue (i.e. `T` is an object type) and `[ranges::enable\_borrowed\_range](http://en.cppreference.com/w/cpp/ranges/borrowed_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `false`, the call to `ranges::data` is ill-formed, which also results in substitution failure. If `[ranges::data](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/data)(e)` is valid for an expression `e`, then it returns a pointer to an object. The C++20 standard requires that if the underlying `data` function call returns a prvalue, the return value is move-constructed from the materialized temporary object. All implementations directly return the prvalue instead. The requirement is corrected by the post-C++20 proposal [P0849R8](https://wg21.link/P0849R8) to match the implementations. ### Example ``` #include <cstring> #include <iostream> #include <ranges> #include <string> int main() { std::string s {"Hello world!\n"}; char a[20]; // storage for a C-style string std::strcpy(a, std::ranges::data(s)); // [data(s), data(s) + size(s)] is guaranteed to be an NTBS std::cout << a; } ``` Output: ``` Hello world! ``` ### See also | | | | --- | --- | | [ranges::cdata](cdata "cpp/ranges/cdata") (C++20) | obtains a pointer to the beginning of a read-only contiguous range (customization point object) | | [ranges::begin](begin "cpp/ranges/begin") (C++20) | returns an iterator to the beginning of a range (customization point object) | | [data](../iterator/data "cpp/iterator/data") (C++17) | obtains the pointer to the underlying array (function template) |
cpp std::ranges::cend std::ranges::cend ================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ cend = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::sentinel_for<ranges::iterator_t<T>> auto cend( T&& t ); ``` | | (since C++20) | Returns a sentinel indicating the end of a const-qualified range. ![range-begin-end.svg]() Let `CT` be. 1. `const [std::remove\_reference\_t](http://en.cppreference.com/w/cpp/types/remove_reference)<T>&` if the argument is a lvalue (i.e. `T` is an lvalue reference type), 2. `const T` otherwise, a call to `ranges::cend` is expression-equivalent to `[ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(static\_cast<CT&&>(t))`. If `[ranges::cend](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/cend)(e)` is valid for an expression `e`, where `decltype((e))` is `T`, then `CT` models `[std::ranges::range](range "cpp/ranges/range")`, and `[std::sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sentinel_for)<S, I>` is `true` in all cases, where `S` is `decltype([ranges::cend](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/cend)(e))`, and `I` is `decltype([ranges::cbegin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/cbegin)(e))`. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::cend` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_cend\_fn*`. All instances of `*\_\_cend\_fn*` are equal. The effects of invoking different instances of type `*\_\_cend\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::cend` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::cend` above, `*\_\_cend\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__cend_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __cend_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__cend_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __cend_fn&, Args...>`. Otherwise, no function call operator of `*\_\_cend\_fn*` participates in overload resolution. 
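A brief supplementary sketch (using only standard declarations): because `ranges::cend` goes through `CT` as described above, calling it on a non-const `std::vector` still yields a constant iterator.

```
#include <ranges>
#include <type_traits>
#include <vector>

int main()
{
    std::vector<int> v{3, 1, 4};

    // cend(v) behaves like end() invoked on a const std::vector<int>&,
    // so the result is a const_iterator even though v itself is non-const.
    static_assert(std::is_same_v<decltype(std::ranges::cend(v)),
                                 std::vector<int>::const_iterator>);
}
```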
### Example ``` #include <algorithm> #include <iostream> #include <ranges> #include <vector> int main() { std::vector<int> v = { 3, 1, 4 }; namespace ranges = std::ranges; if (ranges::find(v, 5) != ranges::cend(v)) { std::cout << "found a 5 in vector v!\n"; } int a[] = { 5, 10, 15 }; if (ranges::find(a, 5) != ranges::cend(a)) { std::cout << "found a 5 in array a!\n"; } } ``` Output: ``` found a 5 in array a! ``` ### See also | | | | --- | --- | | [ranges::end](end "cpp/ranges/end") (C++20) | returns a sentinel indicating the end of a range (customization point object) | | [endcend](../iterator/end "cpp/iterator/end") (C++11)(C++14) | returns an iterator to the end of a container or array (function template) | cpp std::ranges::views::all, std::ranges::views::all_t std::ranges::views::all, std::ranges::views::all\_t =================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline constexpr /* unspecified */ all = /* unspecified */; ``` | (1) | (since C++20) | | ``` template <ranges::viewable_range R> using all_t = decltype(views::all(std::declval<R>())); ``` | (2) | (since C++20) | 1) A [range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges") (also a [range adaptor closure object](../ranges#Range_adaptor_closure_objects "cpp/ranges")) that returns a [`view`](view "cpp/ranges/view") that includes all elements of its [`range`](range "cpp/ranges/range") argument. The expression `views::all(e)` is expression-equivalent (has the same effect) to: * Implicitly converting `e` to a `[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<decltype((e))>` prvalue, if the result type models [`view`](view "cpp/ranges/view"). * Otherwise, `std::[ranges::ref\_view](http://en.cppreference.com/w/cpp/ranges/ref_view){e}` if that expression is well-formed. * Otherwise, `std::ranges::owning_view{e}`. 2) Calculates the suitable [`view`](view "cpp/ranges/view") type of a [`viewable_range`](viewable_range "cpp/ranges/viewable range") type. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Example ``` #include <ranges> #include <vector> #include <iostream> #include <type_traits> int main() { std::vector<int> v{0,1,2,3,4,5}; for(int n : std::views::all(v) | std::views::take(2) ) { std::cout << n << ' '; } static_assert(std::is_same< decltype(std::views::single(42)), std::ranges::single_view<int> >{}); static_assert(std::is_same< decltype(std::views::all(v)), std::ranges::ref_view<std::vector<int, std::allocator<int>>> >{}); int a[]{1,2,3,4}; static_assert(std::is_same< decltype(std::views::all(a)), std::ranges::ref_view<int [4]> >{}); static_assert(std::is_same< decltype(std::ranges::subrange{std::begin(a)+1, std::end(a)-1}), std::ranges::subrange<int*, int*, std::ranges::subrange_kind(1)> >{}); } ``` Output: ``` 0 1 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2415R2](https://wg21.link/P2415R2) | C++20 | `views::all` returned a `subrange` for a non-[`view`](view "cpp/ranges/view") rvalue [`range`](range "cpp/ranges/range") | returns a `owning_view` for it | ### See also | | | | --- | --- | | [ranges::empty\_viewviews::empty](empty_view "cpp/ranges/empty view") (C++20) | an empty [`view`](view "cpp/ranges/view") with no elements (class template) (variable template) | | [ranges::single\_viewviews::single](single_view "cpp/ranges/single view") (C++20) | a [`view`](view "cpp/ranges/view") that contains a single element of a specified value (class template) (customization point object) | | [ranges::owning\_view](owning_view "cpp/ranges/owning view") (C++20) | a [`view`](view "cpp/ranges/view") with unique ownership of some [`range`](range "cpp/ranges/range") (class template) | cpp std::ranges::crend std::ranges::crend ================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ crend = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::sentinel_for<ranges::iterator_t<T>> auto crend( T&& t ); ``` | | (since C++20) | Returns a sentinel indicating the end of a const-qualified range that is treated as a reversed sequence. ![range-rbegin-rend.svg]() Let `CT` be. 1. `const [std::remove\_reference\_t](http://en.cppreference.com/w/cpp/types/remove_reference)<T>&` if the argument is a lvalue (i.e. `T` is an lvalue reference type), 2. `const T` otherwise, a call to `ranges::crend` is expression-equivalent to `[ranges::rend](http://en.cppreference.com/w/cpp/ranges/rend)(static\_cast<CT&&>(t))`. If `[ranges::crend](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/crend)(e)` is valid for an expression `e`, then `[std::sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sentinel_for)<S, I>` is `true` in all cases, where `S` is `decltype([ranges::crend](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/crend)(e))`, and `I` is `decltype([ranges::crbegin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/crbegin)(e))`. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::crend` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_crend\_fn*`. All instances of `*\_\_crend\_fn*` are equal. 
The effects of invoking different instances of type `*\_\_crend\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::crend` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::crend` above, `*\_\_crend\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__crend_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __crend_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__crend_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __crend_fn&, Args...>`. Otherwise, no function call operator of `*\_\_crend\_fn*` participates in overload resolution. ### Example ``` #include <iostream> #include <vector> #include <iterator> #include <algorithm> int main() { int a[] = {4, 6, -3, 9, 10}; std::cout << "Array backwards: "; namespace ranges = std::ranges; ranges::copy(ranges::rbegin(a), ranges::rend(a), std::ostream_iterator<int>(std::cout, " ")); std::cout << "\nVector backwards: "; std::vector<int> v = {4, 6, -3, 9, 10}; ranges::copy(ranges::rbegin(v), ranges::rend(v), std::ostream_iterator<int>(std::cout, " ")); } ``` Output: ``` Array backwards: 10 9 -3 6 4 Vector backwards: 10 9 -3 6 4 ``` ### See also | | | | --- | --- | | [ranges::rend](rend "cpp/ranges/rend") (C++20) | returns a reverse end iterator to a range (customization point object) | | [rendcrend](../iterator/rend "cpp/iterator/rend") (C++14) | returns a reverse end iterator for a container or array (function template) | cpp std::ranges::views::zip, std::ranges::zip_view std::ranges::views::zip, std::ranges::zip\_view =============================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::input_range... Views > requires (ranges::view<Views> && ...) && (sizeof...(Views) > 0) class zip_view : public ranges::view_interface<zip_view<Views...>> ``` | (1) | (since C++23) | | ``` namespace views { inline constexpr /*unspecified*/ zip = /*unspecified*/; } ``` | (2) | (since C++23) | | Call signature | | | | ``` template< ranges::viewable_range... Rs > requires /* see below */ constexpr auto zip( Rs&&... rs ); ``` | | (since C++23) | | Helper alias templates | | | | ``` template< class... Ts > using /*tuple-or-pair*/ = /* see description */; // exposition only ``` | (3) | (since C++23) | | Helper concepts | | | | ``` template< class... Rs > concept /*zip-is-common*/ = // exposition only (sizeof...(Rs) == 1 && (ranges::common_range<Rs> && ...)) || (!(ranges::bidirectional_range<Rs> && ...) && (ranges::common_range<Rs> && ...)) || ((ranges::random_access_range<Rs> && ...) && (ranges::sized_range<Rs> && ...)); ``` | (4) | (since C++23) | | Helper function templates | | | | ``` template< class F, class Tuple > constexpr auto /*tuple-transform*/( F&& f, Tuple&& tuple ) { // exposition only return std::apply([&]<class... Ts>(Ts&&... elements) { return /*tuple-or-pair*/<std::invoke_result_t<F&, Ts>...>( std::invoke(f, std::forward<Ts>(elements))... 
); }, std::forward<Tuple>(tuple)); } ``` | (5) | (since C++23) | | ``` template< class F, class Tuple > constexpr void /*tuple-for-each*/( F&& f, Tuple&& tuple ) { // exposition only std::apply([&]<class... Ts>(Ts&&... elements) { (std::invoke(f, std::forward<Ts>(elements)), ...); }, std::forward<Tuple>(tuple)); } ``` | (6) | (since C++23) | 1) `zip_view` is a range adaptor that takes one or more [`view`s](view "cpp/ranges/view"), and produces a [`view`](view "cpp/ranges/view") whose `*i*`th element is a tuple-like value consisting of the `*i*`th elements of all views. The size of produced view is the minimum of sizes of all adapted views. 2) `views::zip` is a customization point object. When calling with no argument, `views::zip()` is *expression-equivalent* to `auto([views::empty](http://en.cppreference.com/w/cpp/ranges/empty_view)<[std::tuple](http://en.cppreference.com/w/cpp/utility/tuple)<>>)`. Otherwise, `views::zip(rs...)` is. *expression-equivalent* to `ranges::zip\_view<[views::all\_t](http://en.cppreference.com/w/cpp/ranges/all_view)<decltype((rs))>...>(rs...)`. 3) Let `Ts` be some pack of types, then `/*tuple-or-pair*/<Ts...>` denotes: * `[std::pair](http://en.cppreference.com/w/cpp/utility/pair)<Ts...>`, if `sizeof...(Ts)` is 2, * `[std::tuple](http://en.cppreference.com/w/cpp/utility/tuple)<Ts...>` otherwise. `zip_view` always models [`input_range`](input_range "cpp/ranges/input range"), and models [`forward_range`](forward_range "cpp/ranges/forward range"), [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), [`random_access_range`](random_access_range "cpp/ranges/random access range"), or [`sized_range`](sized_range "cpp/ranges/sized range") if all adapted [`view`](view "cpp/ranges/view") types model the corresponding concept. `zip_view` models [`common_range`](common_range "cpp/ranges/common range") if. * `sizeof...(Views)` is equal to 1, and the only adapted view type models [`common_range`](common_range "cpp/ranges/common range"), or * at least one adapted view type does not model [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), and every adapted view type models [`common_range`](common_range "cpp/ranges/common range"), or * every adapted view type models both [`random_access_range`](random_access_range "cpp/ranges/random access range") and [`sized_range`](sized_range "cpp/ranges/sized range"). ### Customization point objects The name `views::zip` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_zip\_fn*`. All instances of `*\_\_zip\_fn*` are equal. The effects of invoking different instances of type `*\_\_zip\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `views::zip` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `views::zip` above, `*\_\_zip\_fn*` models . 
* `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__zip_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __zip_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__zip_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __zip_fn&, Args...>`. Otherwise, no function call operator of `*\_\_zip\_fn*` participates in overload resolution. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Data members Typical implementations of `zip_view` hold only one non-static data member: a `[std::tuple](http://en.cppreference.com/w/cpp/utility/tuple)<Views...>` holding all adapted view objects. For the purpose of exposition, the view objects in that `[std::tuple](../utility/tuple "cpp/utility/tuple")` are shown as `*vs\_*...` here. ### Member functions | | | | --- | --- | | [(constructor)](zip_view/zip_view "cpp/ranges/zip view/zip view") (C++23) | constructs a `zip_view` (public member function) | | [begin](zip_view/begin "cpp/ranges/zip view/begin") (C++23) | returns an iterator to the beginning (public member function) | | [end](zip_view/end "cpp/ranges/zip view/end") (C++23) | returns an iterator or a sentinel to the end (public member function) | | [size](zip_view/size "cpp/ranges/zip view/size") (C++23) | returns the number of elements. Provided only if each underlying (adapted) range satisfies [`sized_range`](sized_range "cpp/ranges/sized range"). (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). 
(public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](zip_view/deduction_guides "cpp/ranges/zip view/deduction guides") ### Nested classes | | | | --- | --- | | [*iterator*](zip_view/iterator "cpp/ranges/zip view/iterator") (C++23) | the iterator type (exposition-only member class template) | | [*sentinel*](zip_view/sentinel "cpp/ranges/zip view/sentinel") (C++23) | the sentinel type used when `zip_view` is not a [`common_range`](common_range "cpp/ranges/common range") (exposition-only member class template) | ### Helper templates | | | | | --- | --- | --- | | ``` template< class... Views > inline constexpr bool enable_borrowed_range<ranges::zip_view<Views...>> = (ranges::enable_borrowed_range<Views> && ...); ``` | | (since C++23) | This specialization of [`ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes **`zip_view`** satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") when each underlying view satisfies it. ### Notes | [Feature-test](../utility/feature_test "cpp/utility/feature test") macro | Value | Std | | --- | --- | --- | | [`__cpp_lib_ranges_zip`](../feature_test#Library_features "cpp/feature test") | `202110L` | (C++23) | ### Example Can be checked online on [Compiler Explorer site](https://godbolt.org/z/KMPq1KP6Y). ``` #include <list> #include <array> #include <tuple> #include <ranges> #include <vector> #include <string> #include <iostream> void print(auto const rem, auto const& range) { for (std::cout << rem; auto const& elem : range) std::cout << elem << ' '; std::cout << '\n'; } int main() { auto x = std::vector{1, 2, 3, 4}; auto y = std::list<std::string>{"α", "β", "γ", "δ", "ε"}; auto z = std::array{'A', 'B', 'C', 'D', 'E', 'F'}; print("Source views:", ""); print("x: ", x); print("y: ", y); print("z: ", z); print("\nzip(x,y,z):", ""); for (std::tuple<int&, std::string&, char&> elem : std::views::zip(x, y, z)) { std::cout << std::get<0>(elem) << ' ' << std::get<1>(elem) << ' ' << std::get<2>(elem) << '\n'; std::get<char&>(elem) += ('a' - 'A'); // modifies the element of z } print("\nAfter modification, z: ", z); } ``` Output: ``` Source views: x: 1 2 3 4 y: α β γ δ ε z: A B C D E F zip(x,y,z): 1 α A 2 β B 3 γ C 4 δ D After modification, z: a b c d E F ``` ### See also | | | | --- | --- | | [ranges::zip\_transform\_viewviews::zip\_transform](zip_transform_view "cpp/ranges/zip transform view") (C++23) | a [`view`](view "cpp/ranges/view") consisting of tuples of results of application of a transformation function to corresponding elements of the adapted views (class template) (customization point object) | | [ranges::elements\_viewviews::elements](elements_view "cpp/ranges/elements view") (C++20) | takes a [`view`](view "cpp/ranges/view") consisting of tuple-like values and a number N and produces a [`view`](view "cpp/ranges/view") of N'th element of each tuple (class template) (range adaptor object) |
cpp std::ranges::forward_range std::ranges::forward\_range =========================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class T > concept forward_range = ranges::input_range<T> && std::forward_iterator<ranges::iterator_t<T>>; ``` | | (since C++20) | The `forward_range` concept is a refinement of [`range`](range "cpp/ranges/range") for which `ranges::begin` returns a model of [`forward_iterator`](../iterator/forward_iterator "cpp/iterator/forward iterator"). cpp std::ranges::views::values, std::ranges::values_view std::ranges::views::values, std::ranges::values\_view ===================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class R > using values_view = ranges::elements_view<R, 1>; ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr auto values = ranges::elements<1>; } ``` | (2) | (since C++20) | Takes a [`view`](view "cpp/ranges/view") of *tuple-like* values (e.g. `[std::tuple](../utility/tuple "cpp/utility/tuple")` or `[std::pair](../utility/pair "cpp/utility/pair")`), and produces a view with a *value-type* of the *second* element of the adapted view's value-type. 1) An alias for `[ranges::elements\_view](http://en.cppreference.com/w/cpp/ranges/elements_view)<R, 1>`. 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::values(e)` is *expression-equivalent* to `values_view<[views::all\_t](http://en.cppreference.com/w/cpp/ranges/all_view)<decltype((e))>>{e}` for any suitable subexpression `e`. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Notes `values_view` can be useful for extracting *values* from associative containers, e.g. `for (auto const& value : std::views::values(map)) { /*...*/ }`. ### Example ``` #include <iostream> #include <ranges> #include <map> int main() { const auto list = { std::pair{1, 11.1} , {2, 22.2}, {3, 33.3} }; std::cout << "pair::second values in the list: "; for (double value : list | std::views::values) std::cout << value << ' '; std::map<char, int> map{ {'A', 1}, {'B', 2}, {'C', 3}, {'D', 4}, {'E', 5} }; auto odd = [](int x) { return 0 != (x & 1); }; std::cout << "\nodd values in the map: "; for (int value : map | std::views::values | std::views::filter(odd)) std::cout << value << ' '; } ``` Output: ``` pair::second values in the list: 11.1 22.2 33.3 odd values in the map: 1 3 5 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3563](https://cplusplus.github.io/LWG/issue3563) | C++20 | `keys_view` is unable to participate in CTAD due to its use of `views::all_t` | `views::all_t` removed | ### See also | | | | --- | --- | | [ranges::keys\_viewviews::keys](keys_view "cpp/ranges/keys view") (C++20) | takes a [`view`](view "cpp/ranges/view") consisting of pair-like values and produces a [`view`](view "cpp/ranges/view") of the first elements of each pair (class template) (range adaptor object) | | [ranges::elements\_viewviews::elements](elements_view "cpp/ranges/elements view") (C++20) | takes a [`view`](view "cpp/ranges/view") consisting of tuple-like values and a number N and produces a [`view`](view "cpp/ranges/view") of N'th element of each tuple (class template) (range adaptor object) | | [slice](../numeric/valarray/slice "cpp/numeric/valarray/slice") | BLAS-like slice of a valarray: starting index, length, stride (class) | cpp std::ranges::cbegin std::ranges::cbegin =================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ cbegin = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::input_or_output_iterator auto cbegin( T&& t ); ``` | | (since C++20) | Returns an iterator to the first element of the const-qualified argument. ![range-begin-end.svg]() Let `CT` be. 1. `const [std::remove\_reference\_t](http://en.cppreference.com/w/cpp/types/remove_reference)<T>&` if the argument is a lvalue (i.e. `T` is an lvalue reference type), 2. `const T` otherwise, a call to `ranges::cbegin` is expression-equivalent to `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(static\_cast<CT&&>(t))`. The return type models `[std::input\_or\_output\_iterator](../iterator/input_or_output_iterator "cpp/iterator/input or output iterator")` in both cases. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::cbegin` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_cbegin\_fn*`. All instances of `*\_\_cbegin\_fn*` are equal. The effects of invoking different instances of type `*\_\_cbegin\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::cbegin` can be copied freely and its copies can be used interchangeably. 
Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::cbegin` above, `*\_\_cbegin\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__cbegin_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __cbegin_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__cbegin_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __cbegin_fn&, Args...>`. Otherwise, no function call operator of `*\_\_cbegin\_fn*` participates in overload resolution. ### Example ``` #include <iostream> #include <ranges> #include <vector> int main() { std::vector<int> v = { 3, 1, 4 }; auto vi = std::ranges::cbegin(v); std::cout << *vi << '\n'; // *vi = 42; // Error: read-only variable is not assignable int a[] = { -5, 10, 15 }; auto ai = std::ranges::cbegin(a); std::cout << *ai << '\n'; // *ai = 42; // Error: read-only variable is not assignable } ``` Output: ``` 3 -5 ``` ### See also | | | | --- | --- | | [ranges::begin](begin "cpp/ranges/begin") (C++20) | returns an iterator to the beginning of a range (customization point object) | | [begincbegin](../iterator/begin "cpp/iterator/begin") (C++11)(C++14) | returns an iterator to the beginning of a container or array (function template) | cpp std::ranges::dangling std::ranges::dangling ===================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` struct dangling; ``` | | (since C++20) | `dangling` is a placeholder type and an empty class type, used together with the template aliases [`ranges::borrowed_iterator_t`](borrowed_iterator_t "cpp/ranges/borrowed iterator t") and [`ranges::borrowed_subrange_t`](borrowed_iterator_t "cpp/ranges/borrowed iterator t"). When some [constrained algorithms](../algorithm/ranges "cpp/algorithm/ranges") that usually return an iterator or a subrange of a [`range`](range "cpp/ranges/range") take a particular rvalue `range` argument that does not model [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"), `dangling` will be returned instead to avoid returning potentially dangling results. ### Member functions std::ranges::dangling::dangling -------------------------------- | | | | | --- | --- | --- | | ``` constexpr dangling() noexcept = default; ``` | (1) | | | ``` template<class... Args> constexpr dangling(Args&&...) noexcept { } ``` | (2) | | 1) `dangling` is trivially default constructible. 2) `dangling` can be constructed from arguments of arbitrary number and arbitrary non-void type. The construction does not have any side-effect itself. In other words, after replacing the type (e.g. an iterator type) in a well-formed non-aggregate initialization with `dangling`, the resulting initialization is also well-formed. 
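A short supplementary sketch (using only standard declarations): the alias `ranges::borrowed_iterator_t` mentioned above collapses to `dangling` exactly when the range argument is an rvalue type that does not model `borrowed_range`.

```
#include <ranges>
#include <string_view>
#include <type_traits>
#include <vector>

// An rvalue std::vector is not a borrowed_range, so the alias names ranges::dangling.
static_assert(std::is_same_v<std::ranges::borrowed_iterator_t<std::vector<int>>,
                             std::ranges::dangling>);

// std::string_view is a borrowed_range, so the real iterator type is kept.
static_assert(std::is_same_v<std::ranges::borrowed_iterator_t<std::string_view>,
                             std::string_view::iterator>);

int main() {}
```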
### Example ``` #include <algorithm> #include <array> #include <iostream> #include <ranges> #include <type_traits> #include <string_view> int main() { auto get_array_by_value = [] { return std::array{0, 1, 0, 1}; }; auto dangling_iter = std::ranges::max_element(get_array_by_value()); static_assert(std::is_same_v<std::ranges::dangling, decltype(dangling_iter)>); // std::cout << *dangling_iter << '\n'; // compilation error: no match for 'operator*' // (operand type is 'std::ranges::dangling') auto get_persistent_array = []() -> const std::array<int, 4>& { static constexpr std::array a{0, 1, 0, 1}; return a; }; auto valid_iter = std::ranges::max_element(get_persistent_array()); static_assert(!std::is_same_v<std::ranges::dangling, decltype(valid_iter)>); std::cout << *valid_iter << ' '; // 1 auto get_string_view = [] { return std::string_view{"alpha"}; }; auto valid_iter2 = std::ranges::min_element(get_string_view()); // OK: std::basic_string_view models borrowed_range static_assert(!std::is_same_v<std::ranges::dangling, decltype(valid_iter2)>); std::cout << '\'' << *valid_iter2 << '\'' << '\n'; // 'a' } ``` Output: ``` 1 'a' ``` ### See also | | | | --- | --- | | [ranges::borrowed\_iterator\_tranges::borrowed\_subrange\_t](borrowed_iterator_t "cpp/ranges/borrowed iterator t") (C++20) | obtains iterator type or `subrange` type of a [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") (alias template) | | [ranges::borrowed\_range](borrowed_range "cpp/ranges/borrowed range") (C++20) | specifies that a type is a [`range`](range "cpp/ranges/range") and iterators obtained from an expression of it can be safely returned without danger of dangling (concept) | cpp std::ranges::view, std::ranges::enable_view, std::ranges::view_base std::ranges::view, std::ranges::enable\_view, std::ranges::view\_base ===================================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template<class T> concept view = ranges::range<T> && std::movable<T> && ranges::enable_view<T>; ``` | (1) | (since C++20) | | ``` template<class T> inline constexpr bool enable_view = std::derived_from<T, view_base> || /*is-derived-from-view-interface*/<T>; ``` | (2) | (since C++20) | | ``` struct view_base { }; ``` | (3) | (since C++20) | 1) The `view` concept specifies the requirements of a [`range`](range "cpp/ranges/range") type that has suitable semantic properties for use in constructing range adaptor pipelines. 2) The `enable_view` variable template is used to indicate whether a [`range`](range "cpp/ranges/range") is a `view`. `/*is-derived-from-view-interface*/<T>` is `true` if and only if `T` has exactly one public base class `[ranges::view\_interface](http://en.cppreference.com/w/cpp/ranges/view_interface)<U>` for some type `U`, and `T` has no base classes of type `[ranges::view\_interface](http://en.cppreference.com/w/cpp/ranges/view_interface)<V>` for any other type `V`. Users may specialize `enable_view` to `true` for cv-unqualified program-defined types which model `view`, and `false` for types which do not. Such specializations must be [usable in constant expressions](../language/constant_expression#Usable_in_constant_expressions "cpp/language/constant expression") and have type `const bool`. 3) Deriving from `view_base` enables [`range`](range "cpp/ranges/range") types to model `view`. 
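A minimal illustrative sketch of the opt-in mechanisms just listed (the type `MyChars` is invented here for exposition): a type publicly derived from `view_interface` is picked up by `enable_view`, while an ordinary container remains a `range` that is not a `view`.

```
#include <ranges>
#include <vector>

// Exposition-only type: deriving from view_interface<MyChars> makes
// enable_view<MyChars> true via the is-derived-from-view-interface check.
struct MyChars : std::ranges::view_interface<MyChars>
{
    const char* begin() const { return data_; }
    const char* end() const { return data_ + 3; }
    const char* data_ = "abc";
};

static_assert(std::ranges::enable_view<MyChars>);
static_assert(std::ranges::view<MyChars>);           // movable range with enable_view == true
static_assert(!std::ranges::view<std::vector<int>>); // a range, but not a view

int main() {}
```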
### Semantic requirements 1) `T` models `view` only if: * move construction of `T` has constant time complexity, and * if N copies and/or moves are made from a `T` object holding M elements, then these N objects have 𝓞(N+M) destruction (which implies that a moved-from `view` object has 𝓞(1) destruction), and * either `[std::copy\_constructible](http://en.cppreference.com/w/cpp/concepts/copy_constructible)<T>` is `false`, or copy construction of `T` has constant time complexity, and * either `[std::copyable](http://en.cppreference.com/w/cpp/concepts/copyable)<T>` is `false`, or copy assignment of `T` has no more time complexity than destruction followed by copy construction. ### Notes Examples of `view` types are: * A [`range`](range "cpp/ranges/range") type that wraps a pair of iterators, e.g., `std::[ranges::subrange](http://en.cppreference.com/w/cpp/ranges/subrange)<I>`. * A [`range`](range "cpp/ranges/range") type that holds its elements by `[std::shared\_ptr](../memory/shared_ptr "cpp/memory/shared ptr")` and shares ownership with all its copies. * A [`range`](range "cpp/ranges/range") type that generates its elements on demand, e.g., `[std::ranges::iota\_view](iota_view "cpp/ranges/iota view")`. A copyable container such as `[std::vector](http://en.cppreference.com/w/cpp/container/vector)<[std::string](http://en.cppreference.com/w/cpp/string/basic_string)>` generally does not meet the semantic requirements of `view` since copying the container copies all of the elements, which cannot be done in constant time. While views were originally described as cheaply copyable and non-owning ranges, a type is not required to be copyable or non-owning for it to model `view`. However, it must still be cheap to copy (if it is copyable), move, assign, and destroy, so that [range adaptors](../ranges#Range_adaptors "cpp/ranges") will not have unexpected complexity. By default, a type modeling [`movable`](../concepts/movable "cpp/concepts/movable") and [`range`](range "cpp/ranges/range") is considered a view if it is publicly and unambiguously derived from `view_base`, or exactly one specialization of `std::ranges::view_interface`. ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2325R3](https://wg21.link/P2325R3) | C++20 | `view` required [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable") | does not require | | [LWG 3549](https://cplusplus.github.io/LWG/issue3549) | C++20 | `enable_view` did not detect inheritance from `view_interface` | detects | | [P2415R2](https://wg21.link/P2415R2) | C++20 | the restriction on the time complexity of destruction was too strict | relaxed | cpp std::ranges::rend std::ranges::rend ================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ rend = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr std::sentinel_for<decltype(ranges::rbegin(std::declval<T>()))> auto rend( T&& t ); ``` | | (since C++20) | Returns a sentinel indicating the end of a reversed range. 
![range-rbegin-rend.svg]() Let `t` be an object of type `T`. If the argument is an lvalue or `[ranges::enable\_borrowed\_range](http://en.cppreference.com/w/cpp/ranges/borrowed_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `true`, then a call to `ranges::rend` is expression-equivalent to: 1. `t.rend()` converted to its [decayed type](../types/decay "cpp/types/decay"), if that expression with conversion is valid, and its converted type models `[std::sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sentinel_for)<decltype([ranges::rbegin](http://en.cppreference.com/w/cpp/ranges/rbegin)([std::declval](http://en.cppreference.com/w/cpp/utility/declval)<T>()))>`. 2. Otherwise, `rend(t)` converted to its [decayed type](../types/decay "cpp/types/decay"), if `T` is a class or enumeration type, the aforementioned unqualified call with conversion is valid, and its converted type models `[std::sentinel\_for](http://en.cppreference.com/w/cpp/iterator/sentinel_for)<decltype([ranges::rbegin](http://en.cppreference.com/w/cpp/ranges/rbegin)([std::declval](http://en.cppreference.com/w/cpp/utility/declval)<T>()))>`, where the [overload resolution](../language/overload_resolution "cpp/language/overload resolution") is performed with the following candidates: * `void rend(auto&) = delete;` * `void rend(const auto&) = delete;` * any declarations of `rend` found by [argument-dependent lookup](../language/adl "cpp/language/adl"). 3. Otherwise, `[std::make\_reverse\_iterator](http://en.cppreference.com/w/cpp/iterator/make_reverse_iterator)([ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(t))` if both `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(t)` and `[ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(t)` are valid expressions, have the same type, and that type models `[std::bidirectional\_iterator](../iterator/bidirectional_iterator "cpp/iterator/bidirectional iterator")`. In all other cases, a call to `ranges::rend` is ill-formed, which can result in [substitution failure](../language/sfinae "cpp/language/sfinae") when `ranges::rend(t)` appears in the immediate context of a template instantiation. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::rend` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_rend\_fn*`. All instances of `*\_\_rend\_fn*` are equal. The effects of invoking different instances of type `*\_\_rend\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). 
Thus, `ranges::rend` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::rend` above, `*\_\_rend\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__rend_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __rend_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__rend_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __rend_fn&, Args...>`. Otherwise, no function call operator of `*\_\_rend\_fn*` participates in overload resolution. ### Notes If the argument is an rvalue (i.e. `T` is an object type) and `[ranges::enable\_borrowed\_range](http://en.cppreference.com/w/cpp/ranges/borrowed_range)<[std::remove\_cv\_t](http://en.cppreference.com/w/cpp/types/remove_cv)<T>>` is `false`, or if it is of an array type of unknown bound, the call to `ranges::rend` is ill-formed, which also results in substitution failure. If `ranges::rend([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t))` is valid, then `decltype(ranges::rend([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t)))` and `decltype([ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t)))` model `[std::sentinel\_for](../iterator/sentinel_for "cpp/iterator/sentinel for")` in all cases, while `T` models `[std::ranges::range](range "cpp/ranges/range")`. The C++20 standard requires that if the underlying `rend` function call returns a prvalue, the return value is move-constructed from the materialized temporary object. All implementations directly return the prvalue instead. The requirement is corrected by the post-C++20 proposal [P0849R8](https://wg21.link/P0849R8) to match the implementations. ### Example ``` #include <algorithm> #include <iostream> #include <ranges> #include <vector> int main() { std::vector<int> v = { 3, 1, 4 }; namespace ranges = std::ranges; if (ranges::find(ranges::rbegin(v), ranges::rend(v), 5) != ranges::rend(v)) { std::cout << "found a 5 in vector `v`!\n"; } int a[] = { 5, 10, 15 }; if (ranges::find(ranges::rbegin(a), ranges::rend(a), 5) != ranges::rend(a)) { std::cout << "found a 5 in array `a`!\n"; } } ``` Output: ``` found a 5 in array `a`! ``` ### See also | | | | --- | --- | | [ranges::crend](crend "cpp/ranges/crend") (C++20) | returns a reverse end iterator to a read-only range (customization point object) | | [ranges::rbegin](rbegin "cpp/ranges/rbegin") (C++20) | returns a reverse iterator to a range (customization point object) | | [rendcrend](../iterator/rend "cpp/iterator/rend") (C++14) | returns a reverse end iterator for a container or array (function template) |
cpp std::ranges::views::repeat, std::ranges::repeat_view std::ranges::views::repeat, std::ranges::repeat\_view ===================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< std::move_constructible W, std::semiregular Bound = std::unreachable_sentinel_t > requires (std::is_object_v<W> && std::same_as<W, std::remove_cv_t<W>> && (/*is-integer-like*/<Bound> || std::same_as<Bound, std::unreachable_sentinel_t>)) class repeat_view : public ranges::view_interface<repeat_view<W, Bound>> ``` | (1) | (since C++23) | | ``` namespace views { inline constexpr /*unspecified*/ repeat = /*unspecified*/; } ``` | (2) | (since C++23) | | Call signature | | | | ``` template< class W > requires /* see below */ constexpr /* see below */ repeat( W&& value ); ``` | | (since C++23) | | ``` template< class W, class Bound > requires /* see below */ constexpr /* see below */ repeat( W&& value, Bound&& bound ); ``` | | (since C++23) | 1) A range factory that generates a sequence of elements by repeatedly producing the same value. Can be either bounded or unbounded (infinite). 2) `views::repeat(e)` and `views::repeat(e, f)` are *expression-equivalent* to (has the same effect as) `repeat_view(e)` and `repeat_view(e, f)` respectively for any suitable subexpressions `e` and `f`. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `views::repeat` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_repeat\_fn*`. All instances of `*\_\_repeat\_fn*` are equal. The effects of invoking different instances of type `*\_\_repeat\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `views::repeat` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `views::repeat` above, `*\_\_repeat\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__repeat_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __repeat_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__repeat_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __repeat_fn&, Args...>`. Otherwise, no function call operator of `*\_\_repeat\_fn*` participates in overload resolution. 
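A minimal usage sketch restating the call signatures above (the page's full example appears further below): the two-argument form produces a bounded, sized view, while the one-argument form is unbounded and is typically cut down with `views::take`.

```
#include <cassert>
#include <ranges>

int main()
{
    // Bounded overload: the value 7 repeated exactly three times.
    auto bounded = std::views::repeat(7, 3);
    assert(bounded.size() == 3);
    assert(*bounded.begin() == 7);

    // Unbounded overload: an infinite view, restricted to a finite prefix.
    auto prefix = std::views::repeat(7) | std::views::take(3);
    assert(std::ranges::distance(prefix) == 3);
}
```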
### Data members Typical implementations of `repeat_view` hold two non-static data members: the beginning value `value_` of wrapped type `/*movable-box*/<W>` and the sentinel value `bound_` of type `Bound`. The names shown here are exposition-only. ### Member functions | | | | --- | --- | | (constructor) (C++23) | creates a `repeat_view` (public member function) | | begin (C++23) | obtains the beginning iterator of a `repeat_view` (public member function) | | end (C++23) | obtains the sentinel denoting the end of a `repeat_view` (public member function) | | size (C++23) | obtains the size of a `repeat_view` if it is sized (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::repeat\_view::repeat\_view ---------------------------------------- | | | | | --- | --- | --- | | ``` repeat_view() requires std::default_initializable<W> = default; ``` | (1) | (since C++23) | | ``` constexpr explicit repeat_view( const W& value, Bound bound = Bound() ); ``` | (2) | (since C++23) | | ``` constexpr explicit repeat_view( W&& value, Bound bound = Bound() ); ``` | (3) | (since C++23) | | ``` template < class... WArgs, class... BoundArgs > requires std::constructible_from<W, WArgs...> && std::constructible_from<Bound, BoundArgs...> constexpr explicit repeat_view( std::piecewise_construct_t, std::tuple<WArgs...> value_args, std::tuple<BoundArgs...> bound_args = std::tuple<>{} ); ``` | (4) | (since C++23) | 1) Value-initializes `value_` and `bound_` via their default member initializers (`= W()` and `= Bound()`). 2) Initializes `value_` with `value` and initializes `bound_` with `bound`. The behavior is undefined if `Bound` is not `[std::unreachable\_sentinel\_t](http://en.cppreference.com/w/cpp/iterator/unreachable_sentinel_t)` and `bool(bound >= 0)` is `false`. 3) Initializes `value_` with `std::move(value)` and initializes `bound_` with `bound`. 
The behavior is undefined if `Bound` is not `[std::unreachable\_sentinel\_t](http://en.cppreference.com/w/cpp/iterator/unreachable_sentinel_t)` and `bool(bound >= 0)` is `false`. 4) Initializes `value_` and `bound_` through piecewise construction. ### Parameters | | | | | --- | --- | --- | | value | - | the value to be repeatedly produced | | bound | - | the bound | std::ranges::repeat\_view::begin --------------------------------- | | | | | --- | --- | --- | | ``` constexpr /*iterator*/ begin() const; ``` | | (since C++23) | Returns an [iterator](repeat_view/iterator "cpp/ranges/repeat view/iterator") initialized with `[std::addressof](http://en.cppreference.com/w/cpp/memory/addressof)(\*value_)`. std::ranges::repeat\_view::end ------------------------------- | | | | | --- | --- | --- | | ``` constexpr /*iterator*/ end() const requires (!std::same_as<Bound, std::unreachable_sentinel_t>); ``` | (1) | (since C++23) | | ``` constexpr std::unreachable_sentinel_t end() const; ``` | (2) | (since C++23) | 1) Returns an [iterator](repeat_view/iterator "cpp/ranges/repeat view/iterator") initialized with `[std::addressof](http://en.cppreference.com/w/cpp/memory/addressof)(\*value_)` and `bound_`. 2) Returns an `[std::unreachable\_sentinel](http://en.cppreference.com/w/cpp/iterator/unreachable_sentinel_t)`. std::ranges::repeat\_view::size -------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto size() const requires (!std::same_as<Bound, std::unreachable_sentinel_t>) { return /*to-unsigned-like*/(bound_); } ``` | | (since C++23) | Returns the size of the view if the view is bounded. The exposition-only function template `*to-unsigned-like*` converts its argument (which must be [integer-like](../iterator/weakly_incrementable#Integer-like_types "cpp/iterator/weakly incrementable")) to the corresponding unsigned version of the argument type. ### Deduction guides | | | | | --- | --- | --- | | ``` template< class W, class Bound > repeat_view( W, Bound ) -> repeat_view<W, Bound>; ``` | | (since C++23) | ### Nested classes | | | | --- | --- | | [*iterator*](repeat_view/iterator "cpp/ranges/repeat view/iterator") (C++23) | the iterator type (exposition-only member class) | ### Notes If `Bound` is not `[std::unreachable\_sentinel\_t](../iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t")`, the `repeat_view` models [`sized_range`](sized_range "cpp/ranges/sized range") and [`common_range`](common_range "cpp/ranges/common range"). 
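The note above can be checked directly with `static_assert`s. A minimal sketch, not part of the reference text (assumes a C++23 `<ranges>` implementation; only the concepts, not the exact deduced types, are asserted):

```
#include <ranges>

// Bounded: Bound is deduced as an integer-like type (here int).
using Bounded   = decltype(std::views::repeat(42, 8));
// Unbounded: Bound defaults to std::unreachable_sentinel_t.
using Unbounded = decltype(std::views::repeat(42));

static_assert(std::ranges::sized_range<Bounded>);
static_assert(std::ranges::common_range<Bounded>);
static_assert(!std::ranges::sized_range<Unbounded>);
static_assert(!std::ranges::common_range<Unbounded>);
```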
| [Feature-test](../utility/feature_test "cpp/utility/feature test") macro | Value | Std | | --- | --- | --- | | [`__cpp_lib_ranges_repeat`](../feature_test#Library_features "cpp/feature test") | `202207L` | (C++23) | ### Example ``` #include <ranges> #include <iostream> #include <string_view> using namespace std::literals; int main() { // bounded overload for (auto s: std::views::repeat("C++"sv, 4)) { std::cout << s << ' '; } std::cout << '\n'; // unbounded overload for (auto s: std::views::repeat("C++"sv) | std::views::take(4)) { std::cout << s << ' '; } std::cout << '\n'; } ``` Output: ``` C++ C++ C++ C++ C++ C++ C++ C++ ``` ### See also | | | | --- | --- | | [ranges::iota\_viewviews::iota](iota_view "cpp/ranges/iota view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of a sequence generated by repeatedly incrementing an initial value (class template) (customization point object) | cpp std::ranges::range std::ranges::range ================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< class T > concept range = requires( T& t ) { ranges::begin(t); // equality-preserving for forward iterators ranges::end (t); }; ``` | | (since C++20) | The `range` concept defines the requirements of a type that allows iteration over its elements by providing an iterator and sentinel that denote the elements of the range. ### Semantic requirements Given an expression `E` such that `decltype((E))` is `T`, `T` models `range` only if. * [`[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(E)`, `[ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(E)`) denotes a range, and * both `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(E)` and `[ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(E)` are amortized constant time and do not alter the value of `E` in a manner observable to equality-preserving expressions, and * if the type of `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(E)` models [`forward_iterator`](../iterator/forward_iterator "cpp/iterator/forward iterator"), `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(E)` is equality-preserving (in other words, forward iterators support multi-pass algorithms) Note: In the definition above, the required expressions `[ranges::begin](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/begin)(t)` and `[ranges::end](http://en.cppreference.com/w/cpp/ranges-ranges-placeholder/end)(t)` do not require [implicit expression variations](../concepts/equality_comparable#Implicit_expression_variations "cpp/concepts/equality comparable"). 
### Example ``` #include <ranges> #include <vector> #include <iostream> template <typename T> struct range_t : private T { using T::begin, T::end; /*...*/ }; static_assert(std::ranges::range< range_t<std::vector<int>> >); template <typename T> struct scalar_t { T t{}; /* no begin/end */ }; static_assert(not std::ranges::range< scalar_t<int> >); int main() { if constexpr (range_t<std::vector<int>> r; std::ranges::range<decltype(r)>) { std::cout << "r is a range\n"; } if constexpr (scalar_t<int> s; not std::ranges::range<decltype(s)>) { std::cout << "s is not a range\n"; } } ``` Output: ``` r is a range s is not a range ``` cpp std::ranges::views::counted std::ranges::views::counted =========================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline constexpr /*unspecified*/ counted = /*unspecified*/; ``` | | (since C++20) | | Call signature | | | | ``` template< class Iterator, class DifferenceType > requires /* see below */ constexpr /*span-or-subrange*/ counted( Iterator&& it, DifferenceType&& count ); ``` | | (since C++20) | A counted view presents a [`view`](view "cpp/ranges/view") of the elements of the *counted range* `[i, n)` for some iterator `i` and non-negative integer `n`. A counted range `[i, n)` is the `n` elements starting with the element pointed to by `i` and up to but not including the element, if any, pointed to by the result of `n` applications of `++i`. If `n == 0`, the counted range is valid and empty. Otherwise, the counted range is only valid if `n` is positive, `i` is dereferenceable, and `[++i, --n)` is a valid counted range. Formally, if `it` and `count` are expressions, `T` is `[std::decay\_t](http://en.cppreference.com/w/cpp/types/decay)<decltype((it))>`, and `D` is `[std::iter\_difference\_t](http://en.cppreference.com/w/cpp/iterator/iter_t)<T>`, then. if `T` models [`input_or_output_iterator`](../iterator/input_or_output_iterator "cpp/iterator/input or output iterator") and `decltype((count))` models `[std::convertible\_to](http://en.cppreference.com/w/cpp/concepts/convertible_to)<D>`, * if `T` models [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator"), then `[views::counted](http://en.cppreference.com/w/cpp/ranges/counted_view)(it, count)` is expression-equivalent to `[std::span](http://en.cppreference.com/w/cpp/container/span)([std::to\_address](http://en.cppreference.com/w/cpp/memory/to_address)(it), static\_cast<[std::size\_t](http://en.cppreference.com/w/cpp/types/size_t)>(static\_cast<D>(count)))`, * otherwise, if `T` models [`random_access_iterator`](../iterator/random_access_iterator "cpp/iterator/random access iterator"), then `[views::counted](http://en.cppreference.com/w/cpp/ranges/counted_view)(it, count)` is expression-equivalent to `[ranges::subrange](http://en.cppreference.com/w/cpp/ranges/subrange)(it, it + static\_cast<D>(count))`, * otherwise, `[views::counted](http://en.cppreference.com/w/cpp/ranges/counted_view)(it, count)` is expression-equivalent to `[ranges::subrange](http://en.cppreference.com/w/cpp/ranges/subrange)([std::counted\_iterator](http://en.cppreference.com/w/cpp/iterator/counted_iterator)(it, count), [std::default\_sentinel](http://en.cppreference.com/w/cpp/iterator/default_sentinel))`. Otherwise, `[views::counted](http://en.cppreference.com/w/cpp/ranges/counted_view)(it, count)` is ill-formed. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. 
* `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `views::counted` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_counted\_fn*`. All instances of `*\_\_counted\_fn*` are equal. The effects of invoking different instances of type `*\_\_counted\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `views::counted` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `views::counted` above, `*\_\_counted\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__counted_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __counted_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__counted_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __counted_fn&, Args...>`. Otherwise, no function call operator of `*\_\_counted\_fn*` participates in overload resolution. ### Notes `[views::counted](http://en.cppreference.com/w/cpp/ranges/counted_view)` does not check if the range is long enough to provide all `count` elements: use `[views::take](http://en.cppreference.com/w/cpp/ranges/take_view)` if that check is necessary. ### Example ``` #include <ranges> #include <iostream> int main() { const int a[] = {1, 2, 3, 4, 5, 6, 7}; for(int i : std::views::counted(a, 3)) std::cout << i << ' '; std::cout << '\n'; const auto il = {1, 2, 3, 4, 5}; for (int i : std::views::counted(il.begin() + 1, 3)) std::cout << i << ' '; std::cout << '\n'; } ``` Output: ``` 1 2 3 2 3 4 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2393R1](https://wg21.link/P2393R1) | C++20 | implicit conversion from an integer-class type to `size_t` might be invalid | made explicit | ### See also | | | | --- | --- | | [ranges::take\_view, views::take](take_view "cpp/ranges/take view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the first N elements of another [`view`](view "cpp/ranges/view") (class template) (range adaptor object) | | [ranges::subrange](subrange "cpp/ranges/subrange") (C++20) | combines an iterator-sentinel pair into a [`view`](view "cpp/ranges/view") (class template) | | [counted\_iterator](../iterator/counted_iterator "cpp/iterator/counted iterator") (C++20) | iterator adaptor that tracks the distance to the end of the range (class template) | | [ranges::count, ranges::count\_if](../algorithm/ranges/count "cpp/algorithm/ranges/count") (C++20) (C++20) | returns the number of elements satisfying specific criteria (niebloid) |
cpp std::ranges::views::as_const, std::ranges::as_const_view std::ranges::views::as\_const, std::ranges::as\_const\_view =========================================================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::view V > requires ranges::input_range<V> class as_const_view : public ranges::view_interface<as_const_view<V>> ``` | (1) | (since C++23) | | ``` namespace views { inline constexpr /* unspecified */ as_const = /* unspecified */; } ``` | (2) | (since C++23) | | Call signature | | | | ``` template< ranges::viewable_range R > requires /* see below */ constexpr ranges::view auto as_const( R&& r ); ``` | | (since C++23) | 1) A range adaptor that represents a view of underlying [`view`](view "cpp/ranges/view") that is also a [`constant_range`](constant_range "cpp/ranges/constant range"). An `as_const_view` always has read-only elements (if not empty). 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). Let `e` be a subexpression, let `T` be `decltype((e))`, and let `U` be `[std::remove\_cvref\_t](http://en.cppreference.com/w/cpp/types/remove_cvref)<T>`. Then the expression `views::as_const(e)` is *expression-equivalent* to: * `[views::all](http://en.cppreference.com/w/cpp/ranges/all_view)(e)`, if it is a well-formed expression and `[views::all\_t](http://en.cppreference.com/w/cpp/ranges/all_view)<T>` models [`constant_range`](constant_range "cpp/ranges/constant range"); * otherwise, `[std::span](http://en.cppreference.com/w/cpp/container/span)<const X, Extent>(e)` for some type `X` and some extent `Extent` if `U` denotes `[std::span](http://en.cppreference.com/w/cpp/container/span)<X, Extent>`; * otherwise, `[ranges::ref\_view](http://en.cppreference.com/w/cpp/ranges/ref_view)(static\_cast<const U&>(e))` if `e` is an lvalue, `const U` models [`constant_range`](constant_range "cpp/ranges/constant range"), and `U` does not model [`view`](view "cpp/ranges/view"). * otherwise, `as_const_view{e}`. `as_const_view` always models [`constant_range`](constant_range "cpp/ranges/constant range"), and it models the [`contiguous_range`](contiguous_range "cpp/ranges/contiguous range"), [`random_access_range`](random_access_range "cpp/ranges/random access range"), [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), [`forward_range`](forward_range "cpp/ranges/forward range"), [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"), [`common_range`](common_range "cpp/ranges/common range"), and [`sized_range`](sized_range "cpp/ranges/sized range") when the underlying view `V` models respective concepts. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Data members Typical implementations of `as_const_view` only hold one non-static data member: * the underlying view of type `V` (shown here as `*base\_*`, the name is exposition only). 
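The `views::as_const` dispatch described above can be observed without naming the exact resulting view types. A minimal sketch, not part of the reference text (C++23; `v` and `s` are illustrative local variables, and only the read-only property is asserted):

```
#include <ranges>
#include <span>
#include <vector>

int main()
{
    std::vector<int> v{1, 2, 3};

    // An lvalue non-view container is typically wrapped as a ranges::ref_view
    // over a const-qualified reference (third bullet above).
    auto cv = std::views::as_const(v);
    static_assert(std::ranges::constant_range<decltype(cv)>);
    // cv[0] = 42; // error: elements are read-only

    // A std::span<int> becomes a span of const elements (second bullet above).
    std::span<int> s{v};
    auto cs = std::views::as_const(s);
    static_assert(std::ranges::constant_range<decltype(cs)>);
}
```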
### Member functions | | | | --- | --- | | (constructor) (C++23) | constructs an `as_const_view` (public member function) | | base (C++23) | returns the underlying view `V` (public member function) | | begin (C++23) | returns the beginning iterator of the `as_const_view` (public member function) | | end (C++23) | returns the end iterator of the `as_const_view` (public member function) | | size (C++23) | returns the size of the view if it is bounded (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [data](view_interface/data "cpp/ranges/view interface/data") (C++20) | Gets the address of derived view's data. Provided if its iterator type satisfies [`contiguous_iterator`](../iterator/contiguous_iterator "cpp/iterator/contiguous iterator"). (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::as\_const\_view::as\_const\_view ---------------------------------------------- | | | | | --- | --- | --- | | ``` as_const_view() requires std::default_initializable<V> = default; ``` | (1) | (since C++23) | | ``` constexpr explicit as_const_view(V base); ``` | (2) | (since C++23) | 1) Value-initializes `*base\_*` via its default member initializer (`= V()`). 2) Initializes `*base\_*` with `std::move(base)`. ### Parameters | | | | | --- | --- | --- | | base | - | a view | std::ranges::as\_const\_view::base ----------------------------------- | | | | | --- | --- | --- | | ``` constexpr V base() const& requires std::copy_constructible<V>; ``` | (1) | (since C++23) | | ``` constexpr V base() &&; ``` | (2) | (since C++23) | Returns the underlying view. 1) Copy-constructs the result from the underlying view. Equivalent to `return base_;`. 2) Move-constructs the result from the underlying view. Equivalent to `return std::move(base_);`. 
std::ranges::as\_const\_view::begin ------------------------------------ | | | | | --- | --- | --- | | ``` constexpr auto begin() requires (!/*simple-view*/<V>) { return ranges::cbegin(base_); } ``` | (1) | (since C++23) | | ``` constexpr auto begin() const requires ranges::range<const V> { return ranges::cbegin(base_); } ``` | (2) | (since C++23) | Returns the constant iterator of the view. std::ranges::as\_const\_view::end ---------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto end() requires (!/*simple-view*/<V>) { return ranges::cend(base_); } ``` | (1) | (since C++23) | | ``` constexpr auto end() const requires ranges::range<const V> { return ranges::cend(base_); } ``` | (2) | (since C++23) | Returns the constant sentinel of the view. std::ranges::as\_const\_view::size ----------------------------------- | | | | | --- | --- | --- | | ``` constexpr auto size() requires ranges::sized_range<V> { return ranges::size(base_); } ``` | (1) | (since C++23) | | ``` constexpr auto size() const requires ranges::sized_range<const V> { return ranges::size(base_); } ``` | (2) | (since C++23) | Returns the size of the view if the view is bounded. ### Deduction guides | | | | | --- | --- | --- | | ``` template<class R> as_const_view(R&&) -> as_const_view<views::all_t<R>>; ``` | | (since C++23) | ### Helper templates | | | | | --- | --- | --- | | ``` template<class T> inline constexpr bool enable_borrowed_range<std::ranges::as_const_view<T>> = std::ranges::enable_borrowed_range<T>; ``` | | (since C++23) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `as_const_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range") when the underlying view satisfies it. ### Example ### See also | | | | --- | --- | | [ranges::cbegin](cbegin "cpp/ranges/cbegin") (C++20) | returns an iterator to the beginning of a read-only range (customization point object) | | [ranges::cend](cend "cpp/ranges/cend") (C++20) | returns a sentinel indicating the end of a read-only range (customization point object) | cpp std::ranges::views::iota, std::ranges::iota_view std::ranges::views::iota, std::ranges::iota\_view ================================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< std::weakly_incrementable W, std::semiregular Bound = std::unreachable_sentinel_t > requires __WeaklyEqualityComparableWith<W, Bound> && std::copyable<W> class iota_view : public ranges::view_interface<iota_view<W, Bound>> ``` | (1) | (since C++20) | | ``` namespace views { inline constexpr /*unspecified*/ iota = /*unspecified*/; } ``` | (2) | (since C++20) | | Call signature | | | | ``` template< class W > requires /* see below */ constexpr /* see below */ iota( W&& value ); ``` | | (since C++20) | | ``` template< class W, class Bound > requires /* see below */ constexpr /* see below */ iota( W&& value, Bound&& bound ); ``` | | (since C++20) | 1) A range factory that generates a sequence of elements by repeatedly incrementing an initial value. Can be either bounded or unbounded (infinite). 2) `views::iota(e)` and `views::iota(e, f)` are *expression-equivalent* to (has the same effect as) `iota_view(e)` and `iota_view(e, f)` respectively for any suitable subexpressions `e` and `f`. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. 
* `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `views::iota` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_iota\_fn*`. All instances of `*\_\_iota\_fn*` are equal. The effects of invoking different instances of type `*\_\_iota\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `views::iota` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `views::iota` above, `*\_\_iota\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__iota_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __iota_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__iota_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __iota_fn&, Args...>`. Otherwise, no function call operator of `*\_\_iota\_fn*` participates in overload resolution. ### Data members Typical implementation of `iota_view` holds two non-static data members: the beginning value `value_` of type `W` and the sentinel value `bound_` of type `Bound`. The names shown here are exposition-only. ### Member functions | | | | --- | --- | | (constructor) (C++20) | creates a `iota_view` (public member function) | | begin (C++20) | obtains the beginning iterator of an `iota_view` (public member function) | | end (C++20) | obtains the sentinel denoting the end of an `iota_view` (public member function) | | size (C++20) | obtains the size of an `iota_view` if it is sized (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). 
(public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | | [operator[]](view_interface/operator_at "cpp/ranges/view interface/operator at") (C++20) | Returns the nth element in the derived view. Provided if it satisfies [`random_access_range`](random_access_range "cpp/ranges/random access range"). (public member function of `std::ranges::view_interface<D>`) | std::ranges::iota\_view::iota\_view ------------------------------------ | | | | | --- | --- | --- | | ``` iota_view() requires std::default_initializable<W> = default; ``` | (1) | (since C++20) | | ``` constexpr explicit iota_view( W value ); ``` | (2) | (since C++20) | | ``` constexpr iota_view( std::type_identity_t<W> value, std::type_identity_t<Bound> bound ); ``` | (3) | (since C++20) | | ``` constexpr iota_view( /*iterator*/ first, /* see below */ last ); ``` | (4) | (since C++20) | 1) Value-initializes `value_` and `bound_` via their default member initializers (`= W()` and `= Bound()`). 2) Initializes `value_` with `value` and value-initializes `bound_`. This constructor is used to create unbounded `iota_view`s, e.g. `iota(0)` yields numbers 0,1,2..., infinitely. 3) Initializes `value_` with `value` and `bound_` with `bound`. The behavior is undefined if `[std::totally\_ordered\_with](http://en.cppreference.com/w/cpp/concepts/totally_ordered)<W, Bound>` is modeled and `bool(value <= bound)` is `false`. This constructor is used to create bounded iota views, e.g. `iota(10, 20)` yields numbers from 10 to 19. 4) Same as (3), except that `value_` is initialized with the `W` value stored in `first`, and * if `W` and `Bound` are the same type, then the type of `last` is `/*iterator*/` and `bound_` initialized with the `W` value stored in `last`, * otherwise, if the `iota_view` is unbounded (i.e. `Bound` is `[std::unreachable\_sentinel\_t](../iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t")`), then the type of `last` is `[std::unreachable\_sentinel\_t](../iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t")` and `bound_` initialized with `[std::unreachable\_sentinel](../iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t")`. * otherwise, the type of `last` is `/*sentinel*/` and `bound_` initialized with the `Bound` value stored in `last`. In any case, the type of `last` is same as `decltype(end())`. For (2), (3), and (4), the behavior is undefined if the `iota_view` is bounded (i.e. `Bound` is not `[std::unreachable\_sentinel\_t](../iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t")`) and `bound_` is initialized to a value unreachable from `value_`. ### Parameters | | | | | --- | --- | --- | | value | - | the starting value | | bound | - | the bound | | first | - | the iterator denoting the starting value | | last | - | the iterator or sentinel denoting the bound | std::ranges::iota\_view::begin ------------------------------- | | | | | --- | --- | --- | | ``` constexpr /*iterator*/ begin() const; ``` | | (since C++20) | Returns an [iterator](iota_view/iterator "cpp/ranges/iota view/iterator") initialized with `value_`. 
std::ranges::iota\_view::end ----------------------------- | | | | | --- | --- | --- | | ``` constexpr auto end() const; ``` | (1) | (since C++20) | | ``` constexpr /*iterator*/ end() const requires std::same_as<W, Bound>; ``` | (2) | (since C++20) | 1) Returns a [sentinel](iota_view/sentinel "cpp/ranges/iota view/sentinel") of a specific type (shown as `/*sentinel*/` here) initialized with `bound_` if this view is bounded, or `[std::unreachable\_sentinel](../iterator/unreachable_sentinel_t "cpp/iterator/unreachable sentinel t")` if this view is unbounded. 2) Returns an [iterator](iota_view/iterator "cpp/ranges/iota view/iterator") initialized with `bound_`. std::ranges::iota\_view::size ------------------------------ | | | | | --- | --- | --- | | ``` constexpr auto size() const requires (std::same_as<W, Bound> && /*advanceable*/<W>) || (/*is-integer-like*/<W> && /*is-integer-like*/<Bound>) || std::sized_sentinel_for<Bound, W> { if constexpr (/*is-integer-like*/<W> && /*is-integer-like*/<Bound>) return (value_ < 0) ? ((bound_ < 0) ? /*to-unsigned-like*/(-value_) - /*to-unsigned-like*/(-bound_) : /*to-unsigned-like*/(bound_) + /*to-unsigned-like*/(-value_)) : /*to-unsigned-like*/(bound_) - /*to-unsigned-like*/(value_); else return /*to-unsigned-like*/(bound_ - value_); } ``` | | (since C++20) | Returns the size of the view if the view is bounded. The exposition-only concept `*advanceable*` is described in [this page](iota_view/iterator "cpp/ranges/iota view/iterator"). The exposition-only function template `*to-unsigned-like*` converts its argument (which must be [integer-like](../iterator/weakly_incrementable#Integer-like_types "cpp/iterator/weakly incrementable")) to the corresponding unsigned version of the argument type. ### Deduction guides | | | | | --- | --- | --- | | ``` template<class W, class Bound> requires (!/*is-integer-like*/<W> || !/*is-integer-like*/<Bound> || /*is-signed-integer-like*/<W> == /*is-signed-integer-like*/<Bound>) iota_view(W, Bound) -> iota_view<W, Bound>; ``` | | (since C++20) | For any type `T`, `/*is-integer-like*/<T>` is `true` if and only if `T` is [integer-like](../iterator/weakly_incrementable#Integer-like_types "cpp/iterator/weakly incrementable"), and `/*is-signed-integer-like*/<T>` is `true` if and only if `T` is integer-like and capable of representing negative values. Note that the guide protects itself against signed/unsigned mismatch bugs, like `views::iota(0, v.size())`, where `​0​` is a (signed) `int` and `v.size()` is an (unsigned) `[std::size\_t](../types/size_t "cpp/types/size t")`. ### Nested classes | | | | --- | --- | | [*iterator*](iota_view/iterator "cpp/ranges/iota view/iterator") (C++20) | the iterator type (exposition-only member class) | | [*sentinel*](iota_view/sentinel "cpp/ranges/iota view/sentinel") (C++20) | the sentinel type used when the `iota_view` is bounded and `Bound` and `W` are not the same type (exposition-only member class) | ### Helper templates | | | | | --- | --- | --- | | ``` template<std::weakly_incrementable W, std::semiregular Bound> inline constexpr bool enable_borrowed_range<ranges::iota_view<W, Bound>> = true; ``` | | (since C++20) | This specialization of [`std::ranges::enable_borrowed_range`](borrowed_range "cpp/ranges/borrowed range") makes `iota_view` satisfy [`borrowed_range`](borrowed_range "cpp/ranges/borrowed range"). 
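The deduction-guide protection mentioned above can be seen in practice. A minimal sketch, not part of the reference text (C++23; the variable names are illustrative only):

```
#include <cstddef>
#include <ranges>
#include <vector>

int main()
{
    std::vector<int> v(5);

    // auto bad = std::views::iota(0, v.size()); // ill-formed: 0 is a signed int,
    //                                           // v.size() is an unsigned std::size_t

    // Giving both arguments the same signedness satisfies the deduction guide:
    auto ok = std::views::iota(std::size_t{0}, v.size());
    static_assert(std::ranges::sized_range<decltype(ok)>);
    (void)ok;
}
```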
### Example ``` #include <ranges> #include <iostream> #include <algorithm> int main() { for (int i : std::ranges::iota_view{1, 10}) std::cout << i << ' '; std::cout << '\n'; for (int i : std::views::iota(1, 10)) std::cout << i << ' '; std::cout << '\n'; struct Bound { int bound; bool operator==(int x) const { return x == bound; } }; for (int i : std::views::iota(1, Bound{10})) std::cout << i << ' '; std::cout << '\n'; for (int i : std::views::iota(1) | std::views::take(9)) std::cout << i << ' '; std::cout << '\n'; std::ranges::for_each(std::views::iota(1, 10), [](int i) { std::cout << i << ' '; }); std::cout << '\n'; } ``` Output: ``` 1 2 3 4 5 6 7 8 9 1 2 3 4 5 6 7 8 9 1 2 3 4 5 6 7 8 9 1 2 3 4 5 6 7 8 9 1 2 3 4 5 6 7 8 9 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3523](https://cplusplus.github.io/LWG/issue3523) | C++20 | iterator-sentinel pair constructor might use wrong sentinel type | corrected | | [P2325R3](https://wg21.link/P2325R3) | C++20 | `iota_view` required that `W` is [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular")as [`view`](view "cpp/ranges/view") required [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable") | only requires that `W` is [`copyable`](../concepts/copyable "cpp/concepts/copyable") | | [LWG 3610](https://cplusplus.github.io/LWG/issue3610) | C++20 | `size` might reject integer-class types | made accepting if possible | ### See also | | | | --- | --- | | [iota](../algorithm/iota "cpp/algorithm/iota") (C++11) | fills a range with successive increments of the starting value (function template) | | [ranges::iota](../algorithm/ranges/iota "cpp/algorithm/ranges/iota") (C++23) | fills a range with successive increments of the starting value (niebloid) | | [ranges::repeat\_viewviews::repeat](repeat_view "cpp/ranges/repeat view") (C++23) | a [`view`](view "cpp/ranges/view") consisting of a generated sequence by repeatedly producing the same value (class template) (customization point object) |
cpp Copyable wrapper (C++20) Copyable wrapper (C++20) ======================== | | | | | --- | --- | --- | | ``` template<class T> requires std::copy_constructible<T> && std::is_object_v<T> class /*copyable-box*/; ``` | | (since C++20) | [`ranges::single_view`](single_view "cpp/ranges/single view") and range adaptors that store an invocable object are specified in terms of an exposition-only class template `*copyable-box*`. The name `*copyable-box*` is for exposition purposes only. `/*copyable-box*/<T>` behaves exactly like `[std::optional](http://en.cppreference.com/w/cpp/utility/optional)<T>`, except that the default constructor, copy assignment operator, and move assignment operator are (conditionally) different from those of `[std::optional](../utility/optional "cpp/utility/optional")`, which makes it always satisfy [`copyable`](../concepts/copyable "cpp/concepts/copyable"). If `T` is already [`copyable`](../concepts/copyable "cpp/concepts/copyable"), or both `[std::is\_nothrow\_move\_constructible\_v](http://en.cppreference.com/w/cpp/types/is_move_constructible)<T>` and `[std::is\_nothrow\_copy\_constructible\_v](http://en.cppreference.com/w/cpp/types/is_copy_constructible)<T>` are `true`, `/*copyable-box*/<T>` may store only a `T` object, because it always contains a value. ### Template parameters | | | | | --- | --- | --- | | T | - | the type of the contained value, must be a object type that models [`copy_constructible`](../concepts/copy_constructible "cpp/concepts/copy constructible") | ### Member functions *copyable-box*::*copyable-box* ------------------------------- | | | | | --- | --- | --- | | ``` constexpr /*copyable-box*/() noexcept(std::is_nothrow_default_constructible_v<T>) requires std::default_initializable<T> : /*copyable-box*/(std::in_place) { } ``` | | (since C++20) | The default constructor is provided if and only if `T` models [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable"). A default-constructed `/*copyable-box*/<T>` contains a value-initialized `T` object. *copyable-box*::operator= -------------------------- | | | | | --- | --- | --- | | ``` constexpr /*copyable-box*/& operator=(const /*copyable-box*/& other); noexcept(/* see below */); ``` | (1) | (since C++20) | | ``` constexpr /*copyable-box*/& operator=(/*copyable-box*/&& other) noexcept(std::is_nothrow_move_constructible_v<T>); ``` | (2) | (since C++20) | 1) If `[std::copyable](http://en.cppreference.com/w/cpp/concepts/copyable)<T>` is not modeled, the copy assignment operator is equivalently defined as: `constexpr /\*copyable-box\*/& operator=(const /\*copyable-box\*/& other) noexcept([std::is\_nothrow\_copy\_constructible\_v](http://en.cppreference.com/w/cpp/types/is_copy_constructible)<T>) { if (this != [std::addressof](http://en.cppreference.com/w/cpp/memory/addressof)(other)) { if (other) emplace(\*other); else reset(); } return \*this; }` Otherwise, it is identical to [the copy assignment operator of `std::optional`](../utility/optional/operator= "cpp/utility/optional/operator="). 
2) If `[std::movable](http://en.cppreference.com/w/cpp/concepts/movable)<T>` is not modeled, the move assignment operator is equivalently defined as: `constexpr /\*copyable-box\*/& operator=(/\*copyable-box\*/&& other) noexcept([std::is\_nothrow\_move\_constructible\_v](http://en.cppreference.com/w/cpp/types/is_move_constructible)<T>) { if (this != [std::addressof](http://en.cppreference.com/w/cpp/memory/addressof)(other)) { if (other) emplace(std::move(\*other)); else reset(); } return \*this; }` Otherwise, it is identical to [the move assignment operator of `std::optional`](../utility/optional/operator= "cpp/utility/optional/operator="). Members identical to std::optional ----------------------------------- ### Member functions | | | | --- | --- | | [(constructor)](../utility/optional/optional "cpp/utility/optional/optional") | constructs the optional object (public member function of `std::optional<T>`) | | [(destructor)](../utility/optional/~optional "cpp/utility/optional/~optional") | destroys the contained value, if there is one (public member function of `std::optional<T>`) | | [operator=](../utility/optional/operator= "cpp/utility/optional/operator=") | assigns contents (public member function of `std::optional<T>`) | | Observers | | [operator->operator\*](../utility/optional/operator* "cpp/utility/optional/operator*") | accesses the contained value (public member function of `std::optional<T>`) | | [operator boolhas\_value](../utility/optional/operator_bool "cpp/utility/optional/operator bool") | checks whether the object contains a value (public member function of `std::optional<T>`) | | Modifiers | | [reset](../utility/optional/reset "cpp/utility/optional/reset") | destroys any contained value (public member function of `std::optional<T>`) | | [emplace](../utility/optional/emplace "cpp/utility/optional/emplace") | constructs the contained value in-place (public member function of `std::optional<T>`) | ### Notes A `*copyable-box*` does not contain a value only if. * `T` does not model [`movable`](../concepts/movable "cpp/concepts/movable") or [`copyable`](../concepts/copyable "cpp/concepts/copyable"), and an exception is thrown on move assignment or copy assignment respectively, or * it is initialized/assigned from another valueless wrapper. Before [P2325R3](https://wg21.link/P2325R3), the wrapper was called `*semiregular-box*` in the standard and always satisfied [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular"), as the default constructor was always provided (which might construct a valueless wrapper). ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. 
| DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [P2325R3](https://wg21.link/P2325R3) | C++20 | if `T` is not [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable"), the default constructorconstructs a wrapper which does not contain a value | the wrapper is alsonot [`default_initializable`](../concepts/default_initializable "cpp/concepts/default initializable") | | [LWG 3572](https://cplusplus.github.io/LWG/issue3572) | C++20 | conditionally different assignment operators were not constexpr | made constexpr | ### See also | | | | --- | --- | | [ranges::single\_viewviews::single](single_view "cpp/ranges/single view") (C++20) | a [`view`](view "cpp/ranges/view") that contains a single element of a specified value (class template) (customization point object) | | [ranges::filter\_viewviews::filter](filter_view "cpp/ranges/filter view") (C++20) | a [`view`](view "cpp/ranges/view") that consists of the elements of a [`range`](range "cpp/ranges/range") that satisfies a predicate (class template) (range adaptor object) | | [ranges::transform\_viewviews::transform](transform_view "cpp/ranges/transform view") (C++20) | a [`view`](view "cpp/ranges/view") of a sequence that applies a transformation function to each element (class template) (range adaptor object) | | [ranges::take\_while\_viewviews::take\_while](take_while_view "cpp/ranges/take while view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the initial elements of another [`view`](view "cpp/ranges/view"), until the first element on which a predicate returns false (class template) (range adaptor object) | | [ranges::drop\_while\_viewviews::drop\_while](drop_while_view "cpp/ranges/drop while view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the elements of another [`view`](view "cpp/ranges/view"), skipping the initial subsequence of elements until the first element where the predicate returns false (class template) (range adaptor object) | | [ranges::zip\_transform\_viewviews::zip\_transform](zip_transform_view "cpp/ranges/zip transform view") (C++23) | a [`view`](view "cpp/ranges/view") consisting of tuples of results of application of a transformation function to corresponding elements of the adapted views (class template) (customization point object) | | [ranges::adjacent\_transform\_viewviews::adjacent\_transform](https://en.cppreference.com/mwiki/index.php?title=cpp/ranges/adjacent_transform_view&action=edit&redlink=1 "cpp/ranges/adjacent transform view (page does not exist)") (C++23) | a [`view`](view "cpp/ranges/view") consisting of tuples of results of application of a transformation function to adjacent elements of the adapted view (class template) (range adaptor object) | cpp std::ranges::ssize std::ranges::ssize ================== | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` inline namespace /*unspecified*/ { inline constexpr /*unspecified*/ ssize = /*unspecified*/; } ``` | | (since C++20) (customization point object) | | Call signature | | | | ``` template< class T > requires /* see below */ constexpr /*signed-integer-like*/ ssize( T&& t ); ``` | | (since C++20) | Returns the size of a range converted to a signed type. 
If `[ranges::size](http://en.cppreference.com/w/cpp/ranges/size)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t))` is well-formed, a call to `ranges::ssize` is expression-equivalent to `static\_cast<MadeSigned>([ranges::size](http://en.cppreference.com/w/cpp/ranges/size)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t)))`, where `MadeSigned` denotes. * the corresponding signed version of `decltype([ranges::size](http://en.cppreference.com/w/cpp/ranges/size)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t)))`, if it is wider than `[std::ptrdiff\_t](../types/ptrdiff_t "cpp/types/ptrdiff t")`, or * `[std::ptrdiff\_t](../types/ptrdiff_t "cpp/types/ptrdiff t")` otherwise. If `[ranges::size](http://en.cppreference.com/w/cpp/ranges/size)([std::forward](http://en.cppreference.com/w/cpp/utility/forward)<T>(t))` is ill-formed, a call to `ranges::ssize` is also ill-formed, which can result in [substitution failure](../language/sfinae "cpp/language/sfinae") when `ranges::ssize(t)` appears in the immediate context of a template instantiation. ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Customization point objects The name `ranges::ssize` denotes a *customization point object*, which is a const [function object](../named_req/functionobject "cpp/named req/FunctionObject") of a [literal](../named_req/literaltype "cpp/named req/LiteralType") [`semiregular`](../concepts/semiregular "cpp/concepts/semiregular") class type. For exposition purposes, the cv-unqualified version of its type is denoted as `*\_\_ssize\_fn*`. All instances of `*\_\_ssize\_fn*` are equal. The effects of invoking different instances of type `*\_\_ssize\_fn*` on the same arguments are equivalent, regardless of whether the expression denoting the instance is an lvalue or rvalue, and is const-qualified or not (however, a volatile-qualified instance is not required to be invocable). Thus, `ranges::ssize` can be copied freely and its copies can be used interchangeably. Given a set of types `Args...`, if `[std::declval](http://en.cppreference.com/w/cpp/utility/declval)<Args>()...` meet the requirements for arguments to `ranges::ssize` above, `*\_\_ssize\_fn*` models . * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__ssize_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __ssize_fn, Args...>`, * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<__ssize_fn&, Args...>`, and * `[std::invocable](http://en.cppreference.com/w/cpp/concepts/invocable)<const __ssize_fn&, Args...>`. Otherwise, no function call operator of `*\_\_ssize\_fn*` participates in overload resolution. ### Notes If `ranges::ssize(e)` is valid for an expression `e`, the return type is a [signed-integer-like](../iterator/weakly_incrementable#Integer-like_types "cpp/iterator/weakly incrementable") type, i.e. an integer type for which `[std::is\_signed\_v](../types/is_signed "cpp/types/is signed")` is `true`, or a signed-integer-class type. 
The width of integer-like types can be detected by `[std::numeric\_limits::digits](../types/numeric_limits/digits "cpp/types/numeric limits/digits")`. ### Example ``` #include <array> #include <iostream> #include <ranges> #include <type_traits> int main() { std::array arr{1, 2, 3, 4, 5}; auto s = std::ranges::ssize(arr); std::cout << "ranges::ssize(arr) = " << s << '\n' << "ranges::ssize is " << (std::is_signed_v<decltype(s)> ? "signed" : "unsigned") << '\n'; std::cout << "reversed arr: "; for (--s; s >= 0; --s) std::cout << arr[s] << ' '; std::cout << "\n" "s = " << s << '\n'; } ``` Output: ``` ranges::ssize(arr) = 5 ranges::ssize is signed reversed arr: 5 4 3 2 1 s = -1 ``` ### Defect reports The following behavior-changing defect reports were applied retroactively to previously published C++ standards. | DR | Applied to | Behavior as published | Correct behavior | | --- | --- | --- | --- | | [LWG 3403](https://cplusplus.github.io/LWG/issue3403) | C++20 | `ranges::size` worked for some non-range types but `ranges::ssize` didn't | made work | ### See also | | | | --- | --- | | [ranges::size](size "cpp/ranges/size") (C++20) | returns an integer equal to the size of a range (customization point object) | | [ranges::sized\_range](sized_range "cpp/ranges/sized range") (C++20) | specifies that a range knows its size in constant time (concept) | | [sizessize](../iterator/size "cpp/iterator/size") (C++17)(C++20) | returns the size of a container or array (function template) | cpp std::ranges::views::join_with, std::ranges::join_with_view std::ranges::views::join\_with, std::ranges::join\_with\_view ============================================================= | Defined in header `[<ranges>](../header/ranges "cpp/header/ranges")` | | | | --- | --- | --- | | ``` template< ranges::input_range V, ranges::forward_range Pattern > requires ranges::view<V> && ranges::input_range<ranges::range_reference_t<V>> && ranges::view<Pattern> && /* range_reference_t<V> and Pattern have compatible elements (see below) */ class join_with_view : ranges::view_interface<join_with_view<V, Pattern>> ``` | (1) | (since C++23) | | ``` namespace views { inline constexpr /*unspecified*/ join_with = /*unspecified*/; } ``` | (2) | (since C++23) | | Call signature | | | | ``` template< ranges::viewable_range R, class Pattern > requires /* see below */ constexpr ranges::view auto join_with( R&& r, Pattern&& pattern ); ``` | | (since C++23) | | ``` template< class Pattern > constexpr /* range adaptor closure */ join_with( Pattern&& pattern ); ``` | | (since C++23) | 1) A range adaptor that represents [`view`](view "cpp/ranges/view") consisting of the sequence obtained from flattening a view of ranges, with every element of the delimiter inserted in between elements of the view. The delimiter can be a single element or a view of elements. 2) [Range adaptor object](../ranges#Range_adaptor_objects "cpp/ranges"). The expression `views::join_with(e, f)` is *expression-equivalent* to `join_with_view(e, f)` for any suitable subexpressions `e` and `f`. 
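Because the pattern may itself be a range, the delimiter is not limited to a single element; the full example further down this page uses a single character, while the following minimal sketch (not part of the reference text, C++23) inserts a two-character separator. The formal element-compatibility requirements follow below.

```
#include <iostream>
#include <ranges>
#include <string_view>
#include <vector>

int main()
{
    using namespace std::literals;
    std::vector v{"one"sv, "two"sv, "three"sv};

    // The delimiter is itself a forward range (a two-character string_view),
    // so every element of ", " is inserted between the flattened elements.
    for (char c : v | std::views::join_with(", "sv))
        std::cout << c;
    std::cout << '\n'; // prints: one, two, three
}
```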
`[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>` and `Pattern` have compatible elements if, let `Inner` denote `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>`, all of the following concepts are modeled: * `[std::common\_with](http://en.cppreference.com/w/cpp/concepts/common_with)<range_value_t<Inner>, range_value_t<Pattern>>` * `[std::common\_reference\_with](http://en.cppreference.com/w/cpp/concepts/common_reference_with)<range_reference_t<Inner>, range_reference_t<Pattern>>` * `[std::common\_reference\_with](http://en.cppreference.com/w/cpp/concepts/common_reference_with)<range_rvalue_reference_t<Inner>, range_rvalue_reference_t<Pattern>>` `join_with_view` models [`input_range`](input_range "cpp/ranges/input range"). `join_with_view` models [`forward_range`](forward_range "cpp/ranges/forward range") when: * `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>` is a reference, and * `V` and `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>` each model [`forward_range`](forward_range "cpp/ranges/forward range"). `join_with_view` models [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") when: * `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>` is a reference, * `V`, `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>`, and `Pattern` each models [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range"), and * `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>` and `Pattern` each model [`common_range`](common_range "cpp/ranges/common range"). `join_with_view` models [`common_range`](common_range "cpp/ranges/common range") when: * `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>` is a reference, * `V` and `[ranges::range\_reference\_t](http://en.cppreference.com/w/cpp/ranges/iterator_t)<V>` each model [`forward_range`](forward_range "cpp/ranges/forward range") and [`common_range`](common_range "cpp/ranges/common range"). ### Expression-equivalent Expression `e` is *expression-equivalent* to expression `f`, if. * `e` and `f` have the same effects, and * either both are [constant subexpressions](../language/constant_expression "cpp/language/constant expression") or else neither is a constant subexpression, and * either both are [potentially-throwing](../language/noexcept_spec "cpp/language/noexcept spec") or else neither is potentially-throwing (i.e. `noexcept(e) == noexcept(f)`). ### Member functions | | | | --- | --- | | [(constructor)](join_with_view/join_with_view "cpp/ranges/join with view/join with view") (C++23) | constructs a `join_with_view` (public member function) | | [base](join_with_view/base "cpp/ranges/join with view/base") (C++23) | returns a copy of the underlying (adapted) view (public member function) | | [begin](join_with_view/begin "cpp/ranges/join with view/begin") (C++23) | returns an iterator to the beginning (public member function) | | [end](join_with_view/end "cpp/ranges/join with view/end") (C++23) | returns an iterator or a sentinel to the end (public member function) | | Inherited from `[std::ranges::view\_interface](view_interface "cpp/ranges/view interface")` | | [empty](view_interface/empty "cpp/ranges/view interface/empty") (C++20) | Returns whether the derived view is empty. 
Provided if it satisfies [`sized_range`](sized_range "cpp/ranges/sized range") or [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [operator bool](view_interface/operator_bool "cpp/ranges/view interface/operator bool") (C++20) | Returns whether the derived view is not empty. Provided if `[ranges::empty](empty "cpp/ranges/empty")` is applicable to it. (public member function of `std::ranges::view_interface<D>`) | | [front](view_interface/front "cpp/ranges/view interface/front") (C++20) | Returns the first element in the derived view. Provided if it satisfies [`forward_range`](forward_range "cpp/ranges/forward range"). (public member function of `std::ranges::view_interface<D>`) | | [back](view_interface/back "cpp/ranges/view interface/back") (C++20) | Returns the last element in the derived view. Provided if it satisfies [`bidirectional_range`](bidirectional_range "cpp/ranges/bidirectional range") and [`common_range`](common_range "cpp/ranges/common range"). (public member function of `std::ranges::view_interface<D>`) | ### [Deduction guides](join_with_view/deduction_guides "cpp/ranges/join with view/deduction guides") ### Nested classes | | | | --- | --- | | [*iterator*](join_with_view/iterator "cpp/ranges/join with view/iterator") (C++23) | the iterator type (exposition-only member class template) | | [*sentinel*](join_with_view/sentinel "cpp/ranges/join with view/sentinel") (C++23) | the sentinel type (exposition-only member class template) | ### Example ``` #include <iostream> #include <ranges> #include <vector> #include <string_view> int main() { using namespace std::literals; std::vector v{"This"sv, "is"sv, "a"sv, "test."sv}; auto joined = v | std::views::join_with(' '); for (auto c : joined) std::cout << c; std::cout << '\n'; } ``` Output: ``` This is a test. ``` ### See also | | | | --- | --- | | [ranges::join\_viewviews::join](join_view "cpp/ranges/join view") (C++20) | a [`view`](view "cpp/ranges/view") consisting of the sequence obtained from flattening a [`view`](view "cpp/ranges/view") of [`range`s](range "cpp/ranges/range") (class template) (range adaptor object) |